Curriculum Developer - New Delhi, India - Databricks

    Description
    Curriculum Developer (Big Data / AI) - Delhi, India

    As a Technical Curriculum Developer at Databricks, you will apply your experience in the Data and AI space to develop and maintain learning materials following curriculum development best practices. Under the direction of Senior Technical Curriculum Developers, you will develop content focused on the Databricks Data Intelligence Platform, specifically topics such as data engineering, machine learning, and AI.

    As a member of the Databricks Curriculum Development Team, you will:

  • Work closely with global stakeholders, product managers and engineers, technical subject matter experts, and other technical curriculum developers to deliver high-impact learning content for both self-paced and instructor-led training.
  • Stay up to date with the latest features and capabilities that Databricks has to offer.
    Outcomes

  • Use your technical expertise to produce multi-modal technical training content for Databricks customers, including conceptual videos, hands-on labs, and technical demos.
  • Collaborate with technical curriculum developers, technical trainers, and SMEs to develop training content.
  • Review feedback on content and take the necessary steps to fix bugs, improve content quality, etc.
  • Apply and contribute to sound instructional design and curriculum development principles.
    Skills/Experience

  • 2-5 years of experience analyzing, developing, and delivering highly technical content, especially in data engineering, data science, machine learning, or a similar field
  • Able to learn new technologies quickly
  • Strong communication skills, analytical skills, and data-driven decision-making
  • Experience with the software development lifecycle (Git, testing, modularization, etc.)
  • Strong understanding of data engineering and/or machine learning concepts and processes
  • Comfortable programming in and debugging Python and SQL
  • Familiarity with at least one common cloud provider (AWS, Azure, or Google Cloud)
  • Experience with Databricks capabilities related to data engineering (Delta Lake, Apache Spark, Databricks Jobs, Hadoop, etc.) and machine learning (MLflow, etc.)