Databricks Architect - Bengaluru, India - Tarento Group

Tarento Group
Bengaluru, India

Posted 1 week ago by Deepika Kaur, beBee Recruiter


Description
  • Design and develop scalable data pipelines, data lake architectures, and data warehousing solutions on the Databricks platform using Spark and Delta Lake.
  • Implement and configure the Databricks environment, including clusters, notebooks, and libraries, ensuring optimal performance and resource utilization.
  • Integrate data from various sources, including structured, semi-structured, and unstructured data, into Databricks for processing and analysis.

Job Requirement

  • Bachelor's or Master's degree in Computer Science or Information Technology.
  • 8+ years of experience in data warehousing.
  • Proven experience as a Databricks Architect or in a similar role, with hands-on experience architecting and implementing data analytics solutions on the Databricks platform.
  • Proficiency in programming languages such as Python, Scala, or Java.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
  • Knowledge of data modeling, data warehousing concepts, and ETL processes.
  • Familiarity with big data technologies and distributed computing frameworks.
  • Strong problem-solving and analytical skills.
  • Excellent communication and collaboration skills to work effectively with cross-functional teams.
