Databricks Architect - Bengaluru, India - Tarento Group
Description
- Design and develop scalable data pipelines, data lake architectures, and data warehousing solutions on the Databricks platform using Spark and Delta Lake.
- Implement and configure the Databricks environment, including clusters, notebooks, and libraries, ensuring optimal performance and resource utilization.
- Integrate data from various sources, including structured, semi-structured, and unstructured data, into Databricks for processing and analysis.
Job Requirement
- Bachelor's or Master's degree in Computer Science or Information Technology.
- 8+ years of experience in data warehousing.
- Proven experience as a Databricks Architect or in a similar role, with hands-on experience architecting and implementing data analytics solutions on the Databricks platform.
- Proficiency in programming languages such as Python, Scala, or Java.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
- Knowledge of data modeling, data warehousing concepts, and ETL processes.
- Familiarity with big data technologies and distributed computing frameworks.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.