- Collaborate with BAs to understand data requirements and translate them into efficient PySpark code running on the Databricks platform
- Develop, test, and maintain ETL processes and data pipelines for collecting, cleansing, transforming, and aggregating data from various sources
- Proven experience as a Databricks PySpark Developer or in a similar role, with a strong understanding of data processing concepts and principles
- Proficiency in writing efficient, optimized PySpark code for ETL processes, data transformations, and data analysis
- Optimize PySpark jobs for performance and scalability, considering factors such as data volume, data structure, and query complexity
- Must have knowledge of data warehousing and hands-on experience with any ETL tool
- Experience working with large-scale datasets
- Hands-on experience with the Databricks platform and job scheduling; understanding of SQL, data modeling, and data warehousing concepts
Business Analyst - Mumbai, India - Kezan consulting
Description
Role: Business Analyst
Exp: 5 years
Location: Mumbai
Budget: 10 LPA
NP: ASAP
Industry criteria: Manufacturing
Keywords: Role will require expertise in data warehousing, Databricks, PySpark, ETL tools, data analysis & reporting
Job description and expectations from the role:
data warehouse, data analysis, data modeling, pyspark, data warehousing, sql, databricks, etl tool