Responsibilities:
- Design, develop, and maintain end-to-end data pipelines on AWS using a serverless architecture.
- Implement data ingestion, validation, and transformation procedures using AWS services such as Lambda, Glue, Kinesis, SNS, SQS, and CloudFormation.
- Author orchestration DAGs and tasks in Apache Airflow.
- Develop and execute data quality checks using Great Expectations to ensure data integrity and reliability.
- Collaborate with other teams to understand mission objectives and translate them into data pipeline requirements.
- Utilize PySpark for complex data processing tasks within AWS Glue jobs.
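As a rough illustration of the ingest-validate-transform work described above (all function and field names here are hypothetical, and in the actual role this logic would live in Lambda or Glue jobs, with the validation rules expressed as a Great Expectations suite), a minimal library-free sketch might look like:

```python
# Hypothetical sketch of an ingest -> validate -> transform step.
# Field names and rules are illustrative, not from any real pipeline.

def validate(records):
    """Drop records missing required fields -- the kind of rule a
    Great Expectations expectation suite would normally express."""
    required = {"id", "amount"}
    return [r for r in records if required <= r.keys()]

def transform(records):
    """Normalize amounts to floats -- the kind of cleanup typically
    done inside a Glue job."""
    return [{**r, "amount": float(r["amount"])} for r in records]

raw = [
    {"id": 1, "amount": "10.5"},
    {"amount": "3"},            # missing "id": dropped by validate()
    {"id": 2, "amount": "7"},
]
clean = transform(validate(raw))
print(clean)  # two surviving records, amounts coerced to float
```

In a production pipeline, each stage would be a separate task wired together by an Airflow DAG, with failures surfaced through SNS/SQS.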
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong proficiency in the Python programming language.
- Hands-on experience with AWS services.
- Experience with serverless architecture and Infrastructure as Code (IaC) using AWS CDK.
- Proficiency in Apache Airflow for orchestration of data pipelines.
- Familiarity with data quality assurance techniques and tools, preferably Great Expectations.
- Experience with SQL for data manipulation and querying.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Experience with Data Lakehouse architectures, dbt, and the Apache Hudi data format is a plus.
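For the SQL requirement above, a minimal illustration (the table, columns, and data are hypothetical) of the kind of aggregation query used when checking pipeline output, run here against an in-memory SQLite database so it is self-contained:

```python
import sqlite3

# Hypothetical example of SQL used for a pipeline sanity check;
# the "events" table and its columns are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "ok"), (2, "failed"), (3, "ok")],
)

# Count records per status -- the sort of query behind a data
# quality dashboard or a Great Expectations custom check.
rows = conn.execute(
    "SELECT status, COUNT(*) FROM events GROUP BY status ORDER BY status"
).fetchall()
print(rows)  # [('failed', 1), ('ok', 2)]
```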
Data Engineer - India - MethodHub
Description
About Method-Hub Software Pvt Ltd.
We have attached the Method-Hub company profile for your reference. We are a startup in Cloud and Infrastructure, Data Engineering, BI & Analytics, AI, and RPA, and we are growing rapidly in the market. We have offices in the USA, Canada, Australia, Thailand, the UK, and Mexico, and development centers in Chandigarh, Chennai, Hyderabad, and Bangalore.
www.Method-
Job Title: Data Engineer
Regards,
Sudha