Data Engineer - Gurugram, India - Airtel Payments Bank
Position: Data Engineer
Overview: We are seeking a skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will play a crucial role in designing, developing, and maintaining our data architecture, ensuring the efficient flow and storage of data for analytics and business intelligence.
Responsibilities:
1. Data Architecture:
• Design, develop, and maintain scalable and robust data architecture.
• Collaborate with data architects, data scientists, and analysts to understand data requirements.
2. Data Modelling:
• Develop and implement data models for efficient storage and retrieval of data.
• Optimize database performance and ensure data integrity.
3. ETL Processes:
• Design and implement ETL (Extract, Transform, Load) processes to move data from various sources to data warehouses.
• Ensure the timely and accurate loading of data.
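For candidates unfamiliar with the term, the ETL duties above boil down to a pipeline of three stages. The following is a minimal illustrative sketch in Python using only the standard library; the source format, table name, and column names are hypothetical, not part of this role's actual stack:

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse rows from a raw CSV source."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize names and cast amounts to numbers."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
    ]

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Load: write cleaned rows into a warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (:name, :amount)", rows)
    conn.commit()

# Wire the stages together on a toy source.
raw = "name,amount\n alice ,10.5\nBOB,3.0\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
```

In practice the extract stage would read from production sources (APIs, message queues, operational databases) and the load stage would target a warehouse such as Redshift or Snowflake, but the shape of the pipeline is the same.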
4. Data Integration:
• Integrate data from multiple sources, ensuring consistency and reliability.
• Work on real-time data integration solutions when required.
5. Data Quality and Governance:
• Implement data quality checks and governance policies.
• Monitor and resolve data quality issues promptly.
6. Database Management:
• Administer and maintain databases, ensuring high availability and security.
• Perform regular backups, updates, and patches.
7. Collaboration:
• Collaborate with cross-functional teams to understand business needs and provide data solutions.
• Work closely with data scientists and analysts to support their data requirements.
8. Documentation:
• Create and maintain comprehensive documentation for data processes, architecture, and procedures.
• Train and support team members on data-related technologies.
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as a Data Engineer or similar role.
• Strong programming skills in languages such as Python, Java, or Scala.
• Proficiency in SQL and experience with relational and NoSQL databases.
• Knowledge of big data technologies such as Hadoop, Spark, and Kafka.
• Experience with cloud platforms like AWS, Azure, or Google Cloud.
• Familiarity with data warehousing solutions (e.g., Redshift, Snowflake).
• Strong problem-solving skills and attention to detail.
• Excellent communication and collaboration skills.
Preferred Skills:
• Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
• Knowledge of data streaming technologies (e.g., Apache Flink, Apache Kafka Streams).
• Understanding of machine learning concepts and data science workflows.
• Certification in relevant technologies is a plus.