Full Stack Data Engineer - Pune

Job Description: Data Engineer
Role Overview
We are looking for a highly skilled Full Stack Data Engineer with expertise in data technologies such as Snowflake, Azure Data Factory, and Databricks to design, develop, and optimize end-to-end data pipelines, data platforms, and analytics solutions. This role combines strong data engineering, cloud platform expertise, and software engineering skills to deliver scalable, production-grade solutions.
Key Responsibilities
• Design and develop ETL/ELT pipelines on platforms such as Databricks (PySpark, Delta Lake, SQL), Informatica, Teradata, and Snowflake.
• Architect data models (batch and streaming) for analytics, ML, and reporting.
• Optimize performance of large-scale distributed data processing jobs.
• Implement CI/CD pipelines for Databricks workflows using GitHub Actions, Azure DevOps, or similar.
• Build and maintain APIs, dashboards, or applications that consume processed data (full-stack aspect).
• Collaborate with data scientists, analysts, and business stakeholders to deliver solutions.
• Ensure data quality, lineage, governance, and security compliance.
• Deploy solutions across cloud environments (Azure, AWS, or GCP).
Required Skills & Qualifications
Core Databricks Skills:
• Strong in PySpark, Delta Lake, Databricks SQL.
• Experience with Databricks Workflows, Unity Catalog, and Delta Live Tables.
• Experience with Snowflake and data engineering patterns such as ETL and ELT.
Programming & Full Stack:
• Python (mandatory) and expert-level SQL.
• Exposure to Java/Scala (for Spark jobs).
• Knowledge of APIs, microservices (FastAPI/Flask), or basic front-end (React/Angular) is a plus.
Cloud Platforms:
• Proficiency with at least one: Azure Databricks, AWS Databricks, or GCP Databricks.
• Knowledge of cloud storage (ADLS, S3, GCS), IAM, networking.
DevOps & CI/CD:
• Git, CI/CD tools (GitHub Actions, Azure DevOps, Jenkins).
• Containerization (Docker; Kubernetes is a plus).
Data Engineering Foundations:
• Data modeling (OLTP/OLAP).
• Batch & streaming data processing (Kafka, Event Hub, Kinesis).
• Data governance & compliance (Unity Catalog, Lakehouse security).
Nice-to-Have
• Experience with machine learning pipelines (MLflow, Feature Store).
• Knowledge of data visualization tools (Power BI, Tableau, Looker).
• Exposure to Graph databases (Neo4j) or RAG/LLM pipelines.
• Experience with Informatica and Teradata for ETL/ELT workloads.
Qualifications
• Bachelor's or Master's in Computer Science, Data Engineering, or related field.
• 4–7 years of experience in data engineering, with deep expertise in Databricks.
Soft Skills
• Strong problem-solving and analytical skills.
• Ability to work in fusion teams (business, engineering, AI/ML).
• Clear communication and documentation abilities.
About Us
At Codvo, we are committed to building scalable, future-ready data platforms that power business impact. We believe in a culture of innovation, collaboration, and growth, where engineers can experiment, learn, and thrive. Join us to be part of a team that solves complex data challenges with creativity and cutting-edge technology.