GM – Data Engineering - Pune, India - Antal International

    Job Description


Job Title: GM – Data Engineering
Job Type: Permanent
Location: Pune
Experience: 23 to 28 Years
Skill Set: GCP, Snowflake, Airflow, dbt, SQL, Python, Data Extraction, Data Transformation, Data Architecture, Data Pipeline, Data Ingestion, Data Warehousing, Data Engineering, Data Lake, Data Governance, Data Flow, Data Quality

    Key Responsibilities:
Local Line Management, Recruitment & Team Development:

You will manage several multi-disciplinary data delivery teams consisting of Data & Analytics Engineers and Test Engineers, with Data Scientists and Data Visualization specialists embedded as required.

    The data teams are expanding rapidly, and you will play a key role in recruitment across your teams and support the ongoing learning and development of your team members.


    Data Delivery:
You will be responsible for the delivery performance of your teams and will ensure key delivery metrics are closely monitored, allowing you to provide support where needed, communicate effectively on progress, and identify opportunities to improve and optimize our data delivery processes.

    Data Architecture & Solution Design:

You will support the continual improvement and optimization of our data architecture, working closely with other data managers and our data architecture function to ensure we understand emerging trends in the data landscape and evaluate them appropriately as part of our longer-term data strategy.

You will also be involved in supporting solution design for delivery through your team, in line with data architecture standards and principles.

Requirements:

    Communication:
You should demonstrate strong written and verbal communication skills and be comfortable communicating and building relationships with stakeholders at all levels, up to and including C-level.

    Management:
    You should have prior experience managing a data team, ideally in a medium to large-scale organization.

This would be a perfect opportunity for someone looking to extend their management remit across multiple teams and gain experience in building up data teams.

    Experience managing multi-disciplinary, off-site, and multi-cultural teams would also be beneficial.


    Agile Delivery:
You should have experience working in an Agile delivery environment, ideally using Scrum and/or Kanban.

SQL (mandatory): You should be able to demonstrate a strong understanding of SQL and be comfortable reading and writing complex SQL queries, ideally across multiple platforms.

Cloud Platforms (mandatory): You should have experience working with key services on either GCP (preferred), AWS, or Azure.

    Key services include cloud storage, containerization, event-driven services, orchestration, cloud functions and basic security/user management.
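As an illustration of the kind of "complex SQL" the role calls for, here is a small, hypothetical example using a window function to rank rows per group. Table and column names are invented; it runs against SQLite here, but the same SQL works on most platforms the posting mentions:

```python
import sqlite3

# In-memory database with a hypothetical orders table (names are invented).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', '2024-01-05', 120.0),
  ('alice', '2024-02-10', 80.0),
  ('bob',   '2024-01-20', 200.0),
  ('bob',   '2024-03-01', 50.0);
""")

# Window function: rank each customer's orders by amount, largest first.
rows = conn.execute("""
SELECT customer, order_date, amount,
       RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS amount_rank
FROM orders
ORDER BY customer, amount_rank
""").fetchall()

for row in rows:
    print(row)
```

The `PARTITION BY` / `ORDER BY` window clause is the sort of construct that distinguishes routine SQL from the advanced usage expected here.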

    Data Warehousing (highly desirable): You should have experience working on a medium to large-scale data warehouse solution irrespective of underlying technology.
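For context, warehouses of this kind are commonly organized with dimensional modelling: fact tables holding measures, joined to dimension tables holding descriptive attributes. A minimal, hypothetical star-schema sketch (table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension table: descriptive attributes (hypothetical names).
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
-- Fact table: measures plus a foreign key into the dimension.
CREATE TABLE fact_sales (product_key INTEGER, quantity INTEGER, revenue REAL);

INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
INSERT INTO fact_sales VALUES (1, 3, 30.0), (1, 1, 10.0), (2, 2, 120.0);
""")

# A typical dimensional query: aggregate facts grouped by a dimension attribute.
totals = conn.execute("""
SELECT d.category, SUM(f.revenue) AS total_revenue
FROM fact_sales f
JOIN dim_product d ON d.product_key = f.product_key
GROUP BY d.category
ORDER BY d.category
""").fetchall()

print(totals)
```

The same fact/dimension split underlies conceptual, logical, and physical modelling stages regardless of the warehouse technology used.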

Ideally, you will have experience working on the design and data modelling stages of a data warehouse project and be comfortable with conceptual, logical, and physical data modelling techniques, as well as dimensional modelling techniques.

CI/CD & Automation (desirable): Any experience developing or supporting data CI/CD pipelines, regardless of tooling, would be beneficial.

We use Microsoft Azure DevOps to run most of our CI/CD pipelines.

    We also rely heavily on Infrastructure as Code for cloud infrastructure deployment so any experience with technology such as Terraform would be beneficial in this respect.
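As a rough illustration of the Terraform-style IaC referenced above, here is a minimal sketch of provisioning a GCS bucket; the project ID, region, and bucket name are placeholders, not values from this role:

```hcl
# Hypothetical example: provision a GCS bucket with the Google provider.
provider "google" {
  project = "my-example-project"   # placeholder project ID
  region  = "asia-south1"          # illustrative region only
}

resource "google_storage_bucket" "data_lake_raw" {
  name                        = "my-example-raw-zone"  # bucket names are globally unique
  location                    = "ASIA-SOUTH1"
  uniform_bucket_level_access = true
}
```

Declaring infrastructure this way lets the same cloud resources be reviewed, versioned, and redeployed through the CI/CD pipelines described above.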


    Data Visualization (desirable):

    Although we have dedicated data visualization specialists within the team, any knowledge of, or experience with, data visualization platforms such as Tableau (preferred), Power BI, Looker or Quick Sight would be beneficial.

Ingest, cleanse, and transform data from a wide variety of source systems into our cloud data lake to support advanced analytics, data warehousing, and data science.

Technical Skills & Knowledge:
Advanced SQL knowledge, with experience using a wide variety of source systems including Microsoft SQL Server
Experienced in Cloud Data Engineering on Google Cloud Platform (experience of AWS and Azure also beneficial, but delivery will be focused on GCP)

Specific experience with the following services running on the GCP platform:
Google Cloud Storage (GCS)
Google Cloud Composer (or Apache Airflow), including development of DAGs
Google Kubernetes Engine (GKE), or equivalent experience working with containerization
Google Cloud Functions

Experience working with Infrastructure as Code (IaC), specifically with Terraform (equivalent experience with similar technology also accepted)
Experience of working in an Agile delivery team using automated build, deployment, and testing (CI/CD, DevOps, DataOps)
Experience with one or more programming languages compatible with developing functionality on the above platform (Python or Java preferred)
Knowledge or experience in the field of data warehousing and advanced analytics would be beneficial but not essential, specifically any experience in the following areas:
Dimensional Modelling
Working with Snowflake's Cloud Data Warehouse (Google BigQuery experience also beneficial)

Working with dbt to develop, test, deploy, and monitor data transformation code

Qualification & Certification:
B.E./B.Tech/M.Tech in IT or Computer Science from a reputed institute (preferred), or a Master's degree in a quantitative subject, e.g. Mathematics, Statistics, or Economics
Cloud certification in GCP, with the following being preferred (AWS and Azure certifications also beneficial):

Google Certified Cloud Architect
Google Certified Data Engineer

Any certification or formal training in the following areas would also be highly beneficial:
Python
Snowflake Cloud Data Warehouse
dbt
Terraform
