Surveillance Data Modeler/Architect - Pune, India - F337 Deutsche India Private Limited, Pune Branch

    Description

    About the team:

    Surveillance & Regulatory:

    We are responsible for delivering solutions that protect Deutsche Bank's financial and reputational interests from criminal or inappropriate behavior.

    Through real-time sanction and embargo filtering, operational risk controls, and sophisticated fraud detection that will increasingly deploy artificial intelligence, our systems protect both clients and the bank, allow risks to be managed according to risk appetite, and keep Deutsche Bank in compliance with global and local regulations.

    You will be a key member of the Surveillance Analytics squad in the Surveillance & Regulatory tribe. This position is within the Corporate Bank - Transaction Surveillance Technology area.

    Transaction Surveillance Technology is an integral part of Cash Management and has been established as a first line of defence function to build appropriate controls that ensure regulatory compliance.


    The Surveillance Analytics squad assists in the following areas:
    Design and implementation of Data Engineering pipelines driven by use cases defined by both internal and external stakeholders.

    Work with the Domain Architect, Data Architect, Data Control & Governance squad, AFC (Anti-Financial Crime) and Solution Architect to arrive at the optimum solution for a given risk appetite across different payment / product / geography mixes.

    Help respond to audit and regulatory bodies' requests for data (across different geography, payment, customer, and product dimensions).

    What we'll offer you

    As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
    Best in class leave policy
    Gender neutral parental leaves
    100% reimbursement under childcare assistance benefit (gender neutral)
    Sponsorship for industry relevant certifications and education
    Employee Assistance Program for you and your family members
    Comprehensive hospitalization insurance for you and your dependents
    Accident and term life insurance
    Complimentary health screening for 35 yrs. and above

    Your key responsibilities

    As a Lead Data Engineer within the Surveillance Analytics space, you will be responsible for leading end-to-end communication and the design and implementation of Data Engineering pipelines (POC to production and post-production support) driven by use cases defined by both internal and external stakeholders.

    You will work closely with the Domain Architect, Data Architects, Data Analysts, Data Scientists, the Data Control & Governance team, and Machine Learning Engineers to understand their needs from a technical as well as a data control & governance perspective.

    As a Lead Data Engineer, you must have hands-on experience designing and developing data pipelines in the Google Cloud ecosystem using Apache Airflow (Cloud Composer), Apache Beam (Dataflow), or a combination of both, together with BigQuery and Cloud Storage, depending on each use case.
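    For illustration only: a minimal sketch of how such a pipeline might be orchestrated as a Cloud Composer (Airflow) DAG. All identifiers below (DAG name, bucket, project, dataset, stored procedure) are hypothetical placeholders, not real Deutsche Bank systems.

        # Hypothetical nightly batch DAG on Cloud Composer (Airflow 2.4+).
        # All identifiers (bucket, project, dataset, table) are placeholders.
        import pendulum
        from airflow import DAG
        from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
        from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

        with DAG(
            dag_id="surveillance_batch_ingest",   # hypothetical name
            schedule="0 2 * * *",                 # nightly run
            start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
            catchup=False,
        ) as dag:
            # Land raw payment files from Cloud Storage into a BigQuery staging table.
            load_raw = GCSToBigQueryOperator(
                task_id="load_raw_payments",
                bucket="example-landing-bucket",
                source_objects=["payments/{{ ds }}/*.csv"],
                destination_project_dataset_table="example_project.staging.payments",
                source_format="CSV",
                write_disposition="WRITE_TRUNCATE",
            )

            # Enrich the staged data into a curated table for downstream use;
            # the stored procedure called here is a placeholder.
            enrich = BigQueryInsertJobOperator(
                task_id="enrich_payments",
                configuration={
                    "query": {
                        "query": "CALL `example_project.curated.enrich_payments`('{{ ds }}')",
                        "useLegacySql": False,
                    }
                },
            )

            load_raw >> enrich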

    The standard pipeline may include batch data migration to the cloud, data transformations of varying complexity, data enrichment, and data serving for various use cases such as data analysis or machine learning training.
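    As a hedged sketch, those same steps could be expressed as an Apache Beam batch job runnable on Dataflow. The bucket, table, and the enrich function below are illustrative assumptions, not the team's actual logic.

        # Minimal Apache Beam batch sketch: read JSON from Cloud Storage,
        # enrich each record, and serve the result to BigQuery.
        import json
        import apache_beam as beam
        from apache_beam.options.pipeline_options import PipelineOptions

        def enrich(record: dict) -> dict:
            # Placeholder enrichment: flag large transfers for analysis.
            record["high_value"] = record.get("amount", 0) > 10_000
            return record

        # Pass --runner=DataflowRunner (plus project/region) to run on Dataflow.
        options = PipelineOptions()
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/payments/*.json")
                | "Parse" >> beam.Map(json.loads)
                | "Enrich" >> beam.Map(enrich)
                | "WriteToBQ" >> beam.io.WriteToBigQuery(
                    "example_project:curated.payments",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )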

    As part of a development team, you will collaborate with various stakeholders to understand requirements and refine stories (agile methodology).

    Design, implement, test, and support the above solutions in production (in DevOps style).
    Take ownership of your own career management, seeking opportunities for continuous development of personal capability and improved performance contribution.

    Lead an agile (Scrum) team of engineers and provide technical leadership.
    Mentor junior team members.


    Good to have:


    Adopt an automation-first approach to testing, deployment, security, and compliance of solutions through Infrastructure as Code and automated policy enforcement (see the testing sketch after this list).

    Take ownership of creating development standards and best practices, and lead innovative solution approaches.

    Provide Level 3 support for technical components and contribute to problem and root cause analysis.
    Collaborate with Functional Analysts and Technical Specialists to complete work.
    Ensure that the Bank's SDLC controls are always adhered to.

    Participate in agile ceremonies and contribute to backlog refinement and planning sessions.
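    To illustrate what automation-first testing can mean in this stack, here is a minimal sketch of a unit test for the hypothetical enrich transform from the pipeline sketch above, using Beam's testing utilities; a CI system such as GitHub Actions could run it on every commit.

        # Hypothetical pytest-style unit test for the enrich transform.
        import apache_beam as beam
        from apache_beam.testing.test_pipeline import TestPipeline
        from apache_beam.testing.util import assert_that, equal_to

        def enrich(record: dict) -> dict:
            # Same placeholder enrichment as in the pipeline sketch.
            record["high_value"] = record.get("amount", 0) > 10_000
            return record

        def test_enrich_flags_high_value_payments():
            with TestPipeline() as p:
                result = (
                    p
                    | beam.Create([{"amount": 50_000}, {"amount": 100}])
                    | beam.Map(enrich)
                )
                assert_that(
                    result,
                    equal_to([
                        {"amount": 50_000, "high_value": True},
                        {"amount": 100, "high_value": False},
                    ]),
                )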
    Your skills and experience

    At least 12 years' experience; specifically, having worked on at least one large project of two years' duration.

    Experience in Data Engineering and ETL (extract, transform, load) applications on the GCP platform, working with different communication protocols and data formats.

    Hands-on experience with data analysis, data governance, and data pipeline performance tuning and optimization.

    Exposure to and involvement in all SDLC phases (conception, requirements, design, development, test, and rollout).
    Has provided technical leadership to a team of 5 to 10 people on at least one such project of high complexity.


    Full technology stack:
    [Must Haves] Python, SQL
    [Strong Plus] Apache Airflow (Cloud Composer), Apache Beam (Dataflow), BigQuery, Cloud Storage
    [Support Techs] Git, GitHub Actions, Terraform, DevOps, OpenShift 4.0, Kubernetes, Jenkins, etc.
    Engineering qualification with strong hands-on experience of working on enterprise-level big data platforms and solutions.
    Candidates with experience in the payments domain and a GCP Data Engineering certification will be preferred.

    How we'll support you

    Training and development to help you excel in your career.
    Coaching and support from experts in your team.
    A culture of continuous learning to aid progression.

    A range of flexible benefits that you can tailor to suit your needs.