Senior Manager Technology - Bengaluru, India - Mashreq

    Banking / Loans
    Description

    Data & Analytics Driven Organization: The One Data and Finance Reporting function drives the vision of transforming the Bank into a Data Driven Organization (DDO) using the latest tools and technologies in the data space. The Wholesale & Ops Data team is a stream within the One Data organization driving this agenda for the Wholesale & Ops banking function.

    · The Data Engineer will design and maintain real-time and batch data pipelines; create the data strategy, data architecture, and best practices; and lead the organization in making strategic improvements to its data engineering solutions. A minimal sketch of such a batch pipeline appears after these bullets.

    · Our core function spans multiple platforms, such as the Enterprise Data Lake, Azure Data Lake, Business Intelligence, Advanced Analytics, and real-time streaming platforms.

    · The work involves developing new requirements in both on-premises and cloud data lakes: estimating effort, designing effective solutions, data architecture and management, change and release management, resource management, and handling deliverables across the Bank's business groups. It also covers critical applications that influence the Bank's management decision-making, regulatory reporting (where errors can carry financial and non-financial penalties), and operational reporting.
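    By way of illustration only, and not as part of the formal requirements, a minimal PySpark sketch of the kind of batch ingestion pipeline described above might look as follows. All paths, column names, and the application name are hypothetical placeholders, not Mashreq systems.

```python
# Minimal batch-ingestion sketch: raw landing-zone files promoted into a
# curated lake zone. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("txn_batch_ingest").getOrCreate()

# Read one business date of raw CSV extracts (illustrative path).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("/landing/core_banking/transactions/dt=2024-01-01/"))

# Normalise column names and stamp each row for lineage and auditability.
curated = (raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
              .withColumn("ingested_at", F.current_timestamp()))

# Land the result in the curated zone, partitioned by business date via path.
curated.write.mode("overwrite").parquet("/curated/transactions/dt=2024-01-01/")
```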

    Key Result Areas:

    • Data Architecture Design: A primary responsibility of the Senior Data Engineer is to design and maintain the data architecture. This involves creating scalable, efficient, and robust data pipelines, data models, and data integration strategies.
    • Business-focused deliverables:
    • Build a scalable and robust data visualization platform aligned with the organization's strategic tools and technologies.
    • Develop and maintain scalable Business Intelligence solutions for Wholesale & Operations, Risk, Compliance, and Client Experience (CX) digitalization.
    • Understand clients' problems and decide how to meet their business needs.
    • Demonstrate proficiency in requirement gathering, finalizing scope, building consensus across diverse stakeholders, formulating BRDs, concluding UAT, running demos and trainings, and tracking Agile Scrum ceremonies.
    • Team Leadership and Management: Leading a team of data engineers is a crucial aspect of this role. KRAs include managing team members (both FTEs and ECs), assigning tasks, mentoring junior engineers, and fostering a collaborative and productive work environment.
    • Data Pipeline Development: Developing and optimizing data pipelines is essential to ensure smooth data flow from various sources to the data warehouse or data lake. This KRA involves implementing ETL/ELT processes, data ingestion, and transformation workflows.
    • Data Quality and Governance: Ensuring the quality, integrity, lineage, and security of data is another vital aspect of this role. This involves implementing data governance policies, data monitoring mechanisms, and error-handling procedures (see the illustrative sketch after this list).
    • Performance Optimization: The role is responsible for optimizing data processing and query performance. This may involve tuning database queries, optimizing data storage, and leveraging distributed computing technologies.
    • Technology Selection and Integration: Keeping up with the latest trends and advancements in data engineering technologies is crucial. Evaluating and integrating new tools and frameworks to enhance the data engineering processes can be a significant KRA.
    • Monitoring and Troubleshooting: Monitoring new and existing data pipelines to identify and resolve issues is an essential responsibility. This KRA involves setting up monitoring systems and establishing procedures for quick troubleshooting and problem resolution.
    • Scalability and Resilience: Ensuring that the data engineering infrastructure can scale with increasing data volume and handle failures gracefully is critical. This KRA involves designing for high availability and fault tolerance.
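    To make the Data Pipeline Development and Data Quality KRAs above concrete, the following is a minimal, hypothetical PySpark data-quality gate of the sort such pipelines typically run before publishing data for regulatory or operational reporting. The checks, paths, and column names are illustrative assumptions, not Mashreq standards.

```python
# Illustrative data-quality gate: validate a curated dataset and fail fast
# on breaches before promoting it. All names and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()
df = spark.read.parquet("/curated/transactions/dt=2024-01-01/")

total = df.count()
checks = {
    # The primary key must be present and unique.
    "null_txn_id": df.filter(F.col("txn_id").isNull()).count(),
    "dup_txn_id": total - df.select("txn_id").distinct().count(),
    # Amounts are assumed non-negative for this hypothetical feed.
    "negative_amount": df.filter(F.col("amount") < 0).count(),
}

breaches = {name: n for name, n in checks.items() if n > 0}
if breaches:
    # Stop the pipeline: downstream regulatory reports must not see bad data.
    raise ValueError(f"DQ gate failed on {total} rows: {breaches}")

# Only a clean dataset is promoted to the published zone.
df.write.mode("overwrite").parquet("/published/transactions/dt=2024-01-01/")
```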

    Knowledge, Skills and Experience

    Core Skills:

    • 10–12 years of total experience in software development.
    • 7+ years of experience with Big Data, Hadoop, data lake and Data Mesh architectures, Azure cloud, Microsoft SQL Server, Unix, platform migrations, Apache Spark, PySpark, Hive, Azure Databricks, and ADF.
    • Should have managed the successful execution of at least one large engagement through to production go-live.
    • Should have good knowledge of platform management across cloud, DWH, Azure, and data lake environments.
    • Good grounding in data management fundamentals: data architecture, modelling, governance, and data security.
    • Strong in writing and tuning data ingestion jobs and Spark applications using Python, Hive, Azure Databricks, Azure Data Factory, and Azure SQL.
    • Proficient in SQL, Python, C#, Java, or another JVM-based language.
    • Proficient in Spark, Hive, Sqoop, ADLS, ADB, ADF, and Kafka (an illustrative streaming sketch follows this list).
    • Public/Private cloud experience in AWS and/or Azure.
    • Good knowledge of RDBMS and NoSQL database design and best practices.
    • Experience in DWH to Data Lake offloading.
    • Proficient in Data Modeling and Data Governance concepts.
    • Good domain knowledge in the Banking/Finance area.
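    As a purely illustrative companion to the streaming and Kafka skills listed above, a minimal Spark Structured Streaming job reading from Kafka might look as follows. The broker address, topic, schema, and paths are hypothetical, and the spark-sql-kafka connector is assumed to be on the classpath.

```python
# Minimal real-time pipeline sketch: consume JSON payment events from Kafka
# and stream them into the lake. All names here are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("payments_stream").getOrCreate()

schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
])

# Subscribe to a hypothetical payments topic.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "payments")
          .load())

# Kafka delivers bytes; parse the JSON value into typed columns.
parsed = (events
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Write to the lake with checkpointing so the stream can recover on failure.
query = (parsed.writeStream
         .format("parquet")
         .option("path", "/lake/streaming/payments/")
         .option("checkpointLocation", "/lake/checkpoints/payments/")
         .start())

query.awaitTermination()
```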