    Technical Analyst - Greater Noida, India - Coforge


    Job Description

    Arch Global MI Data Engineer - Hands-on experience in Snowflake, Scala and/or Java, and Azure or AWS for development and deployment.

    Job Responsibilities

    - Design and implement core data analytics platform components that can be shared with and extended by different analytics groups within Arch.
    - Perform data modeling and create data marts / data vaults.
    - Review approaches and completed data pipelines against platform best practices and patterns.
    - Design and maintain a common data flow pipeline for data transformation activities such as Extract, Transform, Load (ETL).
    - Support and troubleshoot data flow activities in cloud data warehouse environments.
    - Develop data pipeline code using Python, Java, AWS Lambda and/or Azure Data Factory (see the sketch after the skills list below).
    - Perform requirements planning, monitoring, and end-to-end requirements management throughout the data asset development life cycle.
    - Direct and help other developers and analysts to ensure data platform patterns are adhered to.

    Desired Skills

    Experience with or knowledge of the following:

    - Designing, building, and documenting data pipelines (ADF, Snowpipe, AWS/S3)
    - Microsoft Power BI
    - Python and AWS Lambda
    - Linux/PowerShell scripting
    - AWS Lambda or Azure Functions
    - Data modeling tools such as dbt and Adept
    - Prior experience in Java, Python, or similar programming languages
    - Data visualization tools such as Power BI
    - Exposure to Azure or AWS cloud environments for development and deployment
    - Exposure to a data governance tool such as Collibra
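    As an illustration of the pipeline work described above, here is a minimal sketch of an AWS Lambda handler in Python that reacts to S3 uploads, applies a trivial transform, and lands the result in a bucket monitored by Snowpipe auto-ingest. The bucket name, key prefix, and transform logic are hypothetical placeholders, not details from this posting.

    ```python
    import json
    import boto3

    s3 = boto3.client("s3")

    # Hypothetical bucket watched by a Snowpipe auto-ingest pipe via S3 event
    # notifications (name is illustrative only).
    STAGE_BUCKET = "analytics-snowpipe-stage"

    def lambda_handler(event, context):
        """Triggered by S3 ObjectCreated events on a raw-data bucket."""
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            # Extract: read the raw object from the source bucket.
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

            # Transform: drop blank lines (placeholder for real business logic).
            cleaned = "\n".join(line for line in body.splitlines() if line.strip())

            # Load: write to the stage bucket; Snowpipe picks the file up from
            # there and copies it into the target Snowflake table.
            s3.put_object(
                Bucket=STAGE_BUCKET,
                Key=f"cleaned/{key}",
                Body=cleaned.encode("utf-8"),
            )

        return {"statusCode": 200, "body": json.dumps({"processed": len(event["Records"])})}
    ```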

    Skills Required

    SNOWFLAKE

    Location

    Greater Noida

    Desirable Skills

    SCALA
