Senior Data Engineer - Pune, India - ZENEX STAFFING SOLUTIONS PRIVATE LIMITED



    Job Description:
    Data Engineer (Snowflake).

    Experience: 5-9 years.


    Location: Onsite (Nagpur or Pune).


    Mandatory Skills: SQL, Python, Snowflake, Data Modelling, ETL, Snowpark.

    Good to Have Skills: DBT (Data Build Tool), API Integration (AWS Lambda), Git, AWS S3 Integration.


    Knowledge, Skills and Experience:


    • Proficiency in crafting and optimizing complex SQL queries and stored procedures for data transformation, aggregation, and analysis within the Snowflake platform.


    • Experience with the Snowflake cloud data warehousing service, including data loading, querying, and administration.
    • Ability to design and implement data models, applying both relational and dimensional modeling techniques, within Snowflake.
    • In-depth understanding of ETL processes and methodologies, leveraging Snowflake's capabilities.
    • Familiarity with DBT (Data Build Tool) for data transformation and modeling within Snowflake.
    • Expertise in integrating Snowflake with AWS S3 storage for data storage and retrieval.
    • Proficiency in Snowpark for DataFrame-style data processing that runs directly inside Snowflake (a brief Python sketch follows this list).
    • Skill in API integration, specifically integrating Snowflake with AWS Lambda for data workflows (see the Lambda sketch after this list).
    • Adeptness in version control using Git and GitHub for collaborative code management.
    • Adeptness in troubleshooting data-related issues within the Snowflake ecosystem, ensuring data quality and consistency.
    • Skill in creating clear and concise technical documentation, facilitating communication and knowledge sharing.
    • Designing efficient and well-structured data schemas within Snowflake.
    • Utilizing Snowflake's features for data warehousing, scalability, and performance optimization.
    • Leveraging Python programming for data processing, manipulation, and analysis in a Snowflake environment.
    • Implementing data integration and transformation workflows using DBT.
    • Writing and maintaining scripts for data movement and processing using cloud integrations (e.g., AWS S3 and Lambda).
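
    To illustrate the Snowpark item above, the minimal Python sketch below runs a DataFrame-style aggregation inside Snowflake. The connection parameters, table names (RAW_ORDERS, AGG_CUSTOMER_REVENUE), and column names are hypothetical placeholders for illustration, not details taken from this posting.

    # Minimal Snowpark (Python) sketch; all names below are illustrative assumptions.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, sum as sum_

    # Hypothetical connection parameters -- in practice supplied via config or secrets.
    connection_parameters = {
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }

    session = Session.builder.configs(connection_parameters).create()

    # DataFrame-style transformation that executes inside Snowflake:
    # aggregate completed order amounts per customer and persist the result.
    orders = session.table("RAW_ORDERS")  # hypothetical source table
    customer_revenue = (
        orders.filter(col("STATUS") == "COMPLETE")
              .group_by(col("CUSTOMER_ID"))
              .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
    )
    customer_revenue.write.mode("overwrite").save_as_table("AGG_CUSTOMER_REVENUE")

    session.close()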
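
    For the AWS Lambda / API integration item, here is a hedged sketch of a Lambda handler that reacts to an S3 upload and issues a COPY INTO through the Snowflake Python connector. The stage and table names (S3_LANDING_STAGE, RAW_ORDERS) and the environment variable names are assumptions for illustration only.

    # Minimal AWS Lambda sketch: load a newly arrived S3 file into Snowflake.
    # Credentials would normally come from AWS Secrets Manager or environment
    # variables; the stage is assumed to already point at the S3 bucket.
    import os
    import snowflake.connector

    def lambda_handler(event, context):
        # S3 put event: extract the object key of the newly arrived file.
        record = event["Records"][0]
        object_key = record["s3"]["object"]["key"]

        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse=os.environ["SNOWFLAKE_WAREHOUSE"],
            database=os.environ["SNOWFLAKE_DATABASE"],
            schema=os.environ["SNOWFLAKE_SCHEMA"],
        )
        try:
            cur = conn.cursor()
            # Load just the uploaded file from the external stage into the raw table.
            cur.execute(
                "COPY INTO RAW_ORDERS FROM @S3_LANDING_STAGE "
                "FILES = (%s) FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)",
                (object_key,),
            )
            return {"statusCode": 200, "body": f"Loaded {object_key}"}
        finally:
            conn.close()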