Snowflake Developer - Hyderabad, India - Career Soft Solutions Pvt. Ltd.

    Career Soft Solutions Pvt. Ltd. Hyderabad, India

    Description

    Skillset: Snowflake

    Experience: Years

    Job Location: Hyderabad/Chennai/Bangalore/Bhubaneswar/Gurgaon/Pune

    Job Description:

    • Hands-on experience with Snowflake DB
    • Ability to handle DML, DDL, procedure creation, performance tuning, etc.
    • Proficient in performing root cause analysis of existing models and proposing solutions
    • Identify opportunities to improve, speed up and innovate the data models
    • Ability to build data processes, pipelines, workflows, dependencies, data structures and data transformations
    • Familiar with Snowflake architecture and warehouse implementation and management
    • Working experience of cloud computing architecture
    • Experience in working with ETL tools
    • Working experience with Unix shell or Python scripting
    • Should be able to write data processing scripts in Python: connect to a source database, join multiple tables, and handle the data
    • Should be familiar with standard data-processing libraries
    • Understanding of the deployment process and Git tooling is expected
    • Strong English communication skills.
    • Strong analytical and interpersonal skills
    • Ability to prioritize and work independently

    Roles and Responsibilities

    • Developing and maintaining data architecture and data models
    • Create standardized procedures for data flows using Python scripting
    • Conduct efficient data integration with other third-party tools and Snowflake
    • Create and maintain the documentation of the architecture, data models and maintenance activities
    • Review and audit existing data models and solutions, and propose improved processes
    • Perform tuning, testing and problem analysis
    • Identify, design, implement and deploy new Snowflake based data architecture
    • Automate manual processes and optimize the data flows
    • Collaborate with the data science experts, BI developers and analysts to create custom data models and integrations with Snowflake
    • Maintain optimal data pipelines with ETL tools
    • Resolve production issues to ensure seamless data processing

    Regards,

    Vignesh