
    Data Engineer/ETL Developer - Hyderabad, India - Kanerika Software

    Description
    As a Data Engineer/ETL Developer, you will be responsible for designing, developing, and maintaining our global data warehouse and data marts on the Snowflake platform. You will play a pivotal role in architecting solutions for complex data integration challenges, ensuring the scalability, performance, and reliability of our data pipelines.

    Requirements


    Primary Responsibilities:


    Design, develop, and implement ETL processes using Snowflake technologies to extract, transform, and load data from various source systems into our global data warehouse (a minimal sketch follows this list).

    Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications for data integration and ETL workflows.

    Optimize ETL processes for performance, scalability, and reliability to support the processing of large volumes of data.

    Develop and maintain data models, schemas, and structures within Snowflake to support efficient data storage and retrieval.

    Implement data quality checks, validation routines, and error-handling mechanisms to ensure the accuracy and integrity of data.

    Work closely with data engineers, data analysts, and business stakeholders to troubleshoot issues, identify opportunities for optimization, and drive continuous improvement of our data solutions.

    Document technical designs, configurations, and best practices for ETL processes and data warehouse components.

    Stay updated on industry trends and emerging technologies related to data warehousing, ETL, and cloud computing, and provide recommendations for adopting new tools and techniques to enhance our data capabilities.
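
    For illustration only (not part of the original posting): a minimal sketch of the kind of load-transform-validate step these responsibilities describe, assuming the snowflake-connector-python package; the table, stage, and warehouse names (STG_ORDERS, FCT_ORDERS, @orders_stage, ETL_WH) are hypothetical.

```python
# Illustrative sketch only: one load -> transform -> validate step.
# All object and connection names below are hypothetical assumptions.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",
    database="DW",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Extract/load: copy staged files into a staging table.
    cur.execute(
        "COPY INTO STG_ORDERS FROM @orders_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Transform: cast and move rows into the fact table
    # (the SQL executes inside Snowflake, ELT-style).
    cur.execute("""
        INSERT INTO FCT_ORDERS (order_id, customer_id, amount, order_date)
        SELECT order_id, customer_id, amount::NUMBER(12,2), order_date::DATE
        FROM STG_ORDERS
    """)
    # Data quality check: abort the run if any loaded row lacks a key.
    cur.execute("SELECT COUNT(*) FROM FCT_ORDERS WHERE order_id IS NULL")
    null_keys = cur.fetchone()[0]
    if null_keys:
        raise ValueError(f"{null_keys} rows loaded with NULL order_id")
finally:
    conn.close()
```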

    Must-Have Skills:

    1. Strong SQL knowledge.
    2. Deep understanding of data warehousing concepts, dimensional modeling, and data mart design principles.
    3. Experience with Snowflake features such as SnowSQL, Snowpipe, tasks, and stored procedures is preferred (see the sketch after this list).
    4. Experience in creating complex queries and stored procedures.
    5. Proven track record of architecting scalable, performant data warehouse solutions that handle large volumes of data and complex data transformations.
    6. Proficiency with the Azure cloud platform, including Azure Data Factory, Azure Storage, and Azure Databricks.
    7. Proficiency in scripting languages such as Python, shell scripting, or similar for automation and data manipulation.
    8. Proficiency in job scheduling tools (Tidal).
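
    For illustration only: a minimal sketch, again via the Python connector, of a Snowflake stored procedure wrapped in a scheduled task (skill item 3). The object names, schedule, and warehouse are assumptions, and Snowpipe/SnowSQL setup is omitted.

```python
# Illustrative sketch only: create a Snowflake stored procedure and a
# task that runs it nightly. Names (REFRESH_DAILY_SALES, DAILY_SALES_TASK,
# DAILY_SALES, FCT_ORDERS, ETL_WH) are hypothetical.
import os

import snowflake.connector

PROC_DDL = """
CREATE OR REPLACE PROCEDURE REFRESH_DAILY_SALES()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    TRUNCATE TABLE DAILY_SALES;
    INSERT INTO DAILY_SALES (order_date, total_amount)
        SELECT order_date, SUM(amount) FROM FCT_ORDERS GROUP BY order_date;
    RETURN 'ok';
END;
$$
"""

TASK_DDL = """
CREATE OR REPLACE TASK DAILY_SALES_TASK
    WAREHOUSE = ETL_WH
    SCHEDULE = 'USING CRON 0 2 * * * UTC'  -- 02:00 UTC every day
AS
    CALL REFRESH_DAILY_SALES()
"""

conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",
    database="DW",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute(PROC_DDL)
    cur.execute(TASK_DDL)
    cur.execute("ALTER TASK DAILY_SALES_TASK RESUME")  # tasks start suspended
finally:
    conn.close()
```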

    Good-to-Have Skills:

    1. Familiarity with Informatica PowerCenter.
    2. Familiarity with tools like SSIS.
    3. Python/Java.


    Benefits

    1. Culture:

    Open Door Policy: Encourages open communication and accessibility to management.

    Open Office Floor Plan: Fosters a collaborative and interactive work environment.

    Flexible Working Hours: Allows employees to have flexibility in their work schedules.

    Employee Referral Bonus: Rewards employees for referring qualified candidates.

    Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.


    2. Inclusivity and Diversity:


    Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.

    Mandatory POSH training: Promotes a safe and respectful work environment.

    3. Health Insurance and Wellness Benefits:


    GMC and Term Insurance: Offers medical coverage and financial protection.

    Health Insurance: Provides coverage for medical expenses.

    Disability Insurance: Offers financial support in case of disability.


    4. Child Care & Parental Leave Benefits:

    Company-sponsored family events: Creates opportunities for employees and their families to bond.

    Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.

    Family Medical Leave: Offers leave for employees to take care of family members' medical needs.


    5. Perks and Time Off Benefits:

    Company-sponsored outings: Organizes recreational activities for employees.

    Gratuity: Provides a monetary benefit as a token of appreciation.

    Provident Fund: Helps employees save for retirement.

    Generous PTO: Offers more than the industry standard for paid time off.

    Paid sick days: Allows employees to take paid time off when they are unwell.

    Paid holidays: Gives employees paid time off for designated holidays.

    Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.


    6. Professional Development Benefits:

    L&Dwith FLEX Enterprise Learning Repository: Provides access to alearning repository for professionaldevelopment.

    MentorshipProgram: Offers guidance and support from experiencedprofessionals.

    JobTraining: Provides training to enhance jobrelatedskills.

    ProfessionalCertification Reimbursements: Assists employees in obtainingprofessionalcertifications.

    Promotefrom Within: Encourages internal growth and advancementopportunities.
