Data Engineer - New Delhi, India - Terminus



    Description

    Role

    We are looking for a Data Engineer to join our team who will develop machine-learning-based programs that automate data management and transformation at scale and optimize the data pipeline for our AI and analytics technology stack. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.

    Responsibilities will include:

    • Design, develop, maintain, and deploy data and ETL pipeline solutions using Apache Airflow, Kubernetes, Python (RESTful server-side APIs), and cloud services (AWS)
    • Build algorithms and prototypes
    • Contribute to the implementation of data architecture(s) and data model design
    • Develop solutions that leverage existing frameworks and libraries and align with the overall company architecture and vision
    • Participate in architecture discussions and help shape the overall direction of Terminus platforms
    • Work in a team environment using Agile project approaches
    • Maintain documentation on data architecture standards, protocols, frameworks, and techniques, and identify opportunities for documentation improvement
    • Explore ways to enhance data quality and reliability
    • Actively develop skills, knowledge, and abilities to stay current with new developments in data architecture and related areas such as data integration, application programming interfaces, and data management

    Reporting:

    This role reports to the Tech Lead Data Engineer and will work and interact with the engineering and product teams in the United States.

    Minimum Requirements

    • Experience designing, building, and maintaining data processing systems
    • 5+ years of Python programming experience
    • 3+ years of SQL experience
    • 5+ years of API experience
    • 2+ years of AWS experience

    Nice to Have

    • Machine learning concept knowledge
    • Scripting, reporting & data visualization
    • Knowledge of ChatGPT

    Location: Remote

    More information is available at