Data Modeler - Bengaluru, India - Course5i

    Description

    OVERVIEW


    Course5 Intelligence

    We enable organizations to make the most effective strategic and tactical moves relating to their customers, markets, and competition at the rapid pace that the digital business world demands.

    Founded in 2000, our business areas include Market Intelligence, Big Data Analytics, Digital Transformation, Artificial Intelligence, and Analytics.

    Rapid advances in Artificial Intelligence and Machine Learning technology have enabled us to create disruptive technologies and accelerators under our Course5 Intelligence suites that combine analytics, digital, and research solutions to provide significant and long-term value to our clients.

    More information can be found at

    Global Offices: United States | United Kingdom | United Arab Emirates | India | Singapore


    JOB RESPONSIBILITIES:

    • Responsible for designing, deploying, and maintaining data models for the DWL layer.
    • Responsible for creating STMs and ER data models for every entity.
    • Understanding client work intake at the granularity and business level, and modeling the requirement into a new or existing business model.
    • Evaluate new and upcoming data solutions and make recommendations for adoption across existing and new data entities.
    • Responsible for maintaining and optimizing the global EDW design, and for mapping each entity to the common format according to its functional role.
    • Creating dynamic DDLs and scripts as per requirements.
    • Guiding the ETL team on the data model and the functional role of each entity.
    • Responsible for migrating data entities across Dev and QA environments.
    • Creating STMs and data models in an ER tool.
    • Identify gaps in the existing platform and improve its quality, robustness, maintainability, and speed.
    • Perform development, QA, and DevOps roles as needed to take end-to-end responsibility for solutions.

    Requirements & Qualifications:

    • Experience building, maintaining, and improving data processing pipelines and data routing in large-scale environments.
    • Fluency in common query languages, API development, data transformation, and integration of data streams.
    • Strong experience with large-dataset platforms (e.g., Azure SQL Database, Teradata).
    • Experience with Azure Synapse is preferable.
    • Fluency in multiple programming languages, such as Python, shell scripting, SQL, Java, or similar languages and tools appropriate for large-scale data processing.
    • Experience with any ER modeling tool.
    • Experience acquiring data from varied sources such as APIs, data queues, flat files, and remote databases.
    • Basic Linux administration skills and multi-OS familiarity (e.g., Microsoft Windows, Linux).
    • Data pipeline and data processing experience using common platforms and environments.
    • Understanding of traditional Data Warehouse components (e.g. ETL, Business Intelligence Tools).
    • Creativity to go beyond current tools to deliver the best solution to the problem.
    • 5+ years of experience working in data processing environments.
    Course5 is proud to be an equal opportunity employer.

    We are committed to equal employment opportunity regardless of race, color, religion, sex, sexual orientation, age, marital status, disability, gender identity, or other protected characteristics.

    If you have a disability or special need that requires accommodation, please let us know at any stage of the hiring process so that we can make the necessary accommodations.