DataOps Engineer - Navi Mumbai, Mumbai, Mumbai City - confidential


    5 days ago

    Full time ₹900,000 - ₹2,500,000 (INR) per year *
    Description

    We are seeking a highly skilled and experienced DataOps Engineer to join our dynamic team. The ideal candidate will have a strong background in DataOps practices with a focus on AWS and Azure DevOps, Databricks setup and management, PostgreSQL administration, Docker management, CI/CD setup, and Azure/AWS infrastructure management. This role is critical in ensuring the seamless integration and deployment of data infrastructure, enabling efficient and reliable data operations.

    Responsibilities:
    • DevOps Management:
    • Design, implement, and manage DevOps pipelines for automated build, test, and deployment processes using tools such as Git, Azure DevOps, Jenkins, and GitHub Actions, primarily for data workloads.
    • Build and set up Databricks, Snowflake, Kafka, and other cloud-native data services and tools.
    • Collaborate with development teams to integrate code changes and ensure seamless delivery to production.
    • Databricks Setup and Management:
    • Set up and manage Azure Databricks environments for large-scale data processing and analytics.
    • Optimize Databricks clusters and manage costs while ensuring high availability and performance.
    • PostgreSQL Administration:
    • Administer PostgreSQL databases, ensuring their optimal performance, security, and reliability.
    • Perform routine database maintenance tasks such as backups, restoration, and tuning queries for performance.
    • Docker Management:
    • Develop and manage Docker containers to ensure consistency across development, testing, and production environments.
    • Monitor containerized applications for performance and resolve any issues related to container orchestration.
    • CI/CD Setup:
    • Design and implement CI/CD pipelines to automate software deployments and data pipelines.
    • Ensure that the CI/CD pipelines are scalable, secure, and capable of handling large volumes of data.
    • Azure/AWS Infrastructure Management:
    • Manage Azure infrastructure components such as Virtual Networks, Storage Accounts, and Resource Groups.
    • Monitor and optimize the performance of the Azure environment, ensuring scalability and reliability.
    • Implement security best practices across Azure services to protect data and applications.
    • Collaboration and Communication:
    • Work closely with data engineers, software developers, and IT teams to integrate DataOps processes across the organization.
    • Provide technical guidance and mentorship to junior team members on DataOps best practices.
    • Monitoring and Optimization:
    • Implement monitoring solutions to track the health and performance of data pipelines and infrastructure.
    • Continuously optimize processes and infrastructure for cost-effectiveness and efficiency.

    Requirements:

    • Bachelor's degree in Computer Science, Information Technology, or a related field.
    • Minimum of 7 years of hands-on experience in a DataOps or DevOps role, with a strong focus on data infrastructure and cloud platforms.
    • Proficiency in Azure DevOps for managing code repositories, CI/CD pipelines, and build/release processes.
    • Extensive experience in setting up and managing Databricks and Snowflake environments.
    • Strong PostgreSQL administration skills, including performance tuning, backups, and security.
    • Experience with Docker for containerizing applications and managing container orchestration.
    • Hands-on experience in setting up and managing CI/CD pipelines.
    • Expertise in Azure/AWS infrastructure management, including monitoring, security, and cost optimization.
    • Exposure to Amazon SageMaker is nice to have.
    • Certification in Azure DevOps, Databricks, PostgreSQL, or AWS (Solutions Architect or Developer) is preferred.
    • Experience with other cloud platforms (e.g., GCP) is a plus.
    • Knowledge of scripting languages (e.g., Python, PowerShell) for automation tasks.
    • Strong problem-solving skills and attention to detail.
    • Excellent communication and collaboration abilities.
    • Ability to work independently and as part of a team in a fast-paced environment.
    * This salary range is an estimation made by beBee