Kafka architect - Bengaluru, India - Rakuten India

    Description

    About Rakuten:
    Rakuten India is the Development Centre and key technology hub of the Rakuten Group, Inc.

    We enable our businesses with our depth of knowledge in multiple streams of technology such as Mobile and Web Development, Web Analytics, Platform Development, Backend Engineering, Data Science, Machine Learning, Artificial Intelligence and much more.

    Our unique 24/7 support center ensures reliability and sustenance of the Rakuten Ecosystem.

    With dedicated centres of excellence for Mobile Application Development, Data Analytics, Engineering, DevOps and Information Security, we ensure the success of multiple units of Rakuten Group, Inc.

    With 1700+ employees and growing, Rakuten India is housed in Crimson House Bangalore in the heart of the city.

    Our History:
    In 1997, Rakuten first began with Rakuten Ichiba, a B2B2C marketplace, with just six employees, one server, and 13 merchants.

    With the mission of "empowering people and society through innovation and entrepreneurship," the Rakuten group rapidly grew with regional headquarters across the world.


    In 2016, Rakuten India opened its doors in Bangalore, India, the tech city known as the Silicon Valley of India. This research and development center became a key technology hub of the Rakuten Group, championing some of the products and platforms that run its businesses.

    With a growing number of Rakutenians working on the very same mission as Rakuten Group, Inc., we believe that technology and business needs must challenge each other for true innovation to rise and make a telling business impact. We have team members who support Rakuten's global strategy across businesses such as e-Commerce, Digital, Marketing Platforms, Ecosystem Services and more.

    "Walk together" is our guiding philosophy, and together we continue to grow stronger by taking Rakuten's businesses to the next level, not just with existing products but also by creating new ones in areas such as Artificial Intelligence and Machine Learning.

    The Role


    We are looking for a Kafka Architect who can work alongside multiple Production Support, Development, System Admin, Project Management and Engineering teams.


    Key Responsibilities:


    Design, deploy, and manage Kafka clusters in a production environment, ensuring high availability and scalability.
    Collaborate with engineering teams to understand data processing requirements and design Kafka solutions that meet their needs.

    Optimize Kafka configurations and performance to ensure efficient utilization of resources and low-latency data delivery.
    Develop and maintain Kafka-related tools, utilities, and automation scripts to streamline deployment, monitoring, and management tasks.
    Provide guidance and best practices to engineering teams on Kafka usage, data modelling, and integration patterns.
    Work closely with operations teams to monitor Kafka clusters, troubleshoot issues, and perform capacity planning.
    Stay updated with the latest Kafka features, enhancements, and best practices, and evaluate their applicability to our environment.


    Technical skills you should have:


    Understanding of Kafka architecture, including brokers, topics, partitions, producers, consumers, and replication.
    Solid grasp of Kafka's distributed nature, fault tolerance, scalability, and high availability.
    Experience with streaming data processing frameworks like Apache Flink, Apache Spark Streaming, or Kafka Streams.

    Knowledge of Kafka integration patterns and best practices for integration with various data sources.
    Knowledge of containerization and orchestration technologies such as Docker and Kubernetes.

    Experience with cloud platforms like AWS, Azure, or Google Cloud Platform.
    Relevant certifications such as Confluent Certified Developer or Confluent Certified Administrator.
    Very good knowledge of database technologies, both NoSQL and SQL. Working experience with an in-memory DB such as Couchbase would be an advantage.
    Ability to establish a scalable implementation of the data management framework, platform, and tool stack.
    Good knowledge of developing, implementing, and managing BAU operational processes for data quality functionality and capabilities, such as Data Quality Statistics, Data Quality Scorecards, Data Quality Analysis, and Workflow Design.