Engineer - Noida, India - Delhivery

Description

Are you a passionate Spark and Scala developer looking for an exciting opportunity to work on cutting-edge big data projects? Look no further: Delhivery is seeking a talented and motivated Spark & Scala Expert to join our dynamic team.

Responsibilities:
- Develop and optimize Spark applications to process large-scale data efficiently.
- Collaborate with cross-functional teams to design and implement data-driven solutions.
- Troubleshoot and resolve performance issues in Spark jobs.
- Stay up to date with the latest trends and advancements in Spark and Scala technologies.

Requirements:
- Proficiency with data pipelines, Kafka, Kafka Streams, connectors, etc.
- Strong experience with Apache Spark, Spark Streaming, and Spark SQL.
- Solid understanding of distributed systems, databases, system design, and big data processing frameworks.
- Familiarity with Hadoop ecosystem components (HDFS, Hive, HBase) is a plus.