Azure Databricks + PySpark + Scala - All India
1 month ago

Job summary
We are seeking a professional with 5 or more years of experience to work on Azure Databricks using Spark and PySpark.
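As a rough illustration of the kind of work the summary describes, here is a minimal PySpark sketch of a Databricks-style aggregation pipeline; the table and column names (sales_raw, region, amount, sales_summary) are hypothetical, and on Azure Databricks a SparkSession is normally already available as spark.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Azure Databricks `spark` is pre-created; getOrCreate() reuses it there
# and builds a local session when the script runs elsewhere.
spark = SparkSession.builder.appName("sales-aggregation").getOrCreate()

# Hypothetical source table: read, aggregate per region, write the result back.
raw = spark.read.table("sales_raw")
summary = (
    raw.groupBy("region")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"))
)
summary.write.mode("overwrite").saveAsTable("sales_summary")

Scala Spark code for the same step would look very similar, since the DataFrame API is shared across both languages.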
Similar jobs
Freelance Data Scientist
3 weeks ago
You will be responsible for designing, developing and optimizing data pipelines using Scala or PySpark for large telecom datasets. · ...
Associate Data Analyst
5 days ago
This job is responsible for preparing accurate ad-hoc and recurring reports for business decisions using PySpark or Scala queries. Additionally, you will validate datasets for integrity, accuracy, and consistency to ensure reported data quality. · ...
Python Pyspark Developer
4 weeks ago
As a Data Engineer you will be primarily responsible for coding data pipelines in Spark, specifically PySpark. · Proficiency in Python for automation · Familiarity with Azure Airflow and Databricks is good to have · Ability to code in PySpark or Scala Spark ...
Python Pyspark Professional
6 days ago
Highly skilled Big Data Engineer to design, develop and optimize scalable data pipelines for analytics and business intelligence initiatives. · Hands-on experience with the Hadoop ecosystem and Apache Spark · Programming expertise in Python, PySpark, Scala, Java · ...
Azure Data Bricks
1 month ago
You will be responsible for working with Azure Data Bricks, Data Factory and other Azure Data components to build CI/CD pipelines in Data environments. · Experience with Azure Data components like Azure Data Factory (ADF) and Azure Data Bricks (ADB). · Proficiency in Python, Pysp ...
Pyspark - Machine Learning
1 month ago
You have over 7 years of experience in Big Data with strong expertise in Spark and Scala. · Proficiency in Big Data, primarily Spark and Scala · Strong knowledge of HDFS, Hive, Impala, Unix, Oracle and Autosys · ...
Azure Data Bricks
1 month ago
You will be responsible for working with Azure Data components such as Azure Data Factory, Azure Data Bricks, Azure SQL Database, and Azure SQL Warehouse (Synapse Analytics). · Your role will involve using the Python, PySpark, Scala and Hive programming languages, building CI/CD pipelines in d ...
Senior Data Engineer
1 month ago
Senior Data Engineer designing, developing and optimizing cloud-based data solutions on Google Cloud Platform (GCP). Analytics professional with hands-on expertise in BigQuery, Python, PySpark, SQL, Apache Airflow, Java and Scala. · ...
Big Data
4 weeks ago
We are seeking an experienced Big Data Developer with a minimum of 5-8 years of experience in Hadoop/big data technologies. You will have hands-on experience in the Hadoop ecosystem including HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr. Additionally, you should ...
Apache Spark E0
2 weeks ago
You are a motivated Apache Spark Developer (Entry Level) joining the data engineering team. · ...
Data Bricks Engineer
2 weeks ago
You will be responsible for the following: · Strong in Azure Databricks. · Experience in Azure Synapse Analytics, Azure Data Lake Storage (Gen2). · ...
Azure Spark Developer
1 month ago
You will be responsible for engaging with Business / Stakeholders to provide status updates on the progress of development and issue fixes. · ...
Data Engineer
1 week ago
We are looking for a skilled Data Engineer to join our team at Senzcraft. As a Data Engineer, you will be responsible for working on various tasks such as building scalable data pipelines using PySpark, Scala & Python · Developing batch/streaming data solutions using Kafka, BigQ ...