AWS/Azure Data Engineer with Snowflake - Pune, India - Abzooba
Description
Key Responsibilities:
- Selecting and integrating the Big Data tools and frameworks required to provide requested capabilities
- Monitoring performance and advising on any necessary infrastructure changes
Required Experience, Skills and Qualifications:
- Must have hands-on experience with Hadoop tools/technologies such as Spark, MapReduce, Hive, and HDFS.
- Hands-on expertise and an excellent understanding of the big data toolset, including Sqoop, Spark Streaming, Kafka, NiFi, and Snowflake.
- Proficiency in at least one programming language (Scala, Python, or Java), with a total of 4 years of experience
- Experience with cloud infrastructures such as MS Azure and Amazon AWS; GCP is a plus. Good working knowledge of NoSQL databases (MongoDB, HBase, Cassandra)
- Must have experience designing, developing, and deploying big data projects into production
- Has implemented complex projects handling considerable data sizes (TB/PB) in a production environment.
- Hortonworks (HDPCA/HDPCD/HDPCD-Spark) or Cloudera certification is an added advantage
- Bachelor's degree or higher in a quantitative/technical field (e.g., Computer Science, Statistics, Engineering) and software development experience, with proven hands-on experience in Big Data technologies
- Experience developing/architecting environments in the Hadoop ecosystem using HDP and HDF
- Demonstrated strength in data modelling and ETL development. Experience in designing and implementing an enterprise data lake
- Experience in Big Data Management and Big Data Governance
- Some experience with Kubernetes, Docker containers, etc.
Additional Skills:
- Exposure to one or more big data management tools such as Talend, Informatica (IBD), Zaloni Data Platform, etc.
- Exposure to data cleansing/data wrangling tools such as Trifacta, etc.
- Exposure to Big Data visualization tools such as Tableau, Qlik, etc.