Data Architect - Hyderabad, Telangana, India - Clairvoyant

Clairvoyant · Verified Company
Hyderabad, Telangana, India

2 weeks ago

Posted by: Deepika Kaur, beBee Recruiter


Description

About Clairvoyant:


Clairvoyant is a global technology consulting and services company founded in 2012, headquartered in Chandler, US, with delivery centers across the globe.

We help organizations maximize the value of data by providing data engineering, analytics, machine learning, and user experience consulting and development projects to multiple Fortune 500 clients.

Clairvoyant's clients rely on its deep vertical knowledge and best-in-class services to drive revenue growth, boost operational efficiency, and manage risk and compliance.

Our team of experts with direct industry experience in data engineering, analytics, machine learning, and user experience has your back every step of the way.

"
Our Values:
Passion, Continuous Learning, Adaptability, Teamwork, Customer Centricity, Reliability"


Position - Data Architect (AWS)


Location - Pune, Gurgaon


Experience - 10-12 Years


Must-have Skills:

  • 10+ years of proven experience as a Data Architect in Big Data Engineering.
  • Strong expertise in solution design, data modeling, data warehousing, and ETL processes.
  • Proficiency in AWS, Python, PySpark, Hadoop, and SQL.
  • In-depth knowledge of big data technologies and ecosystems.
  • Excellent problem-solving skills and the ability to address complex technical challenges.
  • Strong communication and leadership skills.

Role & Responsibilities:


  • Data Modeling: Develop and maintain data models to represent our complex data structures, ensuring data accuracy, consistency, and efficiency.
  • Technical Leadership: Provide technical leadership and guidance to development teams, promoting best practices in software architecture and design.
  • Solution Design: Collaborate with stakeholders to define technical requirements and create solution designs that align with business goals and objectives.
  • Programming: Develop and maintain software components using Python, PySpark, and Hadoop to process and analyze large datasets efficiently.
  • Big Data Ecosystem: Work with various components of the Hadoop ecosystem, such as HDFS, Hive, and Spark, to build data pipelines and perform data transformations.
  • SQL Expertise: Utilize SQL for data querying, analysis, and optimization of database performance.
  • Performance Optimization: Identify and address performance bottlenecks, ensuring the system meets required throughput and latency targets.
  • Scalability: Architect scalable and highly available data solutions, considering both batch and real-time processing.
  • Documentation: Create and maintain comprehensive technical documentation to support the development and maintenance of data solutions.
  • Security and Compliance: Ensure that data solutions adhere to security and compliance standards, implementing necessary controls and encryption mechanisms.

Education:

Bachelor's degree in Computer Science, Software Engineering, or MIS, or an equivalent combination of education and experience.
