Azure Data Architect - Pune, Maharashtra, India - Clairvoyant

Clairvoyant
Pune, Maharashtra, India

Posted by: Deepika Kaur, beBee Recruiter


Description

Position - Azure Data Architect
Location - Pune (Hybrid)
Experience - 8-12 years

Clairvoyant is a global technology consulting and services company founded in 2012, headquartered in Chandler, US, with delivery centers across the globe. We help organizations maximize the value of data by providing data engineering, analytics, machine learning, and user experience consulting and development services to multiple Fortune 500 clients. Clairvoyant clients rely on its deep vertical knowledge and best-in-class services to drive revenue growth, boost operational efficiency, and manage risk and compliance. Our team of experts with direct industry experience in data engineering, analytics, machine learning, and user experience has your back every step of the way.

Our Values: Passion, Continuous Learning, Adaptability, Teamwork, Customer Centricity, Reliability


As a Software Technical Architect in our Big Data Engineering team, you will play a crucial role in designing, developing, and maintaining complex data solutions.

You will collaborate with cross-functional teams to define and implement scalable, high-performance data architectures.

Your primary responsibilities will include data modelling, leading software design and development efforts, and ensuring the reliability, scalability, and performance of our big data infrastructure.


Requirements:

  • 8+ years of proven experience as a Software Technical Architect in Big Data Engineering.
  • Strong understanding of Data Warehousing, Data Modelling, Cloud, and ETL concepts.
  • Experience with Azure Cloud technologies, including Azure Data Factory, Azure Data Lake Storage, Databricks, Event Hub, Azure Monitor, and Azure Synapse Analytics.
  • Proficiency in Python, PySpark, Hadoop, and SQL.
  • Experience designing and building data pipelines using API-based and streaming ingestion methods.
  • Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
  • Thorough understanding of Azure Cloud infrastructure offerings.
  • Strong experience with common data warehouse modelling principles, including Kimball and Inmon.
  • In-depth knowledge of big data technologies and ecosystems.
  • Excellent problem-solving skills and the ability to address complex technical challenges.
  • Strong communication and leadership skills.
  • Experience developing security models.
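As an illustration of the Kimball-style warehouse modelling principle named above, the sketch below builds a minimal star schema: a fact table of measures joined to dimension tables by surrogate keys, then aggregated with a star-join query. Table and column names are illustrative, not from this posting, and sqlite3 is used only to keep the example self-contained; in practice this pattern would live in Synapse or Databricks SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables carry descriptive attributes keyed by a surrogate key.
cur.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT)")
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER)")
# The fact table holds measures plus foreign keys into each dimension.
cur.execute("""CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL)""")

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'West'), (2, 'Globex', 'East')")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240101, 100.0), (2, 20240101, 250.0), (1, 20240101, 50.0)])

# A typical star-join query: aggregate a measure, sliced by dimension attributes.
rows = cur.execute("""
    SELECT c.region, d.year, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY c.region, d.year
    ORDER BY c.region
""").fetchall()
print(rows)  # [('East', 2024, 250.0), ('West', 2024, 150.0)]
```

The design choice worth noting: facts stay narrow (keys and measures only), so new dimension attributes can be added without rewriting the fact table.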

Must Have:


  • Strong understanding of the Azure data platform: Azure Data Factory, Azure Databricks, Synapse, Event Hubs, ADLS, Delta files, Azure Monitor, Azure Security, Azure DevOps.
  • Experience developing NoSQL solutions using Azure Cosmos DB.
  • Good knowledge of setting up Data Governance in Azure.
  • Hands-on experience with the Python programming language and excellent code-debugging skills.
  • Excellent knowledge of SQL.
  • Experience with data modelling, ETL, and data warehousing concepts and implementation.
  • Strong customer engagement skills to fully understand customer needs for analytics solutions.
  • Excellent communication and teamwork skills.
  • Strong data analysis and analytical skills.
  • Experience working in a fast-paced agile environment.
  • Strong problem-solving and troubleshooting skills.

Good to Have:


  • Experience with Power BI reporting.
  • Experience with Microsoft Purview.
  • Experience processing streaming data.
  • Programming languages: Java or Scala.
  • Familiarity with other cloud platforms (e.g., AWS, GCP) is a plus.
  • Azure Architect certification.

Responsibilities:


  • Analyze current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics services.
  • Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
  • Collaborate with project managers for project/sprint planning by estimating technical tasks and deliverables.
  • Data Modelling: Develop and maintain data models to represent our complex data structures, ensuring data accuracy, consistency, and efficiency.
  • Technical Leadership: Provide technical leadership and guidance to development teams, promoting best practices in software architecture and design.
  • Solution Design: Collaborate with stakeholders to define technical requirements and create solution designs that align with business goals and objectives.
  • Programming: Develop and maintain software components using Python, PySpark, and Hadoop to process and analyze large datasets efficiently.
  • Big Data Ecosystem: Work with various components of the Hadoop ecosystem, such as HDFS, Hive, and Spark, to build data pipelines and perform data transformations.
  • SQL Expertise: Utilize SQL for data querying, analysis, and optimization of database performance.
  • Performance Optimization: Identify and address performance bottlenecks, ensuring the system meets required throughput and latency targets.
  • Scalability: Architect scalable and highly available data solutions, considering both batch and streaming workloads.
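The pipeline responsibilities above (ingest, transform, analyze) can be sketched in plain Python. This is a hedged stand-in for the PySpark transformations the role describes, not Clairvoyant's actual pipeline; the record shape and field names are assumptions made purely for illustration. The step normalizes raw records, drops malformed and duplicate rows, and computes a simple aggregate.

```python
from collections import defaultdict

def transform(raw_records):
    """Normalize, deduplicate by id, and aggregate event counts per user."""
    seen = set()
    cleaned = []
    for rec in raw_records:
        rec_id = rec.get("id")
        if rec_id is None or rec_id in seen:
            continue  # skip malformed or duplicate records
        seen.add(rec_id)
        # Normalize the user field: strip whitespace, lowercase.
        cleaned.append({"id": rec_id, "user": rec["user"].strip().lower()})
    counts = defaultdict(int)
    for rec in cleaned:
        counts[rec["user"]] += 1
    return cleaned, dict(counts)

raw = [
    {"id": 1, "user": " Alice "},
    {"id": 1, "user": "Alice"},   # duplicate id, dropped
    {"id": 2, "user": "BOB"},
    {"user": "no-id"},            # malformed, dropped
]
cleaned, counts = transform(raw)
print(counts)  # {'alice': 1, 'bob': 1}
```

In a real PySpark job the same logic would be expressed as DataFrame operations (dropDuplicates, withColumn, groupBy), letting Spark distribute the work across the cluster.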
