Senior Data Engineer - Hyderabad, India - Anblicks
Description
Role Title: Lead Data Engineer - Snowflake
Reporting To:
Location: India – Ahmedabad or Hyderabad
Who we are?
Anblicks is a Data and AI company – we bring value to your data.
Anblicks is a Data and AI company, specializing in data modernization and transformation, that helps organizations across industries make decisions better, faster, and at scale.
Anblicks is headquartered in Addison, Texas, and employs more than 550 technology professionals, data analysts, and data science experts in the USA, India, and Australia.
Anblicks is committed to bringing value to various industries using CloudOps, Data Analytics, and Modern Apps. Global customers have benefited from our Anblicks Ignite Enterprise Data Platform and Accelerators.
Why Join Anblicks?
Anblicks, a company advancing in leaps and bounds, places tremendous emphasis on values. A fundamental ideology is crucial to manoeuvre an organisation towards success. Through a stable value system, Anblicks is enabling an unprecedented transformation: not just a digital transformation, but something as expansive as its people and a best-in-class global culture.
Key Facts:
- More than 550 technology professionals
- More than 200 customers served
- More than 900 projects completed
- Happy clients, including Fortune 500 companies
- Books authored by employees
- Offices in India, USA & Australia
Role Purpose
We are seeking a Lead Data Engineer to join our team of data experts.
The ideal candidate will have a passion for designing and implementing end-to-end data solutions, from data ingestion and processing to analytics and reporting.
As a Lead Data Engineer, you will work closely with the Data Architect and other stakeholders to understand their data needs and provide solutions to meet those needs.
The ideal candidate will have a strong background in designing, developing, and maintaining data pipelines and ETL processes in data warehousing, using technologies such as Snowflake, DBT, and Matillion.
You will also be responsible for leading a team of data engineers and ensuring that all data engineering projects are completed on time and within budget.
Role Responsibilities
- Work with the Data Architect and other stakeholders to understand their data needs and provide solutions to meet those needs
- Design and implement end-to-end data solutions using technologies such as Snowflake, Apache Spark, and Hadoop
- Lead a team of data engineers and ensure that all data engineering projects are completed on time and within budget
- Apply novel query optimization and major security competencies, including encryption
- Resolve performance and scalability issues in the system
- Apply data management with distributed data processing algorithms
- Take ownership right from start to finish
- Build, monitor, and optimize ETL and ELT processes with data models
- Migrate solutions from on-premises setups to cloud-based ones, and implement the latest delivery approaches based on data architecture
- Maintain documentation and tracking based on understanding user requirements
- Handle integration with third-party tools, including the architecting, designing, coding, and testing phases
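The ETL/ELT responsibilities above can be sketched in miniature. The following is a purely illustrative Python example of an ELT flow (load raw data first, then transform with SQL inside the database), using an in-memory SQLite database as a stand-in for a cloud warehouse such as Snowflake; every table, column, and record name here is an invented assumption, not part of this role description.

```python
import sqlite3

# Purely illustrative ELT sketch: land raw records first (Load), then
# Transform inside the database with SQL. SQLite stands in for a cloud
# warehouse such as Snowflake; all names below are invented for the sketch.
raw_events = [
    {"user_id": "a", "amount": "10.5"},
    {"user_id": "b", "amount": "3.0"},
    {"user_id": "a", "amount": "2.5"},
]

conn = sqlite3.connect(":memory:")

# Load: land the raw records as-is into a staging table (strings and all).
conn.execute("CREATE TABLE stg_events (user_id TEXT, amount TEXT)")
conn.executemany("INSERT INTO stg_events VALUES (:user_id, :amount)", raw_events)

# Transform: cast and aggregate in SQL, materializing a reporting table.
conn.execute("""
    CREATE TABLE rpt_user_totals AS
    SELECT user_id, SUM(CAST(amount AS REAL)) AS total
    FROM stg_events
    GROUP BY user_id
""")

totals = dict(conn.execute("SELECT user_id, total FROM rpt_user_totals"))
print(totals)
```

In a real Snowflake pipeline the load step would typically be handled by Snowpipe or COPY INTO, with the in-warehouse transform layer managed as DBT models rather than hand-run DDL.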
Qualifications
- Bachelor's or master's degree in Computer Science, Information Systems, or a related field
- Years of experience in data engineering and data architecture
- Experience working with AWS S3 / Azure ADLS Storage Accounts and Snowflake
- Strong experience in data engineering fundamentals (SQL, RDBMS, data models, data structures, orchestration, DevOps, etc.)
- Knowledge of the SQL language and cloud-based technologies
- Strong experience building data pipelines with Spark and Python/Scala
- Strong experience building ELT pipelines (batch and streaming) in the Snowflake cloud warehouse
- Good working knowledge of leveraging DBT (SQL and Python models) to perform transformations in Snowflake
- Ability to write structured and efficient queries on large data sets using statistical aggregate functions and analytical functions, and to build reporting data marts
- Experience working with Snowflake concepts such as Snowpipe, Streams, Tasks, Cloning, Time Travel, Data Sharing, Data Replication, etc.
- Experience handling large and complex datasets (JSON, ORC, PARQUET, CSV files) from various sources such as AWS S3 and Azure Data Lake Gen2
- Understanding customer requirements; analysis, design, development, and implementation into the system; gathering and defining business requirements and enhancing business processes
- Experience with Snowflake tools such as Snowsight and SnowSQL, and any partner connects
- Performance tuning and setting up resource monitors
- Snowflake modeling: roles, databases, schemas
- SQL performance measuring, query tuning, and database tuning
- ETL tools with cloud-driven skills
- SQL-based databases such as Oracle, SQL Server, Teradata, etc.
- Snowflake warehousing, architecture, processing, and administration
- Data ingestion into Snowflake
- Enterprise-level technical exposure to Snowflake applications
- Experience with data modelling is a plus
- Strong problem-solving and analytical skills
- Ability to work independently and as part of a team
- Experience working in an Agile environment, building relationships with clients, and participating in practice development activities
- Excellent written and oral communication skills; ability to communicate effectively with technical and non-technical stakeholders
- Open to travel
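As an illustration of the aggregate and analytical (window) function skills listed above, here is a small, purely hypothetical sketch using Python's built-in sqlite3 module so it is runnable anywhere; the table, columns, and data are invented for the example, and the same SQL shape applies in a Snowflake warehouse.

```python
import sqlite3

# Hypothetical example combining an aggregate function (SUM ... GROUP BY)
# with an analytical/window function (RANK() OVER ...). SQLite is used only
# to make the sketch runnable; Snowflake supports the same constructs.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('East', 100), ('East', 200), ('West', 50), ('West', 150), ('West', 300);
""")

# Rank regions by total sales: the window function is evaluated after the
# GROUP BY aggregation, so it can order over SUM(amount).
rows = conn.execute("""
    SELECT region,
           SUM(amount) AS total,
           RANK() OVER (ORDER BY SUM(amount) DESC) AS rnk
    FROM sales
    GROUP BY region
    ORDER BY rnk
""").fetchall()

for region, total, rnk in rows:
    print(region, total, rnk)
```

The design point worth noting is evaluation order: window functions run after grouping, which is why an aggregate expression can legally appear inside the OVER clause.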