Kaseya

Data Engineer (BB-367E3)

Found in: Talent IN

Description:
Responsibilities:
- Create and maintain optimal data pipeline architecture.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
- Research the latest datastores, database architectures, and data warehousing approaches, and carry out Proof of Concept (POC) implementations.
- Brainstorm with the team on re-architecting an existing product for scale.
- Recommend and implement new ways to vastly improve data efficiency and quality.
- Manage and support the integrity and reliability of data services.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Maintain the highest standard of development practice, including technical design, solution development, systems configuration, test documentation and execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability.
- Work with the AWS toolchain (S3/Glue/EMR/Redshift).
- Write complex queries with nested joins and derived tables (a hedged query sketch appears at the end of this listing).
- Optimize the Redshift data warehouse by implementing workload management, sort keys, and distribution keys (a hedged DDL sketch appears at the end of this listing).
- Apply hands-on knowledge of Amazon Redshift architecture: create new databases, new schemas (e.g. snowflake schemas), views, etc.

Qualifications:
- Experience building and optimizing 'big data' pipelines, architectures, and data sets.
- Strong analytic skills for working with structured and unstructured datasets.
- Ability to build processes supporting data transformation, data structures, dimensional modeling, metadata, dependencies, schema registration/evolution, and workload management.
- Strong experience with the AWS Redshift database system.
- 4+ years of experience as a Data Engineer or in a similar role.
- Experience with data warehousing, data modeling, and building ETL pipelines.
- Strong experience in SQL.
- 4-6 years of total experience in building DW/BI systems.
- At least 2 years of experience in end-to-end design and implementation of large-scale DW/BI systems.
- Experience with ETL and working with large-scale datasets.
- Extremely proficient in writing performant SQL against large data volumes.
- Proficiency in writing and debugging complex SQL.
- Bachelor's degree in engineering or equivalent.

Work Location - Whitefield (Bangalore)
Timings - 2 pm to 11 pm IST
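For illustration, a minimal sketch of the distribution-key and sort-key tuning the Responsibilities mention. All table and column names (sales_fact, customer_id, sale_date, etc.) are hypothetical, not taken from the posting:

-- Hypothetical Redshift fact table; every name here is illustrative.
CREATE TABLE sales_fact (
    sale_id     BIGINT        NOT NULL,
    customer_id INTEGER       NOT NULL,
    product_id  INTEGER       NOT NULL,
    sale_date   DATE          NOT NULL,
    amount      DECIMAL(12,2) NOT NULL
)
DISTSTYLE KEY
DISTKEY (customer_id)                      -- co-locates rows that join on customer_id on one slice
COMPOUND SORTKEY (sale_date, customer_id); -- lets date-range predicates skip disk blocks

Workload management, by contrast, is not set in DDL: Redshift WLM queues are configured in the cluster parameter group (or handled by Auto WLM).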
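In the same spirit, a hedged sketch of a complex query with nested joins and a derived table against that table; customer_dim and region_dim are assumed dimension tables, not named in the posting:

SELECT c.customer_name,
       r.total_revenue,
       r.last_sale_date
FROM (
    -- Derived table: per-customer revenue over a date range.
    -- The sale_date filter hits the leading sort key, so blocks are pruned.
    SELECT customer_id,
           SUM(amount)    AS total_revenue,
           MAX(sale_date) AS last_sale_date
    FROM sales_fact
    WHERE sale_date >= DATE '2024-01-01'
    GROUP BY customer_id
) AS r
JOIN customer_dim AS c ON c.customer_id = r.customer_id  -- join the aggregate back to dimensions
JOIN region_dim   AS g ON g.region_id   = c.region_id
WHERE g.region_name = 'APAC'
ORDER BY r.total_revenue DESC
LIMIT 20;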

Posted 5 days ago

Location: Bengaluru, India
