Anandhi Anandhi

Snowflake, Python, PL/SQL, AWS, IICS, Snowpipe

Technology / Internet

Chennai, Chennai district

About Anandhi Anandhi:

Overall 7.6 years of experience in Snowflake development, data migration, and ETL.
Strong understanding of data modeling, data architecture, database design, and ETL concepts.
Set up continuous data ingestion using Snowpipe and external stages to integrate data from AWS S3 into Snowflake.
Used SnowSQL and internal stages to load data into Snowflake.
Hands-on experience creating streams, tasks, and MERGE statements to capture and apply change data (CDC) on base tables (a combined ingestion-and-CDC sketch follows this summary).
Hands-on data migration from Oracle to Snowflake using IICS.
Sound knowledge of writing complex SQL queries, procedures, user-defined functions (UDFs), analytical functions, sequences, joins, and subqueries.
Used zero-copy cloning in Snowflake to back up databases for various environments.
Used Time Travel to access historical data and recover deleted data.
Used Fail-safe to recover data past the Time Travel retention period.
Used secure views and materialized views to share data with partner applications per data requests.
Knowledge of data sharing between Snowflake accounts for data analysis and reporting.
Followed Agile methodology with three-week sprints.
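To make these points concrete, here is a minimal Snowflake SQL sketch of the Snowpipe ingestion, stream/task CDC, and clone/Time Travel patterns listed above. All object names (mydb, etl_wh, s3_int, the bucket URL, the column names) are hypothetical placeholders, and the storage integration is assumed to already exist.

-- Landing infrastructure (all names are placeholders).
CREATE FILE FORMAT mydb.raw.csv_fmt TYPE = CSV SKIP_HEADER = 1;

CREATE STAGE mydb.raw.s3_stage
  URL = 's3://example-bucket/orders/'              -- placeholder bucket
  STORAGE_INTEGRATION = s3_int                     -- assumes an existing integration
  FILE_FORMAT = (FORMAT_NAME = 'mydb.raw.csv_fmt');

CREATE TABLE mydb.raw.orders_landing (id NUMBER, amount NUMBER, updated_at TIMESTAMP_NTZ);

-- Snowpipe auto-ingests new S3 files into the landing table.
CREATE PIPE mydb.raw.orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO mydb.raw.orders_landing FROM @mydb.raw.s3_stage;

-- A stream records row-level changes on the landing table.
CREATE STREAM mydb.raw.orders_stream ON TABLE mydb.raw.orders_landing;

-- A scheduled task MERGEs pending changes into the curated target.
CREATE TASK mydb.raw.orders_cdc_task
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('mydb.raw.orders_stream')
AS
  MERGE INTO mydb.core.orders t
  USING mydb.raw.orders_stream s ON t.id = s.id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.updated_at = s.updated_at
  WHEN NOT MATCHED THEN INSERT (id, amount, updated_at) VALUES (s.id, s.amount, s.updated_at);

ALTER TASK mydb.raw.orders_cdc_task RESUME;

-- Zero-copy clone for backup; Time Travel to read data as of an earlier point.
CREATE DATABASE mydb_backup CLONE mydb;
SELECT * FROM mydb.core.orders AT (OFFSET => -3600);  -- state one hour ago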

Experience

Senior Engineer | Cigniti Technologies | July 2023 – Present
Engineer | HDATA Info Systems | Jan 2018 – July 2023
PROJECTS

CURRENT:

Role: Snowflake Developer
Client: Merrill Lynch (July 2023 – Present)
Environment: Snowflake, Oracle, AWS S3, Informatica IICS, SnowSQL, SQL Developer, JIRA.

Created Snowpipe, external stages, and tasks for continuous data ingestion from AWS S3 buckets into Snowflake.
Monitored and captured DML changes in the source system and merged them into target tables using streams, tasks, and MERGE statements, keeping the data consistent and accurate throughout the process.
Used SnowSQL to load data from internal stages into Snowflake with the COPY command.
Created views, procedures, user-defined functions, sequences, and SQL queries for loading data from cloud systems into Snowflake.
Created Snowflake objects such as databases, schemas, stages, tables, and file formats.
Applied dynamic data masking to protect sensitive data from unauthorized users (see the masking sketch after this list).
Handled semi-structured data such as JSON using the VARIANT data type and LATERAL FLATTEN.
Used IICS for transformations and ETL workflows.
Translated business needs into complex SQL queries and subqueries using analytical functions, joins, and operators.
Cloned data for backup purposes.
Worked closely with data analysts, data scientists, business stakeholders, and other developers to understand data requirements and deliver solutions.
Attended client calls, scrums, and sprint meetings daily.
Gave knowledge-transfer (KT) sessions to newly joined engineers and followed Agile methodology for timely delivery.
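A minimal sketch of the dynamic data masking mentioned above, assuming a hypothetical customers table with an ssn column and an existing ANALYST role; none of these names come from the project itself.

-- Masking policy: trusted roles see clear text, everyone else a partial mask.
CREATE MASKING POLICY mydb.sec.mask_ssn AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST') THEN val
    ELSE 'XXX-XX-' || RIGHT(val, 4)
  END;

-- Attach the policy to the sensitive column.
ALTER TABLE mydb.core.customers
  MODIFY COLUMN ssn SET MASKING POLICY mydb.sec.mask_ssn;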


PREVIOUS:

Role: Snowflake Developer
Client: CIT Bank (Jan 2018 – July 2023)
Environment: Snowflake, Oracle, AWS S3, Informatica IICS, SnowSQL, SQL Developer, JIRA.

Migrated data from Oracle to Snowflake using flat files.
Created external stages for transferring data from AWS cloud storage to Snowflake.
Loaded data from internal stages into Snowflake using the COPY command in SnowSQL (see the loading sketch after this list).
Set up and managed continuous data ingestion pipelines using Snowpipe and tasks.
Implemented and managed security measures, including role-based access control and data masking.
Used streams to maintain audit-table backups whenever DML operations ran on base tables.
Developed Snowflake objects such as warehouses, schemas, databases, tables, views, pipes, stages, and scaling policies.
Hands-on experience loading semi-structured data using the VARIANT data type and Snowflake's FLATTEN function.
Developed Snowflake procedures, functions, and user-defined table functions (UDTFs), plus trigger-style automation with streams and tasks (a UDTF sketch also follows this list).
Used Informatica IICS to design and implement ETL workflows that extract, transform, and load data from various sources into Snowflake per client requirements.
Applied transformation rules to ensure data quality and consistency per the mapping document.
Implemented complex queries, subqueries, analytical functions, group functions, joins, and operators for transformations.
Participated in daily scrum meetings and weekly project planning and status sessions.
Followed Agile methodology with three-week sprints.
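A minimal sketch of the internal-stage load and VARIANT/FLATTEN handling described above; stage, table, and file names are hypothetical, and the PUT command is shown as a comment because it runs from a SnowSQL client session rather than a worksheet.

CREATE FILE FORMAT mydb.raw.json_fmt TYPE = JSON;
CREATE STAGE mydb.raw.json_stage FILE_FORMAT = (FORMAT_NAME = 'mydb.raw.json_fmt');
CREATE TABLE mydb.raw.events_json (payload VARIANT);

-- From SnowSQL: upload a local file to the internal stage.
-- PUT file:///tmp/events.json @mydb.raw.json_stage;

COPY INTO mydb.raw.events_json FROM @mydb.raw.json_stage;

-- LATERAL FLATTEN turns a nested JSON array into relational rows.
SELECT e.payload:order_id::NUMBER AS order_id,
       i.value:sku::STRING        AS sku,
       i.value:qty::NUMBER        AS qty
FROM mydb.raw.events_json e,
     LATERAL FLATTEN(INPUT => e.payload:items) i;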
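And a small SQL UDTF sketch of the kind mentioned above, with hypothetical table and column names (mydb.core.orders, customer_id); a UDTF returns a table and is invoked in the FROM clause.

-- Returns the orders belonging to one customer as a table.
CREATE FUNCTION mydb.core.orders_for_customer(cust_id NUMBER)
  RETURNS TABLE (order_id NUMBER, amount NUMBER)
  AS
  $$
    SELECT id, amount
    FROM mydb.core.orders
    WHERE customer_id = cust_id
  $$;

-- Usage: call it like a table in the FROM clause.
SELECT * FROM TABLE(mydb.core.orders_for_customer(42));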

Education

B.Tech (IT), Arulmigu Meenakshi Amman College of Engineering: 81%

HSC, D.R.B.C.C.C Hr. Sec. School: 84%
SSLC, D.R.B.C.C.C Hr. Sec. School: 86%
