Snowflake, dbt, Python, Airflow - Kochi, India - UST

UST
Kochi, India

Posted by: Deepika Kaur, beBee Recruiter


Description

Role Proficiency:
Provide expertise on data analysis techniques using software tools. Under supervision, streamline business processes.



Outcomes:


  • Design and manage the reporting environment, including data sources, security, and metadata.
  • Provide technical expertise on data storage structures, data mining, and data cleansing.
  • Support the data warehouse in identifying and revising reporting requirements.
  • Support initiatives for data integrity and normalization.
  • Assess, test, and implement new or upgraded software. Assist with strategic decisions on new systems. Generate reports from single or multiple systems.
  • Troubleshoot the reporting database environment and associated reports.
  • Identify and recommend new ways to streamline business processes.
  • Illustrate data graphically and translate complex findings into written text.
  • Locate results to help clients make better decisions. Solicit feedback from clients and build solutions based on feedback.
  • Train end users on new reports and dashboards.
  • Set FAST goals and provide feedback on FAST goals of reportees.

Measures of Outcomes:


  • Quality: number of review comments on code written
  • Data consistency and data quality.
  • Illustrates data graphically; translates complex findings into written text.
  • Number of results located to help clients make informed decisions.
  • Number of business processes changed due to vital analysis.
  • Number of Business Intelligence dashboards developed
  • Number of productivity standards defined for project
  • Number of mandatory trainings completed


Outputs Expected:

Determine specific data needs:

  • Work with departmental managers to outline the specific data needs for each business method analysis project

Critical business insights:

  • Mines the business's database in search of critical business insights; communicates findings to relevant departments.

Code:

  • Creates efficient and reusable SQL code meant for the improvement, manipulation, and analysis of data, as sketched below.

  • Creates efficient and reusable code. Follows coding best practices.
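
As a rough illustration of the reusable SQL code described in this block, the sketch below wraps a parameterized query in a small Python helper so the same SQL is not copy-pasted across reports. It is only a sketch: the table, columns, and SQLite stand-in connection are hypothetical; in practice the connection would point at Snowflake.

    # Sketch: a reusable, parameterized query helper (table and column names are hypothetical).
    import sqlite3  # stand-in engine; a Snowflake connection would be used in practice
    import pandas as pd

    DAILY_SALES_SQL = """
        SELECT order_date, SUM(amount) AS total_amount
        FROM orders
        WHERE order_date BETWEEN :start_date AND :end_date
        GROUP BY order_date
        ORDER BY order_date
    """

    def daily_sales(conn, start_date: str, end_date: str) -> pd.DataFrame:
        """Run the shared daily-sales query with bound parameters instead of string formatting."""
        return pd.read_sql(DAILY_SALES_SQL, conn, params={"start_date": start_date, "end_date": end_date})

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE orders (order_date TEXT, amount REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?)",
                         [("2024-01-01", 10.0), ("2024-01-01", 5.0), ("2024-01-02", 7.5)])
        print(daily_sales(conn, "2024-01-01", "2024-01-02"))

Binding parameters keeps the helper reusable across date ranges and avoids repeated ad hoc string formatting.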

Create/Validate Data Models:

  • Builds statistical models; diagnoses, validates, and improves the performance of these models over time (see the validation sketch below).
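
A minimal sketch of the build/validate/improve loop described above, assuming scikit-learn is available; the features and target here are synthetic and purely illustrative.

    # Sketch: fit a simple statistical model and validate it with 5-fold cross-validation.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(500, 3))                                          # three synthetic features
    y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=500)    # known signal plus noise

    model = LinearRegression()
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"mean R^2 across folds: {scores.mean():.3f} (+/- {scores.std():.3f})")
    # Tracking these fold scores over time is one simple way to diagnose drift and decide when to retrain.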


Predictive analytics:

  • Seeks to determine likely outcomes by detecting tendencies in descriptive and diagnostic analysis

Prescriptive analytics:

  • Attempts to identify what business action to take

Code Versioning:

  • Organize and manage the changes and revisions to code. Use a version control tool, for example Git or Bitbucket (a minimal workflow sketch follows).
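
A minimal code-versioning sketch, expressed as Python driving the standard Git CLI; the branch, file, and remote names are hypothetical, and an equivalent flow applies on Bitbucket or any other Git host.

    # Sketch: commit analysis code on a feature branch for peer review (all names are hypothetical).
    import subprocess

    def git(*args: str) -> None:
        """Run a git command and fail loudly if it errors."""
        subprocess.run(["git", *args], check=True)

    git("checkout", "-b", "feature/daily-sales-report")         # isolate the change on its own branch
    git("add", "reports/daily_sales.sql")                       # stage the new or revised code
    git("commit", "-m", "Add parameterized daily sales query")
    git("push", "-u", "origin", "feature/daily-sales-report")   # push for review via a pull request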


Create Reports:

  • Create reports depicting the trends and behaviours from analyzed data

Document:

  • Create documentation for work performed. Additionally, perform peer reviews of documentation of others' work.


Manage knowledge:

  • Consume and contribute to project-related documents, SharePoint libraries, and client universities.


Status Reporting:

  • Report status of tasks assigned; comply with project-related reporting standards and processes.


Skill Examples:

  • Analytical Skills: Ability to work with large amounts of data: facts, figures, and number crunching.
  • Communication Skills: Communicate effectively with a diverse population at various organization levels with the right level of detail.
  • Critical Thinking: Data analysts must review numbers, trends, and data to come up with original conclusions based on the findings.
  • Presentation Skills: Facilitate reports and oral presentations to senior colleagues.
  • Strong meeting facilitation skills as well as presentation skills.
  • Attention to Detail: Vigilant in the analysis to determine accurate conclusions.
  • Mathematical Skills to estimate numerical data.
  • Work in a team environment; proactively ask for and offer help.


Knowledge Examples:

  • Database languages such as SQL
  • Programming language such as R or Python
  • Analytical tools and languages such as SAS & Mahout.
  • Proficiency in MATLAB.
  • Data visualization software such as Tableau or Qlik.
  • Proficient in mathematics and calculations.
  • Proficiency with spreadsheet tools such as Microsoft Excel or Google Sheets.
  • DBMS
  • Operating Systems and software platforms
  • Knowledge regarding the customer domain and sub-domain where the problem is solved

Additional Comments:

  • Experience in Python, specifically in data engineering in a commercial setting (e.g. hands-on coding experience with pandas (or Dask, Vaex, etc.) and Airflow)
  • Good understanding of ETL/ELT patterns, idempotency, and other data engineering best practices (a sketch of an idempotent pipeline follows this list)
  • Extensive experience with data modelling (third normal form, star schemas, wide/tall projections); see the star-schema sketch after this list
  • Experience dealing with metadata and best practices for cataloguing datasets in Snowflake or other warehouses
  • Excellent SQL knowledge, including an understanding of how to write optimised SQL code, good general knowledge of different SQL engines, and the considerations they bring when optimising
  • Experience integrating data warehouses and data pipelines with data governance tools such as Collibra
  • Familiar
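
As a rough sketch of the Python, Airflow, and idempotency expectations listed above, the DAG below processes exactly one logical date per run and overwrites that date's output, so re-running a day does not duplicate rows. It assumes Airflow 2.4+ with the TaskFlow API; the paths, tables, and DAG name are hypothetical, and the Snowflake load step is only indicated.

    # Sketch: an idempotent daily extract-transform-load pipeline (names and paths are hypothetical).
    import pandas as pd
    import pendulum
    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=pendulum.datetime(2024, 1, 1, tz="UTC"), catchup=False)
    def daily_orders_etl():

        @task
        def extract(ds=None):
            # `ds` (the run's logical date) is injected by the TaskFlow API.
            df = pd.read_csv(f"/data/raw/orders_{ds}.csv")       # hypothetical source extract
            staged = f"/data/staging/orders_{ds}.parquet"
            df.to_parquet(staged, index=False)
            return staged

        @task
        def transform(staged_path: str) -> str:
            df = pd.read_parquet(staged_path)
            df["amount"] = df["amount"].fillna(0.0)              # illustrative cleanup rule
            curated = staged_path.replace("staging", "curated")
            df.to_parquet(curated, index=False)
            return curated

        @task
        def load(curated_path: str) -> None:
            # In practice: DELETE the target date's rows (or use MERGE) before inserting into
            # Snowflake; the delete-then-insert pattern is what keeps repeated runs idempotent.
            print(f"would load {curated_path} into Snowflake via delete-then-insert or MERGE")

        load(transform(extract()))

    daily_orders_etl()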
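
To illustrate the star-schema modelling mentioned above, the sketch below builds one fact table and two dimensions and runs a typical rollup query. SQLite stands in for Snowflake purely to keep the example self-contained, and the table and column names are hypothetical; in Snowflake the same shape applies, with clustering and pruning considerations driving how the fact table is laid out.

    # Sketch: a tiny star schema (one fact, two dimensions) and a rollup query across it.
    import sqlite3

    DDL = """
    CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, calendar_date TEXT, month TEXT);
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer_name TEXT, region TEXT);
    CREATE TABLE fact_orders  (date_key INTEGER, customer_key INTEGER, amount REAL);
    """

    ROLLUP = """
    SELECT d.month, c.region, SUM(f.amount) AS revenue
    FROM fact_orders f
    JOIN dim_date d     ON d.date_key = f.date_key
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY d.month, c.region
    """

    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)
    conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                     [(1, "2024-01-05", "2024-01"), (2, "2024-02-10", "2024-02")])
    conn.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                     [(1, "Acme", "EMEA"), (2, "Globex", "APAC")])
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)",
                     [(1, 1, 100.0), (1, 2, 40.0), (2, 1, 60.0)])
    for row in conn.execute(ROLLUP):
        print(row)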
