Snowflake, dbt, Python, Airflow - Kochi, India - UST
Description
Role Proficiency:
Provide expertise on data analysis techniques using software tools. Under supervision, streamline business processes.
Outcomes:
- Design and manage the reporting environment, including data sources, security, and metadata.
- Provide technical expertise on data storage structures, data mining, and data cleansing.
- Support the data warehouse in identifying and revising reporting requirements.
- Support initiatives for data integrity and normalization.
- Assess, test, and implement new or upgraded software. Assist with strategic decisions on new systems. Generate reports from single or multiple systems.
- Troubleshoot the reporting database environment and associated reports.
- Identify and recommend new ways to streamline business processes.
- Illustrate data graphically and translate complex findings into written text.
- Locate results to help clients make better decisions. Solicit feedback from clients and build solutions based on feedback.
- Train end users on new reports and dashboards.
Measures of Outcomes:
- Quality: number of review comments on code written.
- Data consistency and data quality.
- Illustrates data graphically; translates complex findings into written text.
- Number of results located to help clients make informed decisions.
- Number of business processes changed as a result of analysis.
- Number of Business Intelligence dashboards developed.
- Number of productivity standards defined for project
Outputs Expected:
Determine Specific Data needs:
- Work with departmental managers to outline the specific data needs for each business method analysis project.
Critical business insights:
- Mines the business's database in search of critical business insights; communicates findings to relevant departments.
Code:
- Creates efficient and reusable SQL code meant for the improvement and analysis of data.
- Creates efficient and reusable code. Follows coding best practices.
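The "efficient and reusable SQL code" expectation above can be illustrated with a single parameterised query that is reused rather than rebuilt per call (a minimal sketch using Python's built-in sqlite3; the `sales` table and its columns are hypothetical):

```python
import sqlite3

# Hypothetical in-memory table, for illustration only
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 100.0), ("EMEA", 50.0), ("APAC", 75.0)])

# One parameterised statement, reused for any region, instead of
# string-formatting a fresh query each time (safer and cacheable)
TOTAL_BY_REGION = "SELECT SUM(amount) FROM sales WHERE region = ?"

def total_for(region: str) -> float:
    (total,) = conn.execute(TOTAL_BY_REGION, (region,)).fetchone()
    return total

print(total_for("EMEA"))  # 150.0
```

Binding values with `?` placeholders keeps the statement reusable across inputs and avoids SQL injection, which is what "reusable" typically means in practice here.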
Create/Validate Data Models:
- Builds statistical models; diagnoses and improves the performance of these models over time.
Predictive analytics:
- Seeks to determine likely outcomes by detecting tendencies in descriptive and diagnostic analysis.
Prescriptive analytics:
- Attempts to identify what business action to take.
Code Versioning:
- Organize and manage the changes and revisions to code. Use a version control tool, for example Git.
Create Reports:
- Create reports depicting the trends and behaviours from analyzed data.
Document:
- Create documentation for work performed.
Manage knowledge:
- Consume and contribute to project-related documents, libraries, and client universities.
Status Reporting:
- Report status of tasks assigned.
Skill Examples:
- Analytical Skills: ability to work with large amounts of data: facts, figures, and number crunching.
- Communication Skills: Communicate effectively with a diverse population at various organization levels with the right level of detail.
- Critical Thinking: Data Analysts must review numbers, trends, and data to come up with original conclusions based on the findings.
- Presentation Skills: deliver reports and oral presentations to senior colleagues.
- Strong meeting facilitation skills as well as presentation skills.
- Attention to Detail: Vigilant in the analysis to determine accurate conclusions.
- Mathematical Skills: estimate and work with numerical data.
- Ability to work in a team environment.
Knowledge Examples:
- Database languages such as SQL
- Programming languages such as R or Python
- Analytical tools and languages such as SAS & Mahout.
- Proficiency in MATLAB.
- Data visualization software such as Tableau or Qlik.
- Proficient in mathematics and calculations.
- Efficiency with spreadsheet tools such as Microsoft Excel or Google Sheets
- DBMS
- Operating Systems and software platforms
- Experience in Python, specifically hands-on data engineering in a commercial setting (pandas, or alternatives such as Dask or Vaex; Airflow).
- Good understanding of ETL/ELT patterns, idempotency, and other data engineering best practices.
- Extensive experience with data modelling (third normal form, star schemas, wide/tall projections).
- Experience dealing with metadata and best practices for cataloguing datasets in Snowflake or other warehouses.
- Excellent SQL knowledge, including an understanding of how to write optimised SQL code, good general knowledge of different SQL engines, and the considerations they bring when optimising.
- Experience integrating data warehouses/data pipelines with data governance tools such as Collibra.
- Familiar
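The idempotency requirement in the list above can be sketched as a delete-then-insert load keyed on a partition, so re-running a pipeline for the same day never duplicates rows (a minimal illustration using Python's built-in sqlite3; the `fact_orders` table and `load_date` partition column are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fact_orders (load_date TEXT, order_id INT, amount REAL)")

def load_partition(load_date, rows):
    """Idempotent load: replace the whole partition atomically,
    so retries and re-runs for the same load_date are safe."""
    with conn:  # one transaction: delete + insert commit together
        conn.execute("DELETE FROM fact_orders WHERE load_date = ?",
                     (load_date,))
        conn.executemany(
            "INSERT INTO fact_orders VALUES (?, ?, ?)",
            [(load_date, oid, amt) for oid, amt in rows],
        )

# Running the same load twice leaves the same result
load_partition("2024-01-01", [(1, 9.99), (2, 4.50)])
load_partition("2024-01-01", [(1, 9.99), (2, 4.50)])
(count,) = conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone()
print(count)  # 2, not 4
```

The same pattern appears in warehouse pipelines (e.g. Airflow tasks writing to Snowflake) as `DELETE`/`INSERT` or `MERGE` scoped to the run's partition.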