- 5+ years with AWS (preferred), GCP or Azure
- 7+ years of experience using standard methodologies to design, build, and support near real-time data pipelines and analytical solutions using Postgres, Redshift, BigQuery, Hadoop, Teradata, MS SQL Server, Talend, Informatica, Power BI, and/or SSIS
- 7+ years of experience using object-oriented languages (.NET, Java, Python) to deliver data for near real-time, streaming analytics
- 7+ years of experience working with partners to document business requirements and translate them into relational, non-relational, and dimensional data models using Erwin
- 7+ years of experience working on agile teams delivering data solutions
- 7+ years of experience developing master data management (MDM) solutions
- 7+ years of experience in delivering solutions on public cloud platforms (Google Cloud preferred)
- Experience writing automated unit, integration, and acceptance tests for data interfaces & data pipelines
- Ability to quickly comprehend the functions and capabilities of new technologies and identify the most appropriate uses for them
- Exceptional interpersonal skills, including teamwork, communication, and negotiation
Database Architect - India - CloudHire
Description
We are looking for a Database Architect.
Responsibilities
· Integrating multiple databases across modeling approaches, including Snowflake schema, star schema, network model, and others
· Working with multiple message buses (Kafka, IBM MQ) feeding targets such as Redshift, Postgres, and MongoDB
· Discovering appropriate workloads and using the appropriate database to deliver the performance and functionality needed
· Designing and deploying for scale, considering the types of requests the database must serve
· Performing database recovery under sequencing and time constraints
· Collaborating directly with business and technology stakeholders to define future-state business capabilities & requirements and translating those into transitional and target state data architectures.
· Partnering with platform architects to ensure implementations meet published platform principles, guidelines, and standards.
· Analyzing the current technology environment to detect critical deficiencies and recommend solutions for improvement.
· Designing, implementing, and maintaining data services, interfaces, and real-time data pipelines via the practical application of existing, new, and emerging technologies and data engineering techniques
· Developing continuous integration and continuous deployment for data pipelines, including automated unit and integration testing
· Using workflow management platforms such as Airflow
· Mentoring, motivating, and supporting the team to achieve organizational objectives and goals
· Advocating for agile practices to increase delivery throughput.
· Creating, maintaining, and ensuring consistency with published development standards
Requirements
Role - Remote
Salary Budget - up to 35 LPA
If your experience matches the job description, please share your updated resume at