No more applications are being accepted for this job
AWS Data Architect - Mumbai, India - Fractal
Job Description
Position: AWS Data Architect
AWS Practice
Responsibilities:
- Consult on, design, build, and operationalize large-scale enterprise data solutions using one or more AWS data and analytics services in combination with third-party tools - Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue, Snowflake, and Databricks.
- Analyse, re-architect, and re-platform on-premises data stores/databases to modern data platforms on the AWS cloud using AWS or third-party services.
- Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
- Design and optimize data models on the AWS cloud using AWS data stores such as Redshift, DynamoDB, RDS, and S3.
- Design and implement data engineering, ingestion, and curation functions on the AWS cloud using AWS-native services or custom programming.
- Perform detailed assessments of current-state data platforms and create an appropriate transition path to the AWS cloud as part of customer consultations and business proposals.
- Participate in client design workshops and provide trade-offs and recommendations for building solutions.
- Mentor other engineers in coding best practices and problem solving.
Requirements:
- 13+ years' experience in the industry.
- Bachelor's degree in Computer Science, Information Technology, or another relevant field.
- Experience with and knowledge of big data architectures, both on cloud and on premises.
- Proficiency in AWS collection services: Kinesis, Kafka, Database Migration Service.
- Proficiency in the main AWS storage services: S3, EBS, EFS.
- Proficiency in the main AWS compute services: EC2, Lambda, ECS, EKS.
- Proven experience in Java, Scala, Python, and shell scripting.
- Working experience with AWS Athena and Glue (PySpark), EMR, DynamoDB, Redshift, Kinesis, Lambda, Apache Spark, Databricks on AWS, and Snowflake on AWS.
- Proficient in AWS Redshift, S3, Glue, Athena, and DynamoDB.
- AWS certification: AWS Certified Solutions Architect and/or AWS Certified Data Analytics.
- Working experience with Agile methodology and Kanban.
- Good knowledge of SQL.
- Experience in building and delivering proofs of concept that address specific business needs, using the most appropriate techniques, data sources, and technologies.
- Experience partnering with executive stakeholders as a trusted advisor, as well as enabling technical implementers.
- Working experience migrating workloads from on-premises to cloud environments.
- Experience in monitoring distributed infrastructure using AWS tools or open-source ones such as CloudWatch, Prometheus, and the ELK stack would be a big advantage.