Hang Seng

Senior Consultant Specialist / Apache Hadoop, Scala, Apache Spark, Python, ETL / MSS (BB-88B8D)


Job Description

The health and safety of our employees and candidates is very important to us. Due to the current situation related to the Novel Coronavirus (2019-nCoV), we’re leveraging our digital capabilities to ensure we can continue to recruit top talent at the HSBC Group. As your application progresses, you may be asked to use one of our digital tools to help you through your recruitment journey. If so, one of our Resourcing colleagues will explain how our video-interviewing technology will be used throughout the recruitment process and will be on hand to answer any questions you might have.

Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

Principal responsibilities

GB&M Big Data is a Global Markets and Banking initiative that is part of the Group Data Strategy to transform the way we govern, manage and use all our data to its full potential across HSBC. The assets being developed as part of GB&M Big Data are designed to support HSBC at a Group level. These assets include the creation of a Data Lake for GBM and CMB: a single virtual pool of client, transaction, product, instrument, pricing and portfolio data. Using the lake, the team delivers solutions to business requirements, offering Big Data as a service. The DevOps Data Engineer will be part of the core big data technology and design team.
The successful candidate will be entrusted to develop solutions and design ideas that enable the software to meet the acceptance and success criteria, and to work with architects and BAs to build data components on the data environment. Responsibilities include:

- Software design, Scala and Spark development, and automated testing of new and existing components in an Agile, DevOps and dynamic environment
- Promoting development standards, code reviews, mentoring and knowledge sharing
- Product and feature design, scrum story writing
- Data engineering and management
- Product support and troubleshooting
- Implementing tools and processes, handling performance, scale, availability, accuracy and monitoring
- Liaison with BAs to ensure that requirements are correctly interpreted and implemented
- Liaison with Testers to ensure that they understand how requirements have been implemented, so that they can be effectively tested
- Participation in regular planning and status meetings
- Input to the development process through involvement in Sprint reviews and retrospectives
- Input into system architecture and design
- Peer code reviews
- 3rd line support

Qualifications / Requirements

- Scala development and design using Scala 2.10+, or Java development and design using Java 1.8+, with an advanced understanding of the core features of these languages and when to use them
- Experience with most of the following technologies: Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services
- Sound working knowledge of the Unix/Linux platform
- Hands-on experience building data pipelines using Hadoop components: Hive, Spark, Spark SQL
- Experience developing HiveQL and UDFs for analysing semi-structured and structured datasets
- Experience with scheduling tools such as Airflow and Control-M
- Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible and Jenkins), and requirement management in JIRA
- Understanding of big data modelling techniques using relational and non-relational approaches
- Coordination between onsite and offshore teams
- Forward-thinking, independent, creative and self-sufficient; able to work with limited documentation, with exposure to testing complex multi-tiered integrated applications
- Ability to work with minimal supervision, on own initiative and on multiple tasks simultaneously
- Excellent communication, interpersonal and decision-making skills
- Strong team-working skills, working in global teams across multiple time zones
- Good knowledge of data warehouse concepts
- Knowledge of the Software Development Life Cycle (SDLC) and methodologies such as DevOps, Agile and Scrum
- Willingness to learn and to adapt quickly to changing requirements
- Ability to identify project issues, communicate them and assist in their resolution
- Ability to assist in continuous improvement efforts, enhancing project team methodology and performance
- Cooperative, team-focused attitude; a proactive self-starter
- Understanding or experience of cloud design patterns
- Experience with time-series/analytics databases such as Elasticsearch

You’ll achieve more when you join HSBC.

HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment.
We encourage applications from all suitably qualified persons irrespective of, but not limited to, gender, genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, disability, colour, national origin or veteran status. We consider all applications based on merit and suitability for the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by HSBC Software Development Centre

Posted: 5 days ago



Location: Pune, India


