Integration Architect - Bengaluru, India - ITC Infotech

    Description

    Hi,

    We are looking for talented individuals who are passionate about Data Science and Generative AI. A detailed job description is provided below for your reference.

    Responsibilities:

    As a Data Architect specializing in system integration with additional skills in AI/ML integration, you will play a critical role in designing, developing, and implementing complex data systems. You will be responsible for creating blueprints for data management systems to integrate, centralize, protect, and maintain data sources.

    • Provide Software Systems Architecture and Enterprise Data Architecture consulting, in tandem with the Business Consultants, owning the end-to-end cycle from scoping the problem and designing the approach to ensuring complete delivery.
    • Develop and maintain scalable and reliable data architecture to support advanced analytics and data science initiatives.
    • Design, build, install, test, and maintain highly scalable data management systems.
    • Integrate disparate systems, data sources, and software to create a cohesive and comprehensive data environment.
    • Stay abreast of advancements and trends in data architecture, system integration, and AI.

    Required Skill Set:

    • Demonstrated expertise as an architect in designing and developing enterprise applications and architecture in production, preferably with leadership experience.
    • In-depth understanding of database structure principles and data administration.
    • Experience with development and integration of software/web applications.
    • Experience with API development and integration (REST, GraphQL, etc.).
    • Experience with system integration for tools such as Jira, ServiceNow (SNOW), Confluence, etc.
    • Strong expertise in SQL and NoSQL databases, data modelling, and ETL tools.
    • Proficiency in programming languages such as Python, Scala, or Java.
    • Strong experience with big data tools (e.g., Hadoop, Spark, PySpark, Kafka).
    • Expertise in modern data warehousing technologies, specifically Snowflake and Databricks.
    • Knowledge of cloud services (Azure, AWS, GCP) and their data-related services.
    • Excellent problem-solving, team management, and communication skills.
    • Should have handled project management and client-facing roles.
    • Ability to think strategically and analytically to effectively assess each opportunity.
    • Excellent leadership and time-management skills.
    • Willingness to travel on short assignments on short notice.

    If interested, please share your updated resume to