Job Description

About the Role

The Data Architect within the Data & Analytics function will collaborate with Data Scientists to build and support generative AI solutions across text, audio, image, and tabular data domains. The role involves managing large volumes of structured and unstructured data, ensuring efficient storage, retrieval, and augmentation to power GenAI models. The ideal candidate will be proficient in building scalable data pipelines, optimizing data architecture, and maintaining high standards of data reliability and governance.

Key Responsibilities

Primary Responsibilities

  • Build and maintain data engineering pipelines, with a focus on unstructured data.
  • Conduct requirements gathering and project scoping sessions with business users, subject matter experts, and executive stakeholders.
  • Design, build, and optimize data architecture and ETL pipelines for use by Data Scientists and GenAI products.
  • Manage the full data lifecycle: ingestion, transformation, and consumption.
  • Ensure data reliability, integrity, and governance across all systems.
  • Work with APIs to enable seamless data integration and usability.
  • Create technical design documentation for data pipelines and projects.
  • Debug technical issues and manage code versioning using Git.
  • Demonstrate hands-on experience with big data infrastructure such as MapReduce, Hive, HDFS, YARN, HBase, MongoDB, and DynamoDB.

Secondary Responsibilities

  • Apply machine learning and predictive analytics techniques where applicable.
  • Leverage domain knowledge in banking or financial services to enhance data solutions.
  • Present data insights using effective storytelling and visualization techniques.

What We Are Looking For

Education

  • Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.

Experience

  • Proven experience in building and managing data pipelines and architectures.
  • Hands-on experience with big data technologies and cloud platforms (AWS, GCP, or Azure).
  • Exposure to GenAI applications and working with unstructured data formats.
  • Experience in the banking or financial services industry is a plus.

Skills and Attributes

  • Strong programming and debugging skills.
  • Proficiency in data architecture, ETL design, and pipeline optimization.
  • Familiarity with API integration and cloud services.
  • Excellent documentation and communication skills.
  • Strong problem-solving mindset and attention to detail.
  • Ability to deliver high-quality outputs under tight timelines.
  • Proficiency in data storytelling and presentation techniques.

Key Success Metrics

  • Timely and high-quality delivery of all assigned tasks.
  • Effective tracking and reporting of deliverables.
  • Ability to adapt and thrive in a fast-paced, evolving tech environment.

More Info

Job ID: 137854311