
Data Architect


Job Description

We are seeking an experienced Data Architect to design, build, and optimize scalable, cloud-native data platforms using technologies such as Snowflake, Databricks, and Amazon Redshift. This role will be instrumental in defining enterprise data architecture, enabling reliable data movement (including migrations and real-time replication), and driving high-quality analytics and business intelligence across the organization.

Key Responsibilities
  • Design and implement scalable, secure, and high-performance data architectures on cloud platforms

  • Architect and optimize data solutions using Snowflake, Databricks, and/or Amazon Redshift

  • Lead data migration and replication strategies using tools such as AWS DMS, including full-load and CDC-based pipelines

  • Define and maintain conceptual, logical, and physical data models aligned with business needs

  • Build and manage robust ETL/ELT and data ingestion pipelines (batch and real-time)

  • Collaborate with stakeholders to translate business requirements into scalable data solutions

  • Establish and enforce data governance, security, and compliance frameworks

  • Optimize performance, cost, and reliability of data platforms

  • Integrate structured and unstructured data from multiple sources

  • Enable real-time and streaming architectures alongside traditional batch processing

  • Provide technical leadership and mentorship to data engineering teams
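To illustrate the full-load and CDC-based replication work mentioned above, here is a minimal, hypothetical sketch in plain Python of how change events from a tool like AWS DMS are applied to a target table. The record shapes and operation names are illustrative assumptions, not the exact DMS output format:

```python
# Illustrative CDC (Change Data Capture) apply step: a full load seeds the
# target table, then insert/update/delete events keep it in sync.
# Event format here is a simplified assumption, not the actual AWS DMS schema.

def apply_cdc_events(target, events):
    """Apply change events to a target table keyed by primary key 'id'."""
    for event in events:
        op, row = event["op"], event["row"]
        if op in ("insert", "update"):
            target[row["id"]] = row          # upsert semantics
        elif op == "delete":
            target.pop(row["id"], None)      # idempotent delete
    return target

# Full load seeds the target; CDC events replay subsequent source changes.
target = {1: {"id": 1, "name": "alice"}}
events = [
    {"op": "insert", "row": {"id": 2, "name": "bob"}},
    {"op": "update", "row": {"id": 1, "name": "alicia"}},
    {"op": "delete", "row": {"id": 2}},
]
print(apply_cdc_events(target, events))
# {1: {'id': 1, 'name': 'alicia'}}
```

In a real warehouse this apply step is typically a `MERGE` into Snowflake, Databricks, or Redshift rather than an in-memory dict, but the upsert-or-delete logic per change record is the same.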



Required Skills & Qualifications
  • 8+ years of experience in Data Engineering / Data Architecture

  • Strong expertise in at least one of Snowflake, Databricks, or Amazon Redshift

  • Hands-on experience with AWS DMS for database migration, replication, and Change Data Capture (CDC)

  • Experience with AWS Schema Conversion Tool (SCT) or similar migration tools is a plus

  • Strong knowledge of SQL, data modeling, and data warehousing concepts

  • Experience with ETL/ELT frameworks and tools (e.g., Apache Spark, Airflow, dbt, Informatica)

  • Hands-on experience with cloud platforms (AWS / Azure / GCP)

  • Familiarity with data lake / lakehouse architectures

  • Strong understanding of data governance, security, and compliance

  • Experience in performance tuning, query optimization, and cost management

  • Excellent problem-solving and stakeholder management skills


Preferred Qualifications
  • Experience with streaming technologies (Kafka, Kinesis, etc.)

  • Exposure to AI/ML data pipelines

  • Certifications in AWS / Azure / Snowflake / Databricks

  • Experience with DevOps/DataOps practices and CI/CD pipelines

  • Knowledge of BI tools (Tableau, Power BI, Looker, etc.)

Signs You May Be a Great Fit

Impact: Play a critical role in maintaining system uptime and delivering seamless user experiences.

Culture: Thrive in a fast-paced, collaborative environment focused on operational excellence.

Growth: Opportunity to expand into SRE, DevOps, or platform engineering roles.

Benefits: Competitive compensation, flexible work options, and continuous learning opportunities.



Job ID: 145649565
