
Data Architect

  • Posted 13 hours ago

Job Description

As a Data Architect, you will design and implement scalable, cloud-native data solutions that handle petabyte-scale datasets. You will lead architecture discussions, build robust data pipelines, and work closely with cross-functional teams to deliver enterprise-grade data platforms. Your work will directly support analytics, AI/ML, and real-time data processing needs across global clients.

Key Responsibilities:

  • Translate complex data and analytics requirements into scalable technical architectures.
  • Design and implement cloud-native architectures for real-time and batch data processing.
  • Build and maintain large-scale data pipelines and frameworks using modern orchestration tools (e.g., Airflow, Oozie).
  • Define strategies for data modeling, integration, metadata management, and governance.
  • Optimize data systems for cost-efficiency, performance, and scalability.
  • Leverage cloud services (AWS, Azure, GCP) including Azure Synapse, AWS Redshift, BigQuery, etc.
  • Implement data governance frameworks covering quality, lineage, cataloging, and access control.
  • Work with modern big data technologies (e.g., Spark, Kafka, Databricks, Snowflake, Hadoop).
  • Collaborate with data engineers, analysts, DevOps, and business stakeholders.
  • Evaluate and adopt emerging technologies to improve data architecture.
  • Provide architectural guidance in cloud migration and modernization projects.
  • Lead and mentor engineering teams and provide technical thought leadership.

Required Skills and Experience:

  • Bachelor's or Master's in Computer Science, Engineering, or related field.
  • 10+ years of experience in data architecture, engineering, or platform roles.
  • 5+ years of experience with cloud data platforms (Azure, AWS, or GCP).
  • Proven experience building scalable enterprise data platforms (data lakes/warehouses).
  • Strong expertise in distributed computing, data modeling, and pipeline optimization.
  • Proficiency in SQL and NoSQL databases (e.g., Snowflake, SQL Server, Cosmos DB, DynamoDB).
  • Experience with data integration tools like Azure Data Factory, Talend, or Informatica.
  • Hands-on experience with real-time streaming technologies (Kafka, Kinesis, Event Hub).
  • Expertise in programming languages such as Python, Java, or Scala, including their Spark APIs.
  • Deep understanding of data governance, security, and regulatory compliance (GDPR, HIPAA, CCPA).
  • Strong communication, presentation, and stakeholder management skills.
  • Ability to lead multiple projects simultaneously in an agile environment.


Job ID: 144904389
