PwC India

Data Architect

  • Posted 22 hours ago

Job Description & Summary:

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Job Position : Manager_Data Architect_Data and Analytics_Advisory_Bangalore

About the Role:

We are hiring a sharp, hands-on Data Architect to lead the design and implementation of scalable, high-performance data solutions across both traditional and cloud-based data platforms. This role demands deep expertise in PySpark, SQL, Python, and data modelling, along with a strong understanding of cloud platforms and modern data engineering practices.

What you will do:

  • Architect, design, and implement scalable end-to-end data solutions, ensuring performance and cost-efficiency.
  • Build and deploy batch and near-real-time use cases in cloud environments.
  • Develop PySpark and Python scripts for large-scale data processing and ETL workflows.
  • Write optimized, complex SQL for data transformation and analysis.
  • Optimize existing PySpark and SQL scripts over large-scale datasets (TBs) with a focus on performance and cost-efficiency.
  • Create and maintain data models, ensuring data quality and consistency.
  • Leverage AI/ML models in data transformations and analytics.
  • Implement data governance and security best practices in cloud environments.
  • Collaborate across teams to translate business requirements into robust technical solutions.

Must-have primary skills and experience

  • 7+ years of hands-on experience in Data Engineering
  • Strong command over SQL, Python, and PySpark for data manipulation and analysis
  • Deep experience with data analytics and warehousing, and their implementation in cloud environments (Azure/AWS)
  • Proficiency in data modeling techniques for cloud-based systems (Databricks, Snowflake)
  • Solid understanding of ETL/ELT processes and best practices in cloud architectures
  • Experience with dimensional modeling, star schemas, and data mart design
  • Performance optimization techniques for cloud-based data warehouses
  • Strong analytical thinking and problem-solving skills

Secondary Skills:

  • Airflow (Workflow Design and Orchestration)
  • Apache Kafka (real-time streaming)
  • CI/CD (Automation, GitOps, DevOps for Data)
  • Understanding of warehousing tools like Teradata, Netezza, etc.

Good-to-have knowledge, skills, and experience

  • Familiarity with data lake architectures and delta lake concepts
  • Data Warehouse experience using Databricks/Snowflake
  • Knowledge of data warehouse migration strategies to cloud
  • Experience with real-time data streaming technologies (e.g., Apache Kafka, Azure Event Hubs)
  • Exposure to data quality and data governance tools and methodologies
  • Understanding of
  • Certifications in Azure or AWS or Databricks

Experience

  • 7-10 years

Certifications

  • Spark Certified
  • Databricks DE Associate/Professional Certified

Good to Have:

  • Snowflake SnowPro Core Certified

Education qualification:

  • BE, B.Tech, ME, M.Tech, MBA, MCA (60% or above)

Job ID: 144153891
