Role: Associate Architect - Data
Experience: 7-13 years
Location: Bangalore
Work Mode: Work from Office, 5 days a week
Must Have Skills:
- 7+ years of experience in designing and implementing data warehouses and data lake/lakehouse on AWS.
- Proven success working with globally distributed teams in collaborative delivery environments.
- Deep working knowledge across key AWS Data & Analytics services, including:
  - Building large-scale data lake architectures on Amazon S3 and open table formats.
  - Implementing governance and cataloging through AWS Lake Formation.
  - Developing ETL and metadata frameworks using AWS Glue.
  - Leveraging AWS Lambda for serverless data processing.
  - Running distributed data workloads on Amazon EMR.
  - Enabling real-time data pipelines with Amazon Kinesis (Data Streams and Firehose).
  - Designing and optimizing schemas and query performance on Amazon Redshift, including Spectrum and Serverless features.
  - Querying large datasets interactively using Amazon Athena.
  - Managing operational databases on Amazon RDS and Aurora, across engines such as PostgreSQL and MySQL.
  - Integrating and migrating data using AWS DMS, Glue connectors, Amazon EventBridge, SNS, and SQS.
- Strong programming capability in Python and PySpark for large-scale data processing.
- Proficiency in writing complex SQL queries, analytical functions, and performance tuning for large datasets.
- Familiarity with NoSQL databases such as Amazon DynamoDB, MongoDB, or DocumentDB.
- Strong understanding of partitioning, indexing, scaling approaches, and query optimization techniques.
- Proven experience in architecting and implementing data pipelines using native AWS services in a modular and resilient manner.
- Solid understanding of data modeling concepts, including dimensional, normalized, and lakehouse patterns.
- Background in greenfield data implementation projects.
- Prior experience migrating data from on-premises to the cloud and processing large data volumes.
- AWS Certified Solutions Architect - Associate or AWS Certified Data Engineer - Associate certification.
Good to Have Skills:
- Experience working with customers and workloads in the financial domain.
- Experience with Databricks workspaces, notebooks, Delta Lake, and APIs.
- Experience with Java/Spring Boot and Scala.
- Exposure to IaC tools such as Terraform, and to CI/CD tooling.
- Orchestrating pipelines using AWS Step Functions, Amazon MWAA, or similar services.
- Managing security, monitoring, and compliance with AWS IAM, Secrets Manager, CloudWatch, CloudTrail, and KMS.
- Experience implementing industry best practices for data platforms and pipelines.
- Experience with streaming technologies such as Apache Kafka.
- Experience leveraging GenAI tools such as Claude or GitHub Copilot to accelerate migration workloads.
Other:
- Demonstrated problem-solving, communication, and organizational skills, a positive attitude, and a proven ability to negotiate and influence others to achieve desired results.
- Ability to speak in business terms and to communicate effectively both internally and externally.
- Ability to collaborate with cross-functional stakeholders such as developers, QA, and project managers to understand their requirements and implement solutions.
- Ability to clearly communicate the technical roadmap, challenges, and mitigation plans.