Job Description: Senior AWS Data Engineer
Experience Required: 8+ Years (Developer Standard)
Relevant Experience: 8 Years
Location: Bangalore / Hyderabad
Qualifications
- Bachelor's degree in Computer Science, Software Engineering, MIS, or equivalent.
- 8+ years as a Data Engineer on the AWS stack with hands-on DevOps experience.
- AWS Solutions Architect or AWS Developer Certification (mandatory).
Technical Expertise Required
- Strong hands-on experience with AWS services: CloudFormation, S3, Athena, Glue/Glue DataBrew, EMR/Spark, RDS, Redshift, DataSync, DMS, DynamoDB, Lambda, Step Functions, IAM, KMS, Secrets Manager, EventBridge, EC2, SQS, SNS, Lake Formation, CloudWatch, CloudTrail.
- Experience implementing streaming and orchestration solutions using Kinesis, Amazon MWAA (Managed Workflows for Apache Airflow), and Amazon MSK (Managed Streaming for Apache Kafka).
- Strong background building AWS Data Lake and Data Warehouse solutions.
- Experience designing and implementing ETL/ELT pipelines.
- Knowledge of role-based access control (RBAC) implementation using AWS IAM and the Redshift RBAC model.
- Experience building CI/CD pipelines using CloudFormation and Jenkins.
- Strong programming experience: Python, Shell scripting, SQL.
- Expertise in developing automated data pipelines (see the Python sketch after this list) from:
  - Relational databases: RDS, Aurora, Redshift
  - REST APIs into S3 / RDS / Aurora / Redshift
- Experience with modern source control tools (e.g., Bitbucket).
- Experience with DevOps and development tools: Jenkins, CloudBees, SonarQube, Artifactory, Maven, MSBuild, Gradle, Docker.
- Atlassian tools: Jira, Confluence.
- Experience with Infrastructure as Code using CloudFormation.
- Experience integrating SonarQube/security scans and test automation scripts into CI/CD pipelines.
- Experience in DevOps/QA environments, with a focus on pipeline creation and optimization.
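
As an illustration of the hands-on depth expected for the REST-API-to-S3 pipeline work above, here is a minimal Python sketch of one ingestion step. The endpoint URL, bucket name, and key layout are hypothetical placeholders, not part of any specific platform:

```python
import json
from datetime import datetime, timezone

import boto3
import requests

# Hypothetical endpoint and bucket; substitute real values.
API_URL = "https://api.example.com/v1/orders"
BUCKET = "example-raw-zone"


def ingest_api_to_s3() -> str:
    """Pull one page of records from a REST API and land it in S3 as JSON."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    records = response.json()

    # Partition the raw landing zone by ingestion date (a common lake layout).
    now = datetime.now(timezone.utc)
    key = f"orders/ingest_date={now:%Y-%m-%d}/orders_{now:%H%M%S}.json"

    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
    )
    return key


if __name__ == "__main__":
    print(f"Wrote s3://{BUCKET}/{ingest_api_to_s3()}")
```

Date-partitioned keys like this keep the raw zone queryable by Athena and crawlable by Glue without reprocessing the whole bucket.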
Key Responsibilities
- Design, develop, and implement data ingestion and processing pipelines on AWS.
- Develop end-to-end AWS-based data lake and data warehouse solutions.
- Implement streaming and orchestration solutions using Kinesis, Airflow, or Kafka.
- Build and enhance CI/CD pipelines for the EDP Platform using Jenkins & CloudFormation.
- Implement automation for data ingestion from databases, file systems, NAS, and REST APIs.
- Manage IAM roles, RBAC strategies, and security best practices across AWS services.
- Integrate DevOps tools (SonarQube, Artifactory, CloudBees) into the CI/CD strategy.
- Collaborate with engineering, QA, and DevOps teams to streamline delivery.
- Maintain clear documentation including exceptions, issue logs, and action plans.
- Administer and manage DevOps tools in SaaS environments.
- Build Jenkins pipelines (Pipeline-as-Code) and maintain shared libraries.
- Monitor and optimize system performance using AWS monitoring tools such as CloudWatch (a minimal sketch follows this list).
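
By way of example, the monitoring responsibility above typically involves publishing custom pipeline metrics and alarming on them. A minimal boto3 sketch, assuming a hypothetical metric namespace (DataPlatform/Pipelines) and pipeline name:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Hypothetical namespace; adjust to the platform's conventions.
NAMESPACE = "DataPlatform/Pipelines"


def record_pipeline_duration(pipeline: str, seconds: float) -> None:
    """Emit a custom duration metric so pipeline runtimes can be graphed and alarmed on."""
    cloudwatch.put_metric_data(
        Namespace=NAMESPACE,
        MetricData=[{
            "MetricName": "RunDurationSeconds",
            "Dimensions": [{"Name": "Pipeline", "Value": pipeline}],
            "Value": seconds,
            "Unit": "Seconds",
        }],
    )


def alarm_on_slow_runs(pipeline: str, threshold_seconds: float) -> None:
    """Alarm when the average run duration breaches the threshold
    for three consecutive five-minute periods."""
    cloudwatch.put_metric_alarm(
        AlarmName=f"{pipeline}-slow-runs",
        Namespace=NAMESPACE,
        MetricName="RunDurationSeconds",
        Dimensions=[{"Name": "Pipeline", "Value": pipeline}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=3,
        Threshold=threshold_seconds,
        ComparisonOperator="GreaterThanThreshold",
    )


if __name__ == "__main__":
    record_pipeline_duration("orders-ingest", 412.0)
    alarm_on_slow_runs("orders-ingest", threshold_seconds=600.0)
```

Requiring several evaluation periods before alarming avoids paging on a single slow run.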
Mandatory Skills
Primary Skills
- Core AWS Data Engineering & DevOps: Python, SQL, CI/CD (Jenkins, CloudFormation), ETL/ELT, Data Lake
- AWS Data Services Expertise: IAM, Glue, Glue Crawlers, Glue DataBrew, Step Functions, Lambda, Redshift, SQS, SNS, EventBridge, Athena, Lake Formation
- Version Control: Bitbucket / GitHub
Secondary Skills
- Streaming technologies: Kinesis, Kafka (see the sketch after this list)
- Build and quality tooling: Docker, SonarQube, Maven, Groovy
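
For the streaming skills listed above, here is a minimal Python sketch of a Kinesis producer. The stream name is a hypothetical placeholder and the stream is assumed to already exist:

```python
import json

import boto3

kinesis = boto3.client("kinesis")

# Hypothetical stream name; the stream must already exist.
STREAM = "example-events"


def publish_event(event: dict, partition_key: str) -> None:
    """Write one event to a Kinesis data stream. Records sharing a
    partition key land on the same shard, preserving their order."""
    kinesis.put_record(
        StreamName=STREAM,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=partition_key,
    )


if __name__ == "__main__":
    publish_event({"order_id": "A-1001", "status": "shipped"}, partition_key="A-1001")
```

Choosing a business key (here the order ID) as the partition key keeps per-entity ordering while still spreading load across shards.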
Nice-to-Have Skills
- CloudBees
- QA/test automation integration in CI/CD pipelines
- Agile project management exposure
- Strong documentation skills
- Continuous learning mindset