Role: Data Engineer
Location: Remote
Experience: 5+ Years
Required Skills: AWS Glue, AWS Step Functions, Python, SQL, PySpark
Requirements:
- Python & PySpark: Experience building ETL pipelines for structured and semi-structured data.
- Airflow: Experience developing DAGs and familiarity with pipeline orchestration.
- AWS S3 & Glue: Experience loading data into cloud storage and applying basic transformations.
- Git & Jupyter: Comfortable prototyping in Jupyter notebooks and using Git for version control.
- Docker/Kubernetes: Foundational knowledge of containerization and orchestration; able to support Docker-based packaging and work with Kubernetes-managed deployments in a team environment.
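To illustrate the kind of ETL work this role centers on, here is a minimal sketch in plain Python (stdlib only, standing in for a PySpark/Glue job): extract hypothetical raw records, transform them by cleaning and casting fields, and load them into a SQL table. All table and field names are illustrative, not from an actual pipeline.

```python
import sqlite3

# Extract: hypothetical semi-structured source records
raw = [
    {"id": 1, "amount": "12.50", "region": "us-east"},
    {"id": 2, "amount": "7.25", "region": "us-west"},
    {"id": 3, "amount": None, "region": "us-east"},  # missing value to drop
]

def transform(rows):
    # Drop rows with missing amounts and cast amount strings to floats
    return [(r["id"], float(r["amount"]), r["region"])
            for r in rows if r["amount"] is not None]

def load(rows, conn):
    # Load cleaned rows into a SQL table
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 19.75
```

In a production AWS setting, the extract step would typically read from S3, the transform would run as a PySpark job in Glue, and orchestration would be handled by Step Functions or Airflow.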