Job Description:
We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in API Integration, Python, and AWS. The ideal candidate will have a passion for data engineering and a proven track record of developing robust data pipelines and platforms.
Key Responsibilities:
- Develop and maintain ETL/ELT data pipelines and API integrations (FastAPI preferred).
- Design and implement data platforms/products and data warehouses.
- Develop data-intensive solutions on AWS, Azure, or GCP for analytics workloads.
- Design both ETL/ELT processes for batch processing and data streaming architectures for real-time or near real-time data ingestion and processing.
- Work with various database technologies (e.g., MySQL, PostgreSQL, MongoDB) and data warehouses (e.g., Redshift, BigQuery, Snowflake).
- Utilize cloud-based data engineering technologies (e.g., Kafka, Google Cloud Pub/Sub, Apache Airflow, AWS Glue).
- Develop conceptual, logical, and physical data models using ERDs.
- Create dashboards and data visualizations using tools such as Tableau and Amazon QuickSight.
Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related technical discipline.
- 7+ years of hands-on experience in data engineering.
- 4+ years of experience in developing data-intensive solutions on AWS, Azure, or GCP.
- 3+ years of experience in designing ETL/ELT processes and data streaming architectures.
- 3+ years of experience with database technologies and data warehouses.
- 5+ years of programming experience in Python.
- Proficiency in dashboard/BI and data visualization tools (e.g., Tableau, Amazon QuickSight).
Skills and Attributes:
- Thrives in dynamic, cross-functional team environments.
- Possesses a team-first mindset, values diverse perspectives, and contributes to a collaborative work culture.
- Approaches challenges with a positive and can-do attitude.
- Willing to challenge the status quo and take appropriate risks to drive performance.
- A passionate problem solver with high learning agility.