Engineer the Data Backbone for AI with goML
At goML, we build modern Generative AI, AI/ML, and Data Engineering solutions that help enterprises turn data into intelligent, scalable systems. Our mission is to bridge advanced data platforms with real-world business needs, enabling faster insights, smarter decisions, and AI-ready foundations.
We're looking for a Data Engineer (AWS) to join our growing team. In this role, you'll design and deliver robust, scalable data pipelines and platforms across multiple client engagements. If you enjoy solving complex data problems, working close to customers, and building production-grade data systems, you'll thrive here.
Why You, Why Now
As enterprises modernize their data stacks to support analytics, AI, and GenAI use cases, strong data engineering becomes mission-critical. This role is ideal for someone who enjoys owning data solutions end to end, translating business needs into technical systems, and delivering high-quality outcomes in fast-moving environments.
What You'll Do (Key Responsibilities)
First 30 Days: Context & Discovery
- Understand goML's data engineering frameworks, delivery standards, and AWS architecture patterns
- Deep dive into ongoing client projects, data models, and ETL workflows
- Collaborate with stakeholders to understand business requirements and data challenges
- Review existing data pipelines and identify opportunities for improvement
First 60 Days: Build & Deliver
- Lead the design and development of data solutions across client engagements
- Build and optimize ETL pipelines and data integration workflows
- Apply engineering best practices, including agile delivery, unit testing, and peer reviews
- Translate business requirements into clear technical designs and implementation plans
- Work closely with cross-functional teams to ensure smooth execution from development to deployment
- Create and maintain technical documentation, solution designs, and test plans
First 180 Days: Ownership & Scale
- Own end-to-end delivery of AWS-based data platforms for multiple clients
- Design scalable and flexible data architectures aligned with evolving business needs
- Drive deployment, user onboarding, and change management during project rollouts
- Ensure reliability, performance, and data quality across production systems
- Mentor junior engineers and influence data engineering best practices at goML
What You Bring (Qualifications & Skills)
Must-Have
- 3-7 years of experience in data engineering, preferably in consulting or large-scale solution delivery
- Strong understanding of ETL processes, data warehousing, and data integration
- Proficiency in SQL and PL/SQL, and in database development
- Experience designing scalable, flexible data solutions driven by business requirements
- Hands-on experience with AWS data services
- Experience with backend databases (e.g., Oracle) and/or ETL tools (e.g., Informatica)
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
Nice-to-Have
- Experience working with multiple client stakeholders and delivery teams
- Exposure to modern cloud-native data stacks and data lake architectures
- Familiarity with CI/CD practices for data pipelines
- Strong documentation and communication skills
Who You Are
- Comfortable leading data solution design and execution
- Methodical, detail-oriented, and quality-focused
- Able to translate complex business needs into clear technical outcomes
- A strong collaborator who works well across teams and clients
Why Work With Us
- Remote-first role with flexible collaboration
- Work on diverse client projects across industries
- High ownership and visibility in data platform delivery
- Opportunities to grow into solution architect or technical lead roles
- A culture driven by learning, ownership, and impact