
R Systems

GCP Data Engineer (Only Immediate)

  • Posted 21 hours ago

Job Description

About the role

As a Software Engineer II - Data, you will contribute to the design and development of data systems including pipelines, APIs, analytics, AI and machine learning at scale. You'll be a core part of our Data Products team, building and maintaining production-grade pipelines and platform components that power business and product outcomes. This role emphasizes hands-on development, reliability, and team collaboration.

Role requires:

  • A proactive and collaborative approach to problem-solving, with a mindset focused on outcomes, learning, and iteration.
  • The ability to manage multiple priorities or projects simultaneously, while meeting deadlines and maintaining high technical standards.
  • Comfort operating within modern cloud-native architectures and tooling.
  • Commitment to writing clean, testable, and maintainable code.
  • An understanding of how your work contributes to broader team and business goals.
  • Willingness to ask questions, challenge assumptions, and share ideas.

Experience & Technical Requirements:

  • 3-6 years of development experience, with production systems in cloud environments.
  • Proficient in Python and/or Golang, and SQL for data processing, transformation, and orchestration tasks.
  • Experience with at least one modern cloud platform (e.g., GCP, AWS, or Azure).
  • Experience developing REST or GraphQL APIs and internal data access layers.
  • Experience building and maintaining ETL/ELT pipelines or API-driven data services.
  • Experience with source control (e.g., Git), automated testing, and CI/CD practices.
  • Exposure to orchestration tooling such as n8n, Cloud Scheduler, Airflow, Step Functions, or similar.
  • Understanding of data modeling concepts and cloud warehousing (e.g., Databricks, BigQuery, Snowflake, or similar).
  • Familiarity with Kafka, Pub/Sub, or other event-based systems.
  • Awareness of data quality, observability, and governance principles in engineering contexts.
  • Strong written and verbal communication skills, with an ability to share context with both technical peers and cross-functional partners.
  • Experience working with containerized environments (e.g., Docker, Kubernetes).
  • Exposure to infrastructure-as-code, especially for deploying and managing data workflows.
  • Hands-on use of BI tools such as Looker or Tableau.
  • A growth mindset and interest in mentoring junior peers or learning from senior engineers.
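To illustrate the kind of ETL/ELT and SQL work listed above, here is a minimal, hedged sketch of an extract-transform-load flow in Python. It uses the standard-library sqlite3 module as a stand-in for a cloud warehouse such as BigQuery; the record shape, table name, and column names are hypothetical and chosen purely for illustration.

```python
import sqlite3

# Hypothetical raw records, standing in for an API or file extract.
raw_events = [
    {"user_id": 1, "event": "click", "value": "3"},
    {"user_id": 2, "event": "click", "value": "bad"},  # malformed row
    {"user_id": 1, "event": "view",  "value": "7"},
]

def transform(rows):
    """Keep only rows whose value parses as an integer."""
    for row in rows:
        try:
            yield (row["user_id"], row["event"], int(row["value"]))
        except ValueError:
            # In production this row would go to a dead-letter sink
            # for data-quality review rather than being silently dropped.
            continue

def load(rows, conn):
    """Create the target table if needed and bulk-insert the rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (user_id INT, event TEXT, value INT)"
    )
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(raw_events), conn)

# Downstream SQL over the loaded data, as a warehouse query would do.
total = conn.execute("SELECT SUM(value) FROM events").fetchone()[0]
print(total)  # 3 + 7 = 10
```

In a real pipeline the same extract/transform/load stages would typically be scheduled by an orchestrator such as Airflow or Cloud Scheduler, with the load step targeting the warehouse client library instead of sqlite3.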

Job ID: 135319327