Improzo

Cloud Data Engineer

7-9 Years

Job Description

About Improzo

At Improzo (Improve + Zoe, meaning "life" in Greek), we believe in improving life by empowering our customers. Founded by seasoned industry leaders, we are laser-focused on delivering quality-led commercial analytical solutions to our clients. Our dedicated team of experts in commercial data, technology, and operations has been evolving and learning together since our inception. Here, you won't find yourself confined to a cubicle; instead, you'll be navigating open waters, collaborating with brilliant minds to shape the future. You will work with leading Life Sciences clients, seasoned leaders, and carefully chosen peers like you!

People are at the heart of our success, so we have carefully defined our CARE values framework and use it as our guiding light in everything we do. We CARE!

  • Customer-Centric: Client success is our success. Prioritize customer needs and outcomes in every action.
  • Adaptive: Agile and innovative, with a growth mindset. Pursue bold and disruptive avenues that push the boundaries of possibility.
  • Respect: Deep respect for our clients & colleagues. Foster a culture of collaboration and act with honesty, transparency, and ethical responsibility.
  • Execution: Laser-focused on quality-led execution; we deliver! Strive for the highest quality in our services, solutions, and customer experiences.

About The Role

We are looking for a skilled Cloud Data Engineer who can design robust data architectures, optimize cloud databases, and build scalable pipelines that power our product's data ecosystem. The ideal candidate blends strong PostgreSQL expertise with hands-on cloud RDBMS experience to ensure high performance, security, and reliability. This role involves creating end-to-end data flows, integrating systems using modern connector tools, and developing Python-based ETL/ELT processes. The engineer will collaborate closely with application and analytics teams to maintain consistent, accurate, and governed data. This position is critical to ensuring seamless data operations across our microservices-driven B2B platform.
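
For context on what connector-based integration can look like in practice, here is a minimal, illustrative sketch of registering a Debezium PostgreSQL source connector with a Kafka Connect cluster over its REST API. The endpoint, credentials, and table names are placeholder assumptions, and exact configuration keys vary by Debezium version; this is not a description of Improzo's actual setup.

    # Illustrative only: register a hypothetical Debezium PostgreSQL source
    # connector with Kafka Connect via its REST API. Hosts, credentials and
    # table names are placeholders; config keys differ between Debezium 1.x
    # ("database.server.name") and 2.x ("topic.prefix").
    import requests

    CONNECT_URL = "http://localhost:8083/connectors"  # assumed Kafka Connect endpoint

    connector = {
        "name": "orders-postgres-cdc",
        "config": {
            "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
            "plugin.name": "pgoutput",              # logical decoding plugin
            "database.hostname": "db.internal",     # placeholder host
            "database.port": "5432",
            "database.user": "replicator",
            "database.password": "change-me",
            "database.dbname": "orders_db",
            "table.include.list": "public.orders",  # capture a single table
            "topic.prefix": "orders",               # Debezium 2.x topic naming
        },
    }

    resp = requests.post(CONNECT_URL, json=connector, timeout=30)
    resp.raise_for_status()
    print("Connector registered:", resp.json()["name"])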

Key Responsibilities

  • Design, create, and maintain PostgreSQL database schemas aligned with microservices architecture and business requirements.
  • Perform DBA-like responsibilities including schema evolution, indexing, partitioning, performance tuning, security, and backup strategies.
  • Build scalable and reliable data pipelines for ingesting, transforming, and moving data across systems.
  • Work with data connector tools (e.g., Kafka Connect, DBT, Fivetran, Debezium, Airbyte) to sync and transfer data across sources and targets.
  • Develop and maintain Python-based ETL/ELT and data processing scripts (a minimal sketch follows this list).
  • Collaborate with application and analytics teams to ensure data availability, consistency, and reliability.
  • Manage and optimize cloud database deployments (AWS RDS/Aurora, Azure Database for PostgreSQL, GCP Cloud SQL, etc.).
  • Ensure data governance, integrity, and compliance with enterprise standards.
  • Diagnose and resolve production issues related to data, DB performance, or pipelines.
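
As a rough illustration of the Python-based ETL/ELT work described in this list, the sketch below loads a hypothetical CSV extract into a PostgreSQL staging table using Pandas and SQLAlchemy. The file name, table name, and DATABASE_URL environment variable are assumptions for illustration only.

    # Illustrative only: a minimal extract-transform-load job in Python.
    # Source file, target table, and connection string are hypothetical.
    import os

    import pandas as pd
    from sqlalchemy import create_engine

    def run_etl() -> None:
        # Extract: read a raw CSV export (hypothetical source).
        raw = pd.read_csv("orders.csv", parse_dates=["order_date"])

        # Transform: drop incomplete rows and normalise column names.
        clean = raw.dropna(subset=["order_id", "customer_id"])
        clean.columns = [c.strip().lower() for c in clean.columns]

        # Load: append the cleaned frame into a PostgreSQL staging table.
        engine = create_engine(os.environ["DATABASE_URL"])
        clean.to_sql("stg_orders", engine, if_exists="append", index=False)

    if __name__ == "__main__":
        run_etl()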

Qualifications

  • Bachelor's or Master's degree in a quantitative field such as computer science, statistics, or mathematics.
  • 7-9 years of experience as a Data Engineer working on cloud-based data systems, data management, reporting, or big data-driven projects.

Technical Stack / Skills

  • Strong hands-on experience with PostgreSQL (schema design, triggers, views, indexing, query tuning, etc.); a brief schema sketch follows this list.
  • Proven expertise in managed RDBMS offerings on cloud platforms (AWS RDS/Aurora, Azure Database for PostgreSQL, GCP Cloud SQL, etc.).
  • Practical experience in building and maintaining data pipelines and data ingestion jobs.
  • Experience with Python for data processing (Pandas, SQLAlchemy, Airflow, etc.).
  • Understanding of microservices-oriented data modeling.
  • Familiarity with data connector/integration tools like Debezium, Kafka Connect, Airbyte, Fivetran, etc.
  • Knowledge of database monitoring and alerting tools.
  • Good understanding of cloud systems, networking basics, and security for data workloads.
  • Good to have: exposure to Life Sciences or Healthcare data models.
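
As a small illustration of the PostgreSQL schema-design and tuning skills listed above, the sketch below creates a hypothetical range-partitioned table with a supporting index, applied from Python via SQLAlchemy. Table, column, and index names are assumptions, not an actual schema.

    # Illustrative only: apply PostgreSQL DDL (declarative range partitioning
    # plus a composite index) from Python. Names are hypothetical.
    import os

    from sqlalchemy import create_engine, text

    DDL = """
    CREATE TABLE IF NOT EXISTS events (
        event_id    bigint      NOT NULL,
        customer_id bigint      NOT NULL,
        created_at  timestamptz NOT NULL
    ) PARTITION BY RANGE (created_at);

    CREATE TABLE IF NOT EXISTS events_2025_q1
        PARTITION OF events
        FOR VALUES FROM ('2025-01-01') TO ('2025-04-01');

    CREATE INDEX IF NOT EXISTS idx_events_customer
        ON events (customer_id, created_at);
    """

    def apply_schema() -> None:
        engine = create_engine(os.environ["DATABASE_URL"])
        with engine.begin() as conn:  # transactional DDL, commits on success
            conn.execute(text(DDL))

    if __name__ == "__main__":
        apply_schema()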

Benefits

  • Competitive salary and benefits package.
  • Opportunity to work on cutting-edge tech projects transforming the life sciences industry.
  • Collaborative and supportive work environment.
  • Opportunities for professional development and growth.

Skills: Kafka Connect, Python, GCP Cloud SQL, AWS RDS, RDBMS, Cloud SQL, PostgreSQL, Azure PostgreSQL, Aurora, Debezium

Job ID: 135854037
