
Parexel

Senior Data Engineer


Job Description

Key Responsibilities

  • Architect and implement end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake for large-scale data ingestion, transformation, and storage.
  • Design, build, modify, and support data pipelines on Microsoft Azure data PaaS services, leveraging Databricks and Power BI in a medallion architecture setting.
  • If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders.
  • Demonstrate a strong grasp of, and expertise with, test-driven development and continuous integration processes.
  • Analysis and Design: Convert high-level designs to low-level designs and implement them.
  • Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalize work plans.
  • Create and run unit and integration tests on all code throughout the development lifecycle.
  • Benchmark application code proactively to prevent performance and scalability concerns.
  • Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management.
  • Support and Troubleshooting: Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments.
  • Familiarity with Power BI and Reltio is advantageous but not required.
  • Collaborate with BI teams to ensure data models are optimized for reporting in Power BI, with a focus on performance and usability.
  • Establish data governance, quality, and security controls, ensuring compliance with GDPR, HIPAA, and global clinical data regulations.
  • Mentor and guide junior engineers, fostering technical excellence and knowledge sharing.
  • Drive automation and CI/CD practices within data engineering pipelines, integrating with version control and deployment workflows.
  • Work closely with Data Architects, Business Analysts, and Product Owners to translate business needs into technical solutions.
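The medallion architecture mentioned in the responsibilities above layers data as bronze (raw), silver (cleaned), and gold (reporting-ready). The following is a minimal illustrative sketch of that layering in plain Python; it stands in for the Databricks/PySpark transformations the role describes, and the record fields (`patient_id`, `value`) are hypothetical.

```python
# Illustrative medallion-architecture layering: bronze -> silver -> gold.
# Plain Python stands in for Databricks/PySpark; field names are hypothetical.

def to_silver(bronze_records):
    """Clean and deduplicate raw (bronze) records into the silver layer."""
    seen = set()
    silver = []
    for rec in bronze_records:
        key = rec.get("patient_id")
        if key is None or key in seen:
            continue  # drop malformed or duplicate rows
        seen.add(key)
        silver.append({"patient_id": key, "value": float(rec["value"])})
    return silver

def to_gold(silver_records):
    """Aggregate silver records into a reporting-ready (gold) summary."""
    total = sum(r["value"] for r in silver_records)
    return {"row_count": len(silver_records), "total_value": total}

bronze = [
    {"patient_id": "A1", "value": "10"},
    {"patient_id": "A1", "value": "10"},   # duplicate ingest
    {"patient_id": None, "value": "5"},    # malformed row
    {"patient_id": "B2", "value": "2.5"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'row_count': 2, 'total_value': 12.5}
```

In a real Azure Data Factory/Databricks pipeline, each layer would typically be persisted as a Delta table rather than returned in memory.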

Skills

  • Expert-level knowledge of Azure Data Factory, Databricks, and Snowflake.
  • Understanding of quality processes and estimation methods.
  • Understanding of design concepts and architectural basics.
  • Fundamental grasp of the project domain.
  • The ability to translate functional and non-functional needs into system requirements.
  • The ability to develop and code complex applications.
  • The ability to create test cases and scenarios based on specifications.
  • Solid knowledge of SDLC and agile techniques.
  • Knowledge of current technology and trends.
  • Logical thinking and problem-solving abilities, as well as the capacity to collaborate.
  • Primary skills: Cloud Platform, Azure, Databricks, ADF, ADO.
  • Desired skills: SQL, Python, Power BI.
  • General Knowledge: PowerApps, Java/Spark, Reltio.
  • 5-7 years of experience in software development, with a minimum of 3 years in cloud computing.
  • Proficient in SQL, Python, and cloud-native architecture.
  • Strong grasp of data security, privacy compliance, and best practices in a regulated environment.
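The test-driven development and unit-testing expectations listed above can be sketched with Python's built-in `unittest` module. The transformation under test (`normalize_units`) is hypothetical and used only to illustrate the workflow of writing tests alongside the code.

```python
import unittest

def normalize_units(value_mg):
    """Hypothetical transformation under test: convert milligrams to grams."""
    if value_mg < 0:
        raise ValueError("negative measurement")
    return value_mg / 1000.0

class TestNormalizeUnits(unittest.TestCase):
    # In a test-driven workflow, these cases are written before (or alongside)
    # the transformation itself.
    def test_converts_mg_to_g(self):
        self.assertEqual(normalize_units(2500), 2.5)

    def test_rejects_negative_values(self):
        with self.assertRaises(ValueError):
            normalize_units(-1)
```

Running `python -m unittest` against a module like this executes both cases; the same pattern extends to integration tests wired into a CI pipeline.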

Education

  • Bachelor's Degree in a technical discipline (Maths, Science, Engineering, Computing, etc.)

Job ID: 136094853
