
Senior Data Engineer - Engineering
Beghou brings over three decades of experience helping life sciences companies optimize their commercialization through strategic insight, advanced analytics, and technology. From developing go-to-market strategies and building foundational data analytics infrastructures to leveraging artificial intelligence to improve customer insights and engagement, Beghou helps life sciences companies maximize performance across their portfolios. Beghou also deploys proprietary and third-party technology solutions to help companies forecast performance, design territories, manage customer data, organize and report on medical and commercial data, and more. Headquartered in Evanston, Illinois, we have 10 global offices.
Position Summary:
This role builds front-end and back-end infrastructure for the firm's in-house enterprise data platform, and explores, enables, and documents new technologies. The position also works independently to support client teams with a range of data needs, including deployment for new clients, support for existing clients, development of backend datasets for dashboards, and front-end support.
Duties & Responsibilities:
Desired Technical Skills & Experience:
• At least 5 years of experience in data engineering using Python, including pandas or PySpark.
• Extensive experience working with Databricks (including installing packages, understanding and setting cluster configurations, managing jobs, user management, handling permissions, managing Unity Catalog, Databricks APIs and AI tools, and handling configuration issues) and with relational database technologies such as PostgreSQL, Oracle, MySQL, Redshift, and Snowflake.
• Software development fundamentals, including Agile development, version control systems such as Git or Azure DevOps, code reviews, testing, and documentation.
• Experience working with AI tools.
• Experience configuring Azure AD/SAML/Okta/OAuth and applying AWS or Azure security best practices, preferred.
• Experience with WYSIWYG ETL tools (Azure Data Factory, Informatica, SnapLogic, Boomi), preferred.
• Container orchestration systems experience using Docker, Kubernetes, AWS ECS, preferred.
• At least 6 months of experience in web application development using Flask, Django, JavaScript, Ajax, or CSS/HTML, preferred.
• Candidates holding a recognized data engineering certification are preferred (e.g., Databricks Data Engineer, Google Cloud Professional Data Engineer, Azure Data/Fabric Data Engineer, AWS Certified Data Engineer).
• Proficiency using Microsoft Office products, including Excel, PowerPoint, and Word.
• Life Sciences industry experience, preferred.
Desired Soft Skills:
Job ID: 147531715
Skills:
Snowflake, SQL, GitHub, Kafka, Python, Git, Microsoft Azure, LangChain, GitHub Actions, Helm charts, dbt (Data Build Tool), Azure Event Hubs, LangGraph