Job Description
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Microsoft Azure Databricks
Good to have skills : NA
Minimum 3 Year(s) Of Experience Is Required
Educational Qualification : 15 years full time education
Summary:
As a Data Engineer, you will engage in the design, development, and maintenance of data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are robust, scalable, and aligned with business objectives.
Roles and Responsibilities:
The Senior Data Engineer designs and builds data foundations and end-to-end solutions for the Shell Business to maximize value from data. The role helps create data-driven thinking within the organization, not just within IT teams but also in the wider business stakeholder community. A Senior Data Engineer is expected to be a subject matter expert who designs and builds data solutions and mentors junior engineers. They are also key drivers in converting the vision and data strategy into IT solutions and delivering them.
Key Characteristics
A technology expert who constantly pursues knowledge enhancement and has an inherent curiosity to understand work from multiple dimensions.
Deep data focus with expertise in the technology domain.
A skilled communicator capable of speaking to both technical developers and business managers. Respected and trusted by leaders and staff.
Actively delivers the roll-out and embedding of Data Foundation initiatives in support of the key business programmes.
Coordinate the change management, incident management, and problem management processes.
Present reports and findings to key stakeholders and act as the subject matter expert on data analysis and design.
Drive implementation efficiency and effectiveness across the pilots and future projects to minimize cost, increase speed of implementation and maximize value delivery.
Contributes to community building initiatives like CoE, CoP.
Mandatory skills:
AWS/Azure/SAP - Master
ELT - Master
Data Modeling - Master
Data Integration & Ingestion
Data Manipulation and Processing
GitHub, GitHub Actions, Azure DevOps, Data Factory, Databricks, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift
Minimum: 4 year(s) of experience is required
Educational Qualification : 15 years full time education