
Job description
Notice Period: Within 15 days, or immediate joiner.
About the Role:
As a Data Engineer for the Data Science team, you will play a pivotal role in enriching and maintaining the organization's central repository of datasets.
This repository serves as the backbone for advanced data analytics and machine learning applications, enabling actionable insights from financial and market data.
You will work closely with cross-functional teams to design and implement robust ETL pipelines that automate data updates and ensure accessibility across the organization.
This is a critical role requiring technical expertise in building scalable data pipelines, ensuring data quality, and supporting data analytics and reporting infrastructure for business growth.
Note: Should be working with Azure as the cloud technology.
Key Responsibilities:
ETL Development:
- Design, develop, and maintain efficient ETL processes for handling multi-scale datasets.
- Implement and optimize data transformation and validation processes to ensure data accuracy and consistency.
- Collaborate with cross-functional teams to gather data requirements and translate business logic into ETL workflows.
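By way of illustration only, the extract-transform-load workflow described above can be sketched in a few lines of Python; the dataset, column names, and validation rules here are hypothetical, not part of the role's actual stack:

```python
import csv
import io
import sqlite3

# Illustrative raw input: a CSV feed with two malformed records
# (MSFT is missing a price; GOOG has a non-numeric volume).
RAW_CSV = """ticker,price,volume
AAPL,189.50,1200
MSFT,,900
GOOG,141.20,abc
"""

def extract(raw: str) -> list[dict]:
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Validate and normalize: cast types, drop records that fail."""
    clean = []
    for row in rows:
        try:
            clean.append({
                "ticker": row["ticker"],
                "price": float(row["price"]),
                "volume": int(row["volume"]),
            })
        except (ValueError, TypeError):
            continue  # validation failure: drop the record
    return clean

def load(rows: list[dict], conn: sqlite3.Connection) -> int:
    """Insert clean rows into the target table; return its row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS quotes "
        "(ticker TEXT, price REAL, volume INTEGER)"
    )
    conn.executemany(
        "INSERT INTO quotes VALUES (:ticker, :price, :volume)", rows
    )
    return conn.execute("SELECT COUNT(*) FROM quotes").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # only the fully valid record survives validation
```

In production such a pipeline would read from real sources and a real warehouse, but the extract/transform/load separation and the validation step are the pattern the role calls for.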
Data Pipeline Architecture:
- Architect, build, and maintain scalable and high-performance data pipelines to enable seamless data flow.
- Evaluate and implement modern technologies to enhance the efficiency and reliability of data pipelines.
- Build pipelines for extracting data via web scraping to source sector-specific datasets on an ad hoc basis.
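The web-scraping extraction step mentioned above can be sketched with nothing but the Python standard library; the HTML snippet and field names are invented for illustration, and a real pipeline would first fetch pages over HTTP:

```python
from html.parser import HTMLParser

# Hypothetical fetched page fragment: a sector-data table to extract.
PAGE = """
<table id="sector-data">
  <tr><th>Company</th><th>Revenue</th></tr>
  <tr><td>Acme Corp</td><td>1.2B</td></tr>
  <tr><td>Globex</td><td>0.8B</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collect the text of every <td>/<th> cell, grouped by row."""

    def __init__(self):
        super().__init__()
        self.rows = []          # completed rows
        self._row = None        # row currently being built
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(PAGE)
header, *records = scraper.rows
print(header)   # ['Company', 'Revenue']
print(records)  # [['Acme Corp', '1.2B'], ['Globex', '0.8B']]
```

Parsing the scraped table into structured rows like this is what feeds the downstream transformation and loading stages.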
Data Modelling:
- Design and implement data models to support analytics and reporting needs across teams.
- Optimize database structures to enhance performance and scalability.
Data Quality and Governance:
- Develop and implement data quality checks and governance processes to ensure data integrity.
- Collaborate with stakeholders to define and enforce data quality standards across the organization.
Documentation and Communication:
- Maintain detailed documentation of ETL processes, data models, and other key workflows.
- Effectively communicate complex technical concepts to non-technical stakeholders and business teams.
Collaboration:
- Work closely with the Quant team and developers to design and optimize data pipelines.
- Collaborate with external stakeholders to understand business requirements and translate them into technical solutions.
Essential Requirements
Qualifications :
- Familiarity with big data technologies like Hadoop, Spark, and Kafka.
- Experience with data modeling tools and techniques.
- Excellent problem-solving, analytical, and communication skills.
- Proven experience as a Data Engineer with expertise in ETL techniques (minimum years).
- 5-7 years of strong programming experience in languages such as Python, Java, or Scala.
- Hands-on experience in web scraping to extract and transform data from publicly available web sources.
- Proficiency with cloud-based data platforms such as AWS, Azure, or GCP.
- Strong knowledge of SQL and experience with relational and non-relational databases.
- Deep understanding of data warehousing concepts.
- Bachelor's or Master's degree in Computer Science or Data Science.
- Knowledge of data streaming and real-time processing frameworks.
- Familiarity with data governance and security best practices.
Role: Data Engineer
Industry Type: IT Services & Consulting
Employment Type: Full Time, Contract
Role Category: Software Development
Job ID: 142650307