About the Role:
In this role as a Data Engineer, you will:
- Design, develop, optimize, and automate data pipelines that blend and transform data across different sources to help drive business intelligence, data science, and self-service data solutions.
- Work closely with data scientists and data visualization teams to understand data requirements and ensure the availability of high-quality data for analytics, modelling, and reporting.
- Build pipelines that source, transform, and load data that is both structured and unstructured, keeping in mind data security and access controls.
- Explore large volumes of data with curiosity and conviction.
- Contribute to the strategy and architecture of data management systems and solutions.
- Proactively troubleshoot and resolve data-related issues and performance bottlenecks in a timely manner.
- Be open to learning and working with emerging technologies in the data engineering, data science, and cloud computing space.
- Have the curiosity to interrogate data, conduct independent research, utilize various techniques, and tackle ambiguous problems.
- Shift Timings: 12 PM to 9 PM (IST)
- Work from office for 2 days in a week (Mandatory)
About You:
You're a fit for the role of Data Engineer if your background includes:
- 4-6 years of total work experience, including at least 2 years in data engineering or analytics domains.
- A degree in data analytics, data science, computer science, software engineering, or another data-centric discipline.
- Proficiency in SQL is a must.
- Experience with data pipeline and transformation tools such as dbt, Glue, FiveTran, Alteryx or similar solutions.
- Experience using cloud-based data warehouse solutions such as Snowflake, Redshift, or Azure Synapse.
- Experience with orchestration tools like Airflow or Dagster.
- Preferred: experience using Amazon Web Services (S3, Glue, Athena, QuickSight).
- Knowledge of data modelling schemas such as star and snowflake.
- Experience building data pipelines and other custom automated solutions to speed the ingestion, analysis, and visualization of large volumes of data.
- Knowledge of building ETL workflows, database design, and query optimization.
- Experience with a scripting language such as Python.
- Works well within a team and collaborates with colleagues across domains and geographies.
- Excellent oral, written, and visual communication skills.
- Has a demonstrable ability to assimilate new information thoroughly and quickly.
- Strong logical and scientific approach to problem-solving.
- Can articulate complex results in a simple and concise manner to all levels within the organization.