Job Description
We are looking for an experienced, resilient person to join our growing team of analytics experts, someone eager to keep learning by taking on bigger challenges.
What You Will Do
- Work closely with the product and engineering teams to understand domains, features, and metrics
- Design and build scalable data pipelines to handle data from different sources
- Extract data using ETL tools and load it into a data warehouse (AWS Redshift / Google BigQuery)
- Implement batch processing for structured and unstructured data
- Analyze data and create visualizations using tools like Tableau / Metabase / Google Data Studio to support business decisions
- Work on the core data team that designs and maintains the data warehouse
- Troubleshoot and resolve issues in data processing and pipelines
- Anticipate problems and build processes to avoid them
- Learn new technologies quickly
- Set up CI/CD pipelines
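As a rough illustration of the extract / transform / load flow described above, here is a minimal sketch. The CSV source, the `purchases` table, and the use of SQLite as a stand-in for Redshift/BigQuery are all illustrative assumptions, not part of the role or any specific stack we use:

```python
import csv
import io
import sqlite3

# Extract: read raw event rows from a CSV source (an in-memory string
# here, standing in for a file export or API response).
RAW = """user_id,event,amount
1,purchase,19.99
2,purchase,5.00
1,refund,-19.99
"""

def extract(raw: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and keep only purchase events.
def transform(rows: list[dict]) -> list[tuple]:
    return [
        (int(r["user_id"]), r["event"], float(r["amount"]))
        for r in rows
        if r["event"] == "purchase"
    ]

# Load: write into a warehouse table (SQLite stands in for the warehouse).
def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS purchases "
        "(user_id INT, event TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO purchases VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM purchases").fetchone()[0]
print(total)  # total purchase revenue loaded into the warehouse table
```

In production the same three steps would typically be orchestrated and scheduled, with batch processing handling both structured and unstructured sources.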
What Makes You a Great Fit
- Proficiency in database design and writing SQL queries
- Experience with at least one data warehouse solution: AWS Redshift / Google BigQuery / Snowflake
- Familiarity with platforms such as Segment / HevoData / Stitch / Amplitude / Clevertap
- Hands-on experience with Apache Spark / Python / R / Hadoop / Kafka
- Experience working with connectors (REST / SOAP, etc.)
- Experience with BI platforms like Metabase / Power BI / Tableau / Google Data Studio
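To give a sense of the expected SQL level, the query below shows the kind of aggregation that typically feeds a BI dashboard. It runs against SQLite purely for illustration, and the `orders` schema is a made-up assumption:

```python
import sqlite3

# Build a tiny illustrative orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INT, customer_id INT, amount REAL, status TEXT);
INSERT INTO orders VALUES
  (1, 10, 50.0, 'complete'),
  (2, 10, 30.0, 'complete'),
  (3, 11, 20.0, 'cancelled'),
  (4, 11, 70.0, 'complete');
""")

# Revenue per customer over completed orders, highest spenders first.
rows = conn.execute("""
    SELECT customer_id, SUM(amount) AS revenue
    FROM orders
    WHERE status = 'complete'
    GROUP BY customer_id
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [(10, 80.0), (11, 70.0)]
```

Comfort writing filters, joins, grouping, and ordering like this, and reasoning about the schema behind them, is what "proficiency in SQL" means here.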