Description
Why you would like to join us
TransOrg Analytics specializes in Data Science, Data Engineering, and Generative AI, providing advanced analytics solutions to industry leaders and Fortune 500 companies across India, the US, APAC, and the Middle East. We leverage data science to streamline, optimize, and accelerate our clients' businesses.
Position Summary
Build a centralized data system/warehouse for unifying data, improving data integration, and enhancing reporting and business intelligence capabilities.
Education & Experience Requirements
- Bachelor's degree in data science, computer science, or an IT-related field with at least 7 years of experience, or a Master's degree with at least 5 years of experience, in data engineering/analytics.
Key Responsibilities
- Design & implement a centralized data integration process to merge data from multiple disconnected systems.
- Generate data reports & dashboards for leadership, stakeholders, and researchers.
- Train staff on using and managing the data infrastructure.
- Lead the design, development, and maintenance of scalable data pipelines and analytical workflows.
- Work extensively with large-scale tabular datasets to extract insights, optimize performance, and ensure high data quality.
- Write and optimize complex SQL queries and manage large datasets in BigQuery, Databricks, or other modern data warehouses.
- Collaborate with data scientists and business teams to prepare clean, reliable datasets for modeling and decision-making.
- Design efficient ETL/ELT processes to ingest, transform, and integrate structured and semi-structured data.
- Partner with engineering teams to design scalable data infrastructure and ensure system reliability.
- Mentor junior analysts/engineers and promote best practices in analytics engineering.
Key Skills & Knowledge Requirements
The candidate must have expertise in the following areas:
Technical Skills
- Data Engineering: data collection, management, transformation, and storage.
- Business Intelligence tools (Looker, ThoughtSpot, Power BI, Tableau).
- Data Modeling and designing scalable data architectures.
- Strong track record in producing deliverables on time and reporting to stakeholders.
- Data Integration: enabling data sharing and interoperability with other agencies.
- Proven expertise in working with large-scale tabular data for business intelligence, reporting, and advanced analytics.
- Expert-level proficiency in SQL and hands-on experience with BigQuery (mandatory).
- Strong knowledge of data modeling, query optimization, and performance tuning.
- Experience with ETL/ELT frameworks, data pipelines, and orchestration tools (e.g., Airflow).
- Exposure to cloud platforms (GCP/Azure) and their native data services.
- Experience with version control (Git) and CI/CD for data workflows is a plus.
- Strong understanding of data governance, lineage, and metadata management.
Other
- Excellent analytical and problem-solving skills with strong attention to detail.
- Strong written and verbal communication skills in English.
- Ability to collaborate effectively across cross-functional teams.
- Strategic thinker with the ability to work independently under tight deadlines.
- Passion for building robust data foundations that enable advanced analytics and AI solutions.
- Ability to collaborate with stakeholders from different technical backgrounds.
- Experience in documenting business & technical requirements.
- Strong communication skills for training and knowledge sharing.
(ref:hirist.tech)