
Data Architecture and Design
. Design and implement scalable, high-performance data architecture solutions to support analytics, reporting, and data science requirements.
. Create and maintain data models, schemas, and blueprints for cloud platforms.
. Define and enforce data governance, quality, and security standards across all platforms.
Cloud and Data Engineering
. Develop and maintain ETL/ELT pipelines for extracting, transforming, and loading data using tools like Databricks, AWS Glue, and AWS Lambda.
. Leverage AWS services such as S3, Redshift, Athena, DynamoDB, and EMR for data storage and processing.
. Optimize the performance of data pipelines and ensure the availability of data systems.
Analytics and Reporting
. Collaborate with analytics and business intelligence teams to integrate data sources and enable advanced reporting.
. Lead the migration of reporting from the existing Qlik platform to Sigma Enterprise BI, delivering intuitive dashboards and insights.
. Partner with stakeholders to understand business requirements and translate them into data and analytics solutions.
Collaboration and Leadership
. Work closely with cross-functional teams, including Data Scientists, Analysts, and Developers, to ensure seamless data flow and accessibility.
. Mentor junior team members in best practices for data architecture and engineering.
. Stay up-to-date with emerging technologies and advocate for the adoption of innovative solutions.
Skills & Qualifications
. 12+ years of relevant experience.
. Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
. AWS certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect).
. Prior experience in implementing real-time data pipelines and streaming solutions.
. Familiarity with machine learning workflows and tools.
Primary Skill(s): Data Architect
Technical Skills
. Proficiency with AWS Cloud Services (e.g., S3, Redshift, Glue, Lambda, Athena, DynamoDB).
. Hands-on experience with Databricks for data engineering and analytics workloads.
. Expertise in designing and building data pipelines and workflows using tools like Apache Spark and Databricks.
. Strong understanding of relational databases, data lakes, and data warehousing concepts.
. Experience with Qlik, Sigma, or other reporting tools such as Tableau, Power BI, or Looker.
Programming and Frameworks
. Advanced proficiency in Python, SQL, and Scala for data manipulation and processing.
. Experience with infrastructure-as-code tools like Terraform or CloudFormation.
. Knowledge of big data technologies such as Apache Hadoop, Kafka, or Flink is a plus.
Soft Skills
. Excellent problem-solving and analytical skills.
. Strong written and verbal communication skills to effectively present complex concepts.
. Ability to work in a fast-paced environment and manage multiple projects simultaneously.
Perks and Benefits for Irisians
Iris provides world-class benefits for a personalized employee experience. These benefits are designed to support the financial, health, and well-being needs of Irisians for holistic professional and personal growth.
A strategic partner that transformational leaders can trust to realize the full potential of technology-enabled transformation. As a trusted technology partner, we focus our highly experienced talent and right-sized teams on developing complex, mission-critical applications and solutions for leading enterprises across financial services; life sciences, including pharmaceutical companies, CROs, and medical devices; manufacturing and logistics; and educational services.
Job ID: 145411065