While technology is the heart of our business, a global and diverse culture is the heart of our success. We love our people and take pride in fostering a culture built on transparency, diversity, integrity, learning, and growth.
If working in an environment that encourages you to innovate and excel, not just professionally but personally, interests you, you would enjoy your career with Quantiphi!
Designation: Architect - Data
Experience Level: 10+ Years
Location: Mumbai/Bangalore/Trivandrum (Hybrid)
Requirements:
- More than 10 years of experience in Technical, Solutioning, and Analytical roles.
- 5+ years of experience in building and managing Data Lakes, Data Warehouses, Data Integration, Data Migration, and Business Intelligence/Artificial Intelligence solutions on Cloud (GCP/AWS/Azure).
- Ability to understand business requirements, translate them into functional and non-functional areas, and define non-functional boundaries in terms of Availability, Scalability, Performance, Security, Resilience, etc.
- Experience in architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets.
- Experience working in distributed computing and enterprise environments like Hadoop and GCP/AWS/Azure Cloud.
- Well versed with various Data Integration and ETL technologies on Cloud, such as Spark, PySpark/Scala, Dataflow, DataProc, EMR, etc.
- Experience working with traditional ETL tools like Informatica/DataStage/OWB/Talend, etc.
- Deep knowledge of one or more Cloud and On-Premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Exposure to any of the NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc.
- Experience in architecting and designing scalable data warehouse solutions on cloud, on BigQuery or Redshift.
- Experience working with one or more data integration, storage, and data pipeline tool sets like S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, DataProc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
- Preferred: experience working with Machine Learning frameworks like TensorFlow, PyTorch, etc.
- Good understanding of Cloud solutions for IaaS, PaaS, and SaaS, and of Containers and Microservices architecture and design.
- Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud.
- Good understanding of BI Reporting and Dashboarding and one or more tool sets associated with it, like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Understanding of Security features and Policies in one or more Cloud environments like GCP/AWS/Azure.
- Experience in business transformation projects migrating On-Premise data solutions to Clouds like GCP/AWS/Azure.
Responsibilities:
- Lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehouse, and business intelligence.
- Interface with multiple stakeholders within IT and business to understand the data requirements.
- Take complete responsibility for the successful delivery of all allocated projects on the parameters of Schedule, Quality, and Customer Satisfaction.
- Responsible for the design and development of distributed, high-volume, multi-threaded batch, real-time, and event processing systems.
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
- Work with the Pre-Sales team on RFPs and RFIs, helping them create data solutions.
- Mentor young talent within the team; define and track their growth parameters.
- Contribute to building Assets and Accelerators.
Other Skills:
- Strong Communication and Articulation Skills.
- Good Leadership Skills.
- Should be a good team player.
- Good Analytical and Problem-solving Skills.