
This job is no longer accepting applications
Primary Skill: Data Engineer
Secondary Skill: GCP, BigQuery
Experience: 5 to 10 years
Job Description:
We are seeking a skilled Data Engineer with expertise in Google Cloud Platform (GCP), particularly in BigQuery and Pub/Sub, coupled with advanced proficiency in Python. As a Data Engineer, you will play a pivotal role in designing, implementing, and optimizing data pipelines, ensuring the smooth flow of data within our organization.
Responsibilities:
1. Architect and Develop Data Pipelines: Design, develop, and maintain scalable and efficient data pipelines using GCP services such as Pub/Sub for real-time data ingestion and BigQuery for storage and analysis.
2. Data Transformation and Processing: Implement data transformation processes to cleanse, enrich, and aggregate raw data from various sources, ensuring data quality and consistency.
3. Optimize Performance: Fine-tune data pipelines and queries to optimize performance and reduce latency, ensuring timely access to data for stakeholders.
4. Monitoring and Maintenance: Implement monitoring solutions to track pipeline performance and proactively address issues. Perform regular maintenance tasks to ensure the reliability and availability of data infrastructure.
5. Collaboration: Collaborate with cross-functional teams including Data Scientists, Software Engineers, and Business Analysts to understand data requirements and provide technical solutions to address business needs.
6. Documentation: Document data pipelines, processes, and best practices to ensure knowledge sharing and maintain a comprehensive understanding of data architecture.
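The ingestion-and-transformation flow described in responsibilities 1–2 (Pub/Sub in, BigQuery out) could be sketched roughly as below. This is a minimal illustration, not the team's actual pipeline: it assumes the `google-cloud-pubsub` and `google-cloud-bigquery` client libraries, and the project, subscription, and table names are placeholders.

```python
import json
from datetime import datetime, timezone


def transform(raw: bytes) -> dict:
    """Cleanse and enrich one raw Pub/Sub payload: parse JSON,
    drop empty fields, and stamp the ingestion time (UTC)."""
    record = json.loads(raw)
    cleaned = {k: v for k, v in record.items() if v not in (None, "")}
    cleaned["ingested_at"] = datetime.now(timezone.utc).isoformat()
    return cleaned


def run(project_id: str, subscription: str, table_id: str) -> None:
    """Wire a Pub/Sub subscription to BigQuery streaming inserts.
    Requires GCP credentials; names are illustrative placeholders."""
    from google.cloud import bigquery, pubsub_v1

    bq = bigquery.Client(project=project_id)
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(project_id, subscription)

    def callback(message) -> None:
        row = transform(message.data)
        errors = bq.insert_rows_json(table_id, [row])  # streaming insert
        if errors:
            message.nack()  # redeliver on failure
        else:
            message.ack()

    # Blocks until cancelled; a production pipeline would add
    # monitoring and error handling (responsibility 4).
    subscriber.subscribe(sub_path, callback=callback).result()
```

The pure `transform` step is kept separate from the I/O so it can be unit-tested without GCP access.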
Requirements:
1. Proficiency in GCP Services: Extensive hands-on experience with Google Cloud Platform services, particularly BigQuery for data storage and analysis, and Pub/Sub for real-time data streaming.
2. Python Programming: Strong programming skills in Python for data manipulation, scripting, and automation tasks. Experience with libraries such as Pandas, NumPy, and TensorFlow is highly desirable.
3. Data Modeling: Solid understanding of data modeling concepts and experience in designing efficient data models for analytics and reporting purposes.
4. SQL Skills: Proficiency in writing complex SQL queries for data extraction, transformation, and analysis within BigQuery.
5. Experience with Data Warehousing: Familiarity with data warehousing concepts and experience in implementing data warehouse solutions using GCP BigQuery or similar technologies.
6. Problem-solving Skills: Strong analytical and problem-solving skills with the ability to troubleshoot and resolve complex data-related issues.
7. Communication Skills: Excellent communication skills with the ability to effectively collaborate with cross-functional teams and articulate technical concepts to non-technical stakeholders.
8. Bachelor's Degree: Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field. Advanced degree or relevant certifications in Data Engineering or GCP are a plus.
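As an illustration of the "complex SQL within BigQuery" called for in requirement 4, here is a hedged sketch of a helper that builds a common latest-row-per-key deduplication query with a window function; the table and column names are hypothetical, not from the posting.

```python
def latest_per_key_sql(table: str, key: str, order_col: str) -> str:
    """Build a BigQuery query that keeps only the most recent row per key,
    a common dedup pattern for streamed data. `table`, `key`, and
    `order_col` are caller-supplied placeholders."""
    return f"""
    SELECT * EXCEPT(rn)
    FROM (
      SELECT *,
             ROW_NUMBER() OVER (
               PARTITION BY {key} ORDER BY {order_col} DESC) AS rn
      FROM `{table}`
    )
    WHERE rn = 1
    """
```

The resulting string would typically be passed to `google.cloud.bigquery.Client.query()`.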
Altimetrik delivers outcomes for our clients by rapidly enabling digital business and culture, infusing speed and agility into enterprise technology and connected solutions. We are practitioners of end-to-end business and technology transformation. We tap into an organization's technology, people, and assets to fuel fast, meaningful results for global enterprise customers across financial services, payments, retail, automotive, healthcare, manufacturing, and other industries. Founded in 2012, with offices across the globe, Altimetrik makes industries, leaders, and Fortune 500 companies more agile, empowered, and successful.
Altimetrik helps companies get unstuck. We're a technology company that gives organizations the process and context to solve problems in unconventional ways. We're a catalyst for an organization's talent and technology, helping teams push boundaries and challenge traditional approaches. We make delivery bolder, more efficient, more collaborative, and even more enjoyable.
Should be able to work as an individual contributor as well as lead a team. Expertise in the domains listed below:
Strong fundamentals in Computer Science - programming, algorithms, data structures, SQL querying
Basic understanding of Data landscape (ETL, Data processing, Data Storage, Reporting)
Working experience in Cloud would be a plus
Working experience in Spark/Databricks/Scala/Java would be an added advantage
Working experience with tools like NiFi, Kafka, etc.
Data Streaming and handling real time data load would be a big plus
Excellent SQL/Data Analysis Skills
Creating data mapping documents from source to target, and documenting business rules for transformations, is mandatory
Should have worked with traditional ETL tools and processes
Should have experience in RDBMS/NoSQL/Big Data
Should have basic experience in a programming/scripting language (Python/Java/C++)
Basics of Data Modeling and Data Architecture
Triage data issues and perform root-cause analysis (RCA)
Data Testing - unit testing, functional testing, and end-to-end data validation
Expertise in Performance Optimization
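The source-to-target validation and triage/RCA steps above can be sketched with a small reconciliation check. This is a minimal example using only the standard library; the row shape (dicts) and the key name are assumptions for illustration.

```python
from collections import Counter


def reconcile(source_rows, target_rows, key="id"):
    """End-to-end data validation sketch: compare source vs. target
    row sets by key and report missing, unexpected, and duplicated
    keys, a first step when triaging data issues."""
    src = Counter(r[key] for r in source_rows)
    tgt = Counter(r[key] for r in target_rows)
    return {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        "duplicated_in_target": sorted(k for k, n in tgt.items() if n > 1),
    }
```

A non-empty entry in any of the three buckets would kick off root-cause analysis against the mapping document.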
Altimetrik is a digital business enablement company. We deliver bite-size outcomes as organizations scale digitalization to accelerate revenue growth without disrupting ongoing business operations. Our practitioners and agile engineering teams create solutions that drive transformation and achieve business goals.
Job ID: 83841873