
HCLTech is hiring GCP Data Engineer || Bangalore/Noida
HCLTech is a global technology company, home to more than 227,000 people across 60 countries, delivering capabilities in digital, engineering, cloud, and AI. Guided by our philosophy of Supercharging Progress™, we partner with clients worldwide to accelerate their transformation journeys.
Please find below Job Description:
Overall Experience: 5 to 10 years
Notice Period: Immediate / 30 days / 60 days
Mandatory Skills: GCP, BigQuery, Pub/Sub, Dataproc, Airflow, Composer, orchestration
Responsibilities
• Build and optimize ELT/ETL pipelines using BigQuery, GCS, Dataflow, Pub/Sub, and orchestration services (Composer/Airflow)
• Hands-on experience building ETL/ELT pipelines and developing software in Python
• Experience working with data warehouses, data warehouse technical architectures, and reporting/analytics tools
• Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data
• Demonstrated skill and success in delivering technology projects in a professional environment, with a particular focus on data engineering
• Eager to learn and explore new GCP services to enhance skills and contribution to projects
• Demonstrated excellent communication, presentation, and problem-solving skills.
• Prior experience with ETL tools such as dbt, Talend, etc.
Good to have skills
• AI/ML or GenAI background
• IAM, Cloud Logging and Monitoring
• Coaches junior data engineers, bringing them up to speed and helping them build a better understanding of the overall data ecosystem
• Working experience with Agile methodologies and CI/CD tools such as Terraform and Jenkins
• Working on solution decks, IP builds, and client meetings for requirement gathering
Qualifications, Skills and Competencies
Education & Experience: Bachelor's/Master's degree with 5-8 years of experience
Technical/Functional Skills:
• Knowledge of architecture principles, guidelines, and standards
• Data Warehousing
• Programming Language: Python, SQL
• Big Data
• Data Analytics
• Experience building streaming and batch data pipelines with GCP services
• Experience designing and implementing solutions in the following areas: Cloud Storage, BigQuery, Dataflow, Dataproc, Pub/Sub, Data Fusion, Cloud Functions, Composer, etc.
Core Competencies (Soft Skills):
• Strong communication & presentation
• Analytical, client-first mindset
• Leadership & stakeholder management
• Ability to work in a global, fast-paced environment
Interested candidates, kindly share your resume to [Confidential Information] with the details below.
Overall Experience:
Skills:
Current and Expected CTC:
Current and Preferred Location:
Notice Period:
Job ID: 145738509
We do not charge any money for job offers.