
Senior Data Engineer

  • Posted 6 hours ago
  • Be among the first 10 applicants

Job Description

Job Title: Sr. Data Engineer / Jr. Solution Architect

Location: Noida / Pune / Bengaluru / Chennai / Mumbai

Shift: 12 PM to 10 PM

Mode: Hybrid (3 days a week in office)

Notice Period: Immediate to 30 days

Primary Skills: PySpark / Apache Spark, GCP, Dataproc jobs, Apigee / API integration (working knowledge required); experience developing ETL and ELT pipelines.

Key Roles & Responsibilities:

• Design and implement scalable solutions on Google Cloud Platform based on business requirements

• Build and manage data pipelines using Dataproc and PySpark for batch processing

• Develop, optimize, and manage datasets and queries in BigQuery

• Create and manage Dataproc jobs, ensuring performance, reliability, and cost efficiency

• Develop backend and event-driven components using Cloud Functions

• Deploy and manage containerized applications on Google Kubernetes Engine

• Support development of AI/ML and Generative AI solutions using Vertex AI

• Implement monitoring, logging, and alerting using Cloud Monitoring

• Troubleshoot production issues, debug data pipelines, and ensure system stability

• Contribute to solution design documents (HLD/LLD) and architecture discussions

• Participate in client interactions, requirement gathering, and solution walkthroughs

• Collaborate with cross-functional teams including data, DevOps, and application teams

Skill Requirements:

• 4–8 years of experience with at least 2–3 years hands-on in GCP

• Strong programming skills in Python

• Hands-on experience with PySpark / Apache Spark (mandatory)

• Experience working with Dataproc jobs and cluster management

• Strong SQL skills and experience with BigQuery

• Understanding of data pipeline design (batch and basic streaming)

• Experience with serverless and microservices-based architecture

• Hands-on exposure to GKE and container-based deployments

• Basic knowledge of Vertex AI, ML concepts, and Generative AI (LLMs, RAG)

• Familiarity with monitoring, logging, and debugging distributed systems

• Understanding of cloud fundamentals (IAM, networking, security basics)

• Exposure to CI/CD pipelines and Infrastructure as Code (Terraform preferred)

• Good problem-solving and analytical skills

• Ability to communicate technical concepts clearly to stakeholders

• Experience in production support and handling real-world system issues

• Willingness to learn, adapt, and grow into a Solution Architect role


Job ID: 147255751
