
Job Title: Sr. Data Engineer / Jr Solution Architect
Location - Noida/Pune/Bengaluru/Chennai/Mumbai
Shift - 12 PM to 10 PM
Mode - Hybrid - 3 days a week
Notice Period - Immediate to 30 days
Primary Skills: PySpark / Apache Spark, GCP, Dataproc jobs, Apigee / API integration (working knowledge required), and hands-on experience developing ETL and ELT pipelines.
Key Roles & Responsibilities:
• Design and implement scalable solutions on Google Cloud Platform based on business requirements
• Build and manage data pipelines using Dataproc and PySpark for batch processing
• Develop, optimize, and manage datasets and queries in BigQuery
• Create and manage Dataproc jobs, ensuring performance, reliability, and cost efficiency
• Develop backend and event-driven components using Cloud Functions
• Deploy and manage containerized applications on Google Kubernetes Engine
• Support development of AI/ML and Generative AI solutions using Vertex AI
• Implement monitoring, logging, and alerting using Cloud Monitoring
• Troubleshoot production issues, debug data pipelines, and ensure system stability
• Contribute to solution design documents (HLD/LLD) and architecture discussions
• Participate in client interactions, requirement gathering, and solution walkthroughs
• Collaborate with cross-functional teams including data, DevOps, and application teams
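The Dataproc responsibilities above typically begin with submitting a PySpark job to a cluster. Below is a minimal sketch of such a job payload, built as a plain dict in the shape the Dataproc `submit job` API expects; the cluster name, GCS path, and arguments are placeholders, and the `google.cloud.dataproc_v1.JobControllerClient` call in the comment assumes that client library is in use.

```python
# Hypothetical Dataproc PySpark job payload -- all names are placeholders.
def build_pyspark_job(cluster_name: str, main_file_uri: str, args=None) -> dict:
    """Build a Dataproc job dict for a PySpark batch job."""
    return {
        "placement": {"cluster_name": cluster_name},
        "pyspark_job": {
            "main_python_file_uri": main_file_uri,  # e.g. a script stored in GCS
            "args": list(args or []),
        },
    }

job = build_pyspark_job(
    cluster_name="etl-cluster",                             # placeholder cluster
    main_file_uri="gs://example-bucket/jobs/daily_etl.py",  # placeholder path
    args=["--run-date", "2024-01-01"],
)
# With google-cloud-dataproc installed, this dict could be submitted via:
# JobControllerClient(...).submit_job(project_id=..., region=..., job=job)
```

Keeping the payload as a plain dict makes it easy to parameterize per environment (dev/prod clusters, dated input paths) before submission.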
Skill Requirement:
• 4–8 years of experience with at least 2–3 years hands-on in GCP
• Strong programming skills in Python
• Hands-on experience with PySpark / Apache Spark (mandatory)
• Experience working with Dataproc jobs and cluster management
• Strong SQL skills and experience with BigQuery
• Understanding of data pipeline design (batch and basic streaming)
• Experience with serverless and microservices-based architecture
• Hands-on exposure to GKE and container-based deployments
• Basic knowledge of Vertex AI, ML concepts, and Generative AI (LLMs, RAG)
• Familiarity with monitoring, logging, and debugging distributed systems
• Understanding of cloud fundamentals (IAM, networking, security basics)
• Exposure to CI/CD pipelines and Infrastructure as Code (Terraform preferred)
• Good problem-solving and analytical skills
• Ability to communicate technical concepts clearly to stakeholders
• Experience in production support and handling real-world system issues
• Willingness to learn, adapt, and grow into a Solution Architect role
Job ID: 147255751