
Service Desk Specialist (Portuguese Language)

7-9 Years
  • Posted 23 hours ago

Job Description


About Us

Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company's consultative, design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group's 170-year heritage of building sustainable communities.

Technical Lead – Data Engineering (Databricks & Snowflake)

Job Summary

We are seeking a Technical Lead – Data Engineering to architect and deliver scalable, high-performance data solutions on Databricks and Snowflake, define engineering best practices, and mentor the data engineering team through design and delivery.

Key Responsibilities

  • Lead the design and architecture of scalable, high-performance data solutions on Databricks and Snowflake
  • Define and enforce best practices for data engineering, including coding standards, performance optimization, and cost management
  • Oversee the development and maintenance of robust ETL/ELT pipelines, ensuring data quality, reliability, and scalability
  • Collaborate with data architects, product owners, and business stakeholders to translate requirements into technical solutions
  • Implement and ensure adherence to data governance, security, and compliance standards
  • Mentor and guide engineering teams through code reviews, design discussions, and technical problem-solving
  • Monitor system performance, troubleshoot issues, and optimize workloads for efficiency and cost
  • Drive release planning, deployment strategies, and continuous improvement across data platforms
  • Identify and resolve technical blockers, ensuring smooth project delivery

Required Skills & Qualifications

  • 7+ years of experience in data engineering or related roles
  • Strong expertise in Databricks (Spark, Delta Lake) and Snowflake
  • Proficiency in Python, SQL, and distributed data processing frameworks
  • Experience designing and implementing ETL/ELT pipelines at scale
  • Strong understanding of data modeling, data warehousing, and lakehouse architecture
  • Experience with orchestration tools (Airflow, Azure Data Factory, etc.)
  • Knowledge of cloud platforms (Azure, AWS, or GCP)
  • Familiarity with data governance, security, and compliance practices
  • Excellent problem-solving and leadership skills

Preferred Qualifications

  • Experience with CI/CD pipelines and DevOps practices
  • Exposure to real-time data processing and streaming frameworks
  • Knowledge of cost optimization strategies in cloud data platforms
  • Prior experience in leading teams or managing technical delivery


Job ID: 147193719