
Solvios Technology

Data Engineer


Job Description

Company Description

Solvios Technology is a global software development company specializing in creating impactful solutions by merging human-led strategies with AI-driven execution. With operations in India and the USA, Solvios partners with startups, enterprises, and growing brands across multiple regions, including the USA, UK, Canada, Australia, and the Middle East. The company provides services like custom software development, AI-driven data engineering, cloud and DevOps implementation, web and mobile app development, and CRM/ERP integration. Known for its strong in-house engineering teams, clear communication, and cost-effective, high-quality solutions, Solvios emphasizes long-term partnerships and meaningful collaboration between expert insights and intelligent systems to solve real-world challenges.

Role Description

We are seeking a Data Engineer with 3+ years of experience building scalable data solutions on AWS and Azure. The ideal candidate will have strong expertise in data pipelines, ETL processes, and cloud-based data platforms.

Roles and Responsibilities:

  • Design, develop, and maintain scalable data pipelines and ETL processes
  • Build and optimize data architectures on AWS and Azure platforms
  • Work with large datasets using Python, PySpark, and SQL
  • Develop and manage workflows using AWS Lambda, Azure Data Factory, and Databricks
  • Ensure data quality, integrity, and security across systems
  • Collaborate with cross-functional teams, analysts, and stakeholders
  • Optimize performance and cost of data processing systems
  • Troubleshoot and resolve data-related issues
  • Build scalable and high-performance data solutions
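To illustrate the kind of work these responsibilities describe, here is a minimal extract-transform-load sketch in plain Python. It is an illustration only, not part of the role's actual codebase: the record fields (`customer`, `amount`) and the in-memory "warehouse" sink are hypothetical, and in practice the same pattern would be implemented at scale with PySpark, AWS Glue, or Azure Data Factory.

```python
# Minimal ETL sketch in plain Python. The same extract/transform/load
# pattern scales up in PySpark or cloud ETL services; all field names
# here are hypothetical.

def extract(rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(records):
    """Transform: drop malformed rows, normalize types, derive fields."""
    clean = []
    for r in records:
        if r.get("amount") is None:
            continue  # data-quality gate: skip incomplete records
        clean.append({
            "customer": r["customer"].strip().lower(),
            "amount_usd": round(float(r["amount"]), 2),
        })
    return clean

def load(records, sink):
    """Load: aggregate cleaned records into a target store (here, a dict)."""
    for r in records:
        sink.setdefault(r["customer"], 0.0)
        sink[r["customer"]] += r["amount_usd"]
    return sink

raw = [
    {"customer": " Acme ", "amount": "19.99"},
    {"customer": "acme", "amount": "5.01"},
    {"customer": "globex", "amount": None},  # dropped by the quality gate
]
warehouse = load(transform(extract(raw)), {})
print(warehouse)
```

Separating the three stages keeps each one independently testable, which is also how pipelines are typically structured in Databricks notebooks or Glue jobs.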

Requirements:

  • Hands-on experience with AWS services (Lambda, S3, Redshift, etc.)
  • Strong experience in Python, PySpark, and SQL
  • Experience with Azure Data Factory and Databricks
  • Solid understanding of ETL/ELT processes and data warehousing concepts
  • Experience in building scalable and high-performance data solutions
  • Knowledge of data modeling and data pipeline optimization
  • Familiarity with version control tools like Git
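As a taste of the AWS Lambda experience listed above, the sketch below shows a handler that parses an S3 "ObjectCreated" notification event and returns the bucket/key pairs it would hand to a downstream ETL step. It is a hedged example: the bucket name, object key, and return shape are invented for illustration, and no AWS SDK calls are made, so it runs locally as plain Python.

```python
# Hypothetical AWS Lambda handler sketch: parses an S3 event notification
# and extracts the bucket/key of each newly created object. No boto3 calls
# are made, so it can be exercised locally with a fake event.

def lambda_handler(event, context):
    """Entry point Lambda invokes; `event` follows the S3 notification schema."""
    objects = []
    for rec in event.get("Records", []):
        s3 = rec.get("s3", {})
        objects.append({
            "bucket": s3.get("bucket", {}).get("name"),
            "key": s3.get("object", {}).get("key"),
        })
    return {"statusCode": 200, "objects": objects}

# Local invocation with a minimal fake event (context is unused here):
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "raw-data"},
                "object": {"key": "2024/01/orders.csv"}}}
    ]
}
result = lambda_handler(sample_event, None)
print(result)
```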

More Info


Job ID: 147432871
