
Eucloid Data Solutions

Lead Data Engineer

  • Posted 7 hours ago

Job Description

Eucloid is looking for a Lead Data Engineer to join our Data Platform team, which supports a range of business applications. The ideal candidate will help build data infrastructure for our clients, with work spanning upstream and downstream technology selection through the design and build of individual platform components. The role also involves integrating data from diverse sources and managing big data pipelines that remain easily accessible while keeping overall ecosystem performance optimized. We are looking for an experienced data wrangler who will support our software developers, database architects, and data analysts on business initiatives. You must be self-directed and comfortable supporting the data needs of cross-functional teams, systems, and technical solutions.

Responsibilities

  • Design, deploy, configure, and operate a multi-node big data cluster, working with open-source and/or commercial stacks to support the full SDLC.
  • Deploy, manage, and maintain development, test, and production environments for the big data platform.
  • Develop scripts to automate and streamline infrastructure operations and configuration.
  • Specify, design, build, and support BI solutions in close collaboration with the data lake team.
  • Create dashboards and KPIs that surface business performance to management.
  • Design and maintain data models used for reporting and analytics.
  • Identify infrastructure needs and provide support to developers and business users.
  • Investigate performance issues and optimize the platform.
  • Troubleshoot and resolve issues across all operational environments.
  • Work with a cross-functional team delivering software deployments.
  • Think ahead, continuously adopting new ideas and technologies to solve business problems.
  • Own the design and development of automated solutions for recurring reporting and in-depth analysis.
  • Act as a problem solver and critical thinker.
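As an illustration of the reporting-automation work described above, here is a minimal sketch in plain Python (all names and data are hypothetical; pipelines at production scale would typically run on Spark):

```python
from collections import defaultdict

def daily_revenue_kpi(events):
    """Aggregate raw sale events into a per-day revenue KPI.

    `events` is an iterable of dicts with 'date' and 'amount' keys --
    a stand-in for rows read from a data-lake table.
    """
    totals = defaultdict(float)
    for event in events:
        totals[event["date"]] += event["amount"]
    # Sort by date so the output is dashboard-ready.
    return dict(sorted(totals.items()))

sales = [
    {"date": "2024-05-01", "amount": 120.0},
    {"date": "2024-05-02", "amount": 80.0},
    {"date": "2024-05-01", "amount": 30.0},
]
print(daily_revenue_kpi(sales))  # {'2024-05-01': 150.0, '2024-05-02': 80.0}
```

In a real deployment, a job like this would be scheduled, monitored, and fed from the data lake rather than an in-memory list.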

Requirements

  • Strong experience with data lake technologies: Spark, distributed file systems, YARN, and cloud services (preferably GCP or AWS).
  • Strong experience with SQL engines such as Vertica, Dremio, or any other big data SQL product.
  • Scripting knowledge: Shell and Python.
  • Experience applying ETL and OLAP concepts to build highly scalable data pipelines.
  • Exposure to a visualization system (e.g., Apache Superset, Tableau) is a plus.
  • Experience with Agile, data structures, and data analysis and wrangling tools and technologies.
  • Familiarity with version control and relational databases.
  • Strong experience monitoring, debugging, and troubleshooting services.
  • Experience providing on-call support.
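To make the ETL/SQL expectation concrete, here is a self-contained sketch using SQLite as a stand-in for a big data SQL engine such as Vertica or Dremio (table and column names are hypothetical):

```python
import sqlite3

# Extract: source rows, as if pulled from an upstream system.
orders = [("2024-05-01", "widgets", 3), ("2024-05-01", "gadgets", 1),
          ("2024-05-02", "widgets", 2)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_date TEXT, product TEXT, qty INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", orders)

# Transform + Load: roll raw orders up into an OLAP-style daily summary table.
conn.execute("""
    CREATE TABLE daily_summary AS
    SELECT order_date, product, SUM(qty) AS total_qty
    FROM orders
    GROUP BY order_date, product
""")

rows = conn.execute(
    "SELECT order_date, product, total_qty FROM daily_summary"
    " ORDER BY order_date, product"
).fetchall()
print(rows)
```

The same extract, aggregate, and load pattern scales up when expressed in Spark SQL or a warehouse engine; only the connection and execution layers change.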

Eucloid Data Solutions was named Most Promising Data Solutions Provider by Entrepreneur Magazine.

Basic Qualifications

  • Bachelor's or Master's degree in Computer Science or a related field from a reputed institution.
  • 5+ years of professional software experience, most of it at a product company.

Preferred Qualifications

  • Proficiency in one or more technologies such as AWS, EMR, Hadoop, Spark, SQL, Python, and data structures.
  • Experience working in Linux-based environments.
  • Good communication and design skills.

This job was posted by Eucloid Careers from Eucloid Data Solutions.

Job ID: 147474851
