
Data Platform Engineer

Posted 2 hours ago

Job Description

Project Role : Data Platform Engineer

Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Must have skills : Snowflake Data Warehouse

Good to have skills : NA

Minimum 5 year(s) of experience is required

Educational Qualification : 15 years full time education

Summary:

As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture strategy. You will be actively involved in problem-solving and decision-making processes that impact the data platform's effectiveness and efficiency, ensuring that all components work seamlessly together to support organizational goals.


Roles & Responsibilities:

Snowflake Architecture & Engineering:

Lead and implement Snowflake solutions across data ingestion, storage, transformation, and access.

Lead end-to-end implementation of solutions using Snowflake's latest features.

Develop reusable frameworks, UDFs, and components using Snowpark or SQL scripting.

Develop performance-optimized pipelines using Snowflake-native capabilities and modern ELT tools.

Establish data governance, security models, RBAC, and masking policies in line with compliance standards.

Conduct performance tuning, workload monitoring, and optimization.
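Reusable frameworks and UDFs of the kind listed above often start as plain functions. A minimal sketch in Python of a masking-style function that could later be registered as a Snowpark UDF; the masking rule, the `MASK_EMAIL` name, and the `session` object are illustrative assumptions, not part of this posting:

```python
# Sketch of a reusable masking function. The rule (keep the first character
# of an email's local part, mask the rest) is an illustrative assumption.

def mask_email(email: str) -> str:
    """Mask the local part of an email, keeping its first character."""
    if email is None or "@" not in email:
        return email
    local, _, domain = email.partition("@")
    masked = local[0] + "*" * (len(local) - 1) if local else local
    return f"{masked}@{domain}"

# In a Snowflake environment this plain function could be registered as a
# UDF via the snowflake-snowpark-python package (hypothetical session object):
#
#   session.udf.register(mask_email, name="MASK_EMAIL",
#                        input_types=[StringType()],
#                        return_type=StringType())

if __name__ == "__main__":
    print(mask_email("alice@example.com"))  # a****@example.com
```

Keeping the logic in a plain, locally testable function and registering it separately is one way to make such components reusable across pipelines.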

GenAI:

Enable GenAI use cases leveraging Snowflake's native capabilities

Implement Cortex AI features with custom workflows and business logic
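Cortex features are typically invoked through SQL functions such as SNOWFLAKE.CORTEX.COMPLETE. A minimal sketch in Python of assembling such a statement; the model name, prompt, and `session` object are illustrative assumptions, and actual execution would require a Snowflake connection (not shown):

```python
# Build a SQL statement calling the Snowflake Cortex COMPLETE function.
# Single quotes in the prompt are doubled, the standard SQL string-literal
# escaping rule.

def build_cortex_complete_sql(model: str, prompt: str) -> str:
    """Return a SELECT statement invoking SNOWFLAKE.CORTEX.COMPLETE."""
    escaped = prompt.replace("'", "''")
    return f"SELECT SNOWFLAKE.CORTEX.COMPLETE('{model}', '{escaped}') AS response"

sql = build_cortex_complete_sql("mistral-large", "Summarize today's pipeline failures")
# In a real environment the statement would be run through a session, e.g.
#   session.sql(sql).collect()   # hypothetical session object
```

Wrapping the call in a helper like this is one way to layer custom workflows and business logic on top of the native Cortex functions.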

Required Qualifications and Certifications:

8+ years of experience as a Snowflake Lead or Data Engineer.

Good command of Snowflake internals, including performance tuning, clustering, and caching.

Good knowledge of SQL, Snowflake scripting, and at least one cloud platform (AWS, Azure, or GCP).

Knowledge of advanced features such as dynamic tables, Snowpipe Streaming, and Snowflake Cortex AI.

Good communication skills with the ability to bridge technical and business discussions.

Certifications: SnowPro Core Certified.

Education Qualification:

15 years of full-time education is required.

Job ID: 147460363
