
Job Description

A Data Scientist specializing in Python and Structured Data Machine Learning analyzes structured datasets and develops predictive models using Python-based ML frameworks. The role is responsible for extracting insights from structured data, building machine learning models, and optimizing data-driven decision-making.

Key Responsibilities 

  • Develop ML models for structured data analysis. 
  • Implement data preprocessing pipelines using Python (Pandas, NumPy). 
  • Optimize feature engineering for structured datasets. 
  • Work with SQL databases and data warehouses (Snowflake, BigQuery).
  • Train and evaluate models using Scikit-learn, TensorFlow, or PyTorch.
  • Deploy ML models using MLOps frameworks (MLflow, Kubeflow). 
  • Collaborate with data engineers and analysts to ensure data quality. 
  • Perform hyperparameter tuning and model optimization. 
  • Ensure data security, governance, and compliance. 
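To illustrate the kind of work the responsibilities above describe, here is a minimal, hypothetical sketch of a structured-data workflow with the stack the posting names (Pandas, scikit-learn): a preprocessing pipeline with imputation and encoding, followed by model training and hyperparameter tuning. The dataset and column names are invented for the demo, not part of the role.

```python
# Hypothetical sketch: preprocessing pipeline + model training + tuning
# on structured data, using Pandas and scikit-learn as named in the posting.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

def build_pipeline(numeric_cols, categorical_cols):
    """Impute/scale numeric columns, one-hot encode categoricals, then classify."""
    preprocess = ColumnTransformer([
        ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                          ("scale", StandardScaler())]), numeric_cols),
        ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_cols),
    ])
    return Pipeline([("prep", preprocess),
                     ("model", LogisticRegression(max_iter=1000))])

# Tiny synthetic table standing in for a real structured dataset.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "age": rng.integers(20, 65, n).astype(float),
    "income": rng.normal(50_000, 10_000, n),
    "segment": rng.choice(["a", "b", "c"], n),
})
df.loc[:4, "age"] = np.nan               # missing values the imputer handles
y = (df["income"] > 50_000).astype(int)  # deterministic target for the demo

X_train, X_test, y_train, y_test = train_test_split(df, y, random_state=0)

# Hyperparameter tuning over the regularization strength.
search = GridSearchCV(build_pipeline(["age", "income"], ["segment"]),
                      {"model__C": [0.1, 1.0, 10.0]}, cv=3)
search.fit(X_train, y_train)
print(round(search.score(X_test, y_test), 2))
```

In a production setting the same pipeline object would typically be logged and deployed through an MLOps framework such as MLflow, as the deployment bullet suggests.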

 

Required Skills

  • Python programming for ML and data manipulation. 
  • SQL proficiency for structured data querying. 
  • Experience with ML frameworks (Scikit-learn, TensorFlow, PyTorch). 
  • Knowledge of ETL processes for structured data.
  • Understanding of data warehousing concepts. 
  • Familiarity with cloud platforms (AWS, Azure, GCP).

Preferred Qualifications
  • Experience with data visualization tools (Tableau, Power BI). 
  • Knowledge of data governance frameworks (GDPR, HIPAA). 
  • Familiarity with automated ML workflows.

The role also covers cloud-based data solutions, leveraging Snowflake for data warehousing and Python for scripting, automation, and data processing. A Snowflake and Python Developer is responsible for designing, developing, and optimizing data solutions using Snowflake's cloud data platform and Python-based ETL processes.

 

Key Responsibilities 

  • Develop ETL/ELT pipelines using Python and Snowflake.
  • Design and implement data models and schemas in Snowflake.
  • Write and optimize SQL queries for data transformation and reporting. 
  • Integrate data from various sources into Snowflake. 
  • Implement Snowflake Tasks and Streams for real-time data processing. 
  • Ensure data security, governance, and compliance. 
  • Collaborate with data engineers and analysts to build scalable solutions. 
  • Perform performance tuning and query optimization in Snowflake. 
  • Automate workflows using Python scripts.

Required Skills
  • Strong SQL proficiency for Snowflake. 
  • Python programming for data manipulation and automation. 
  • Experience with ETL tools (e.g., Apache NiFi, Talend, Fivetran). 
  • Knowledge of cloud platforms (AWS, Azure, GCP). 
  • Understanding of data warehousing concepts. 
  • Familiarity with Snowflake features like Snowpipe, Streams, and Tasks.

Preferred Qualifications
  • Experience with data visualization tools (Tableau, Power BI). 
  • Knowledge of data governance frameworks (GDPR, HIPAA). 
  • Familiarity with machine learning workflows in Snowflake.
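The Streams-and-Tasks responsibility mentioned above can be sketched from Python. The following is a hypothetical helper that emits the Snowflake DDL wiring a Stream on a source table to a scheduled Task that moves changes into a target table; the table and warehouse names (raw_orders, orders_clean, etl_wh) are illustrative, not from the posting.

```python
# Hypothetical sketch: generate the Snowflake Stream + Task DDL for a
# change-capture pipeline. Object names are placeholders for the demo.
def change_capture_ddl(source_table: str, target_table: str,
                       warehouse: str, schedule: str = "5 MINUTE") -> list[str]:
    """Return statements that create a Stream on `source_table` and a
    Task that periodically inserts the captured changes into `target_table`."""
    stream = f"{source_table}_stream"
    return [
        # Stream records inserts/updates/deletes on the source table.
        f"CREATE OR REPLACE STREAM {stream} ON TABLE {source_table};",
        # Task runs on a schedule, but only when the stream has new rows.
        (f"CREATE OR REPLACE TASK {target_table}_task "
         f"WAREHOUSE = {warehouse} SCHEDULE = '{schedule}' "
         f"WHEN SYSTEM$STREAM_HAS_DATA('{stream.upper()}') "
         f"AS INSERT INTO {target_table} SELECT * FROM {stream};"),
        # Tasks are created suspended; RESUME activates the schedule.
        f"ALTER TASK {target_table}_task RESUME;",
    ]

for stmt in change_capture_ddl("raw_orders", "orders_clean", "etl_wh"):
    print(stmt)
```

In practice the generated statements would be executed through the Snowflake Python connector or an orchestration tool, with the merge logic tailored to the actual schema.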

More Info

Open to candidates from: India
Job ID: 145526937
