Job Post - Data Engineer
Experience: 5+ years
Timing: 7:00 P.M. to 4:00 A.M. or 8:00 P.M. to 5:00 A.M.
Location: Remote
Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines using industry-standard tools and technologies.
- Design, implement, and optimize complex SQL queries to extract, transform, and load (ETL) data from various sources while ensuring data quality and reliability.
- Develop and maintain robust data models, including star and snowflake schemas, to support analytics and reporting needs.
- Call third-party APIs using Python to fetch data and perform ETL on the retrieved data.
- Monitor and improve the performance of existing data pipelines and workflows, addressing any issues related to data quality or efficiency.
- Design and maintain SQL Server databases, ensuring high availability and performance.
- Contribute to the overall system design and architecture, ensuring scalability and maintainability.
- Stay current with emerging trends and technologies in data engineering and propose new solutions to enhance the data infrastructure.
Requirements:
- Advanced SQL skills, with experience in writing optimized and complex queries for data extraction and transformation.
- Strong experience with AWS or similar cloud services (e.g., S3, Lambda, Glue, Event Hubs, EC2, IAM).
- Familiarity with Hadoop, Spark, or similar big data processing frameworks.
- Proficiency in Python or other programming languages for data manipulation and automation.
- Proficiency with ETL tools (e.g., Glue, Apache Airflow, Talend, Informatica) or custom ETL pipeline development.
- Experience calling third-party APIs with Postman, Swagger, or similar tools, and implementing those calls in Python.
- Experience with data warehousing solutions such as Snowflake, Amazon Redshift, or Google BigQuery.
- Expertise in SQL Server databases, including design, optimization, and management.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work collaboratively in a remote team environment.
Good to have:
- Knowledge of NumPy, Pandas, and similar Python libraries
Keyword suggestions:
- Data Engineering, SQL, ETL, AWS, Glue, Snowflake, Data modeling, API calling