As a Lead Data Engineer, you will be responsible for designing, developing, and maintaining our organization's data architecture and infrastructure. You will play a key role in building a data platform that efficiently processes large volumes of data, enabling the business to make informed decisions based on reliable, high-quality information. The ideal candidate has a strong background in data engineering, excellent leadership skills, and a proven track record of successfully delivering complex data projects.
Responsibilities
- Data Architecture & Design: Architect and implement scalable, efficient data systems that meet the organization's data processing needs. Collaborate with cross-functional teams to ensure data solutions align with business objectives.
- ETL Development: Oversee the development of robust ETL (Extract, Transform, Load) processes to move data from various sources into our data warehouse. Implement best practices for data cleansing and validation to ensure data quality and integrity.
- Big Data Technology: Stay updated on emerging trends in big data and analytics. Implement and optimize big data technologies like Hadoop, Spark, and Kafka to efficiently process and analyze large datasets.
- Cloud Integration: Work with the IT infrastructure team to integrate data engineering solutions with cloud platforms (AWS, Azure, or Google Cloud), focusing on scalability, security, and performance.
- Performance Monitoring & Optimization: Implement monitoring tools to track data pipeline performance and proactively address any issues.
- Documentation: Maintain comprehensive documentation for all data engineering processes, data models, and system architecture.
- Collaboration & Communication: Work closely with data scientists, analysts, and other stakeholders to understand their data needs and deliver effective solutions. Communicate project status and challenges clearly to both technical and non-technical audiences.
Qualifications
- Educational Background: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Experience: 6-8 years of professional experience in data engineering.
- Technical Skills:
  - In-depth knowledge of data modeling, ETL processes, and data warehousing.
  - Strong experience building data warehouses using Snowflake.
  - Hands-on experience with data ingestion, data lakes, data mesh, and data governance.
  - Proficiency in Python programming.
  - Strong understanding of big data technologies such as Hadoop, Spark, and Kafka.
  - Experience with cloud platforms like AWS, Azure, or Google Cloud.
  - Familiarity with various database systems (SQL, NoSQL) and data pipeline orchestration tools.
- Soft Skills: Excellent problem-solving and analytical skills, strong communication and interpersonal abilities, and a proven ability to work collaboratively in a fast-paced, dynamic environment.