Role Description:
The data engineering role involves creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows and pipelines, and constructing data stores (SQL, NoSQL), big data tooling (Hadoop, Kafka), and integration tools to connect sources and other databases. Candidates should hold a minimum of 5 years of experience with DBT and Snowflake.
Role Responsibility:
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into code
- Develop efficient code with unit testing and code documentation
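For illustration, in a DBT project a unit-level data check is often written as a singular test: a SQL query that should return zero rows. A minimal sketch, assuming a hypothetical dim_customer model:

    -- tests/assert_customer_id_is_populated.sql (hypothetical file name)
    -- DBT treats this as a singular test: it passes when the query
    -- returns zero rows and fails otherwise.
    select customer_id
    from {{ ref('dim_customer') }}   -- dim_customer is an assumed model
    where customer_id is null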
Role Requirement:
- Proficient in basic and advanced SQL programming concepts (stored procedures, analytical functions, etc.; see the sketch after this list)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell / PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering, documentation processes, and unit testing
- Understanding and implementing QA and various testing processes in the project
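To illustrate the analytical-function and change-data-capture items above, a minimal Snowflake SQL sketch (the stg_customers table and its columns are hypothetical) that keeps only the rows where a customer's segment actually changed:

    -- Keep only rows where customer_segment differs from the previous
    -- version of the same customer; this kind of change detection feeds
    -- slowly changing dimension (SCD Type 2) loads.
    select
        customer_id,
        customer_segment,
        load_ts
    from stg_customers
    qualify customer_segment is distinct from
            lag(customer_segment) over (
                partition by customer_id
                order by load_ts
            );
    -- Note: the first version of each customer is also kept, because
    -- LAG() returns NULL there and NULL is distinct from any value.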
Additional Requirement:
- Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Snowflake, ensuring effective transformation and loading of data from diverse sources into the data warehouse or data lake.
- Implement and manage data models in DBT, guaranteeing accurate data transformation and alignment with business needs.
- Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting.
- Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.
- Establish DBT best practices to improve performance, scalability, and reliability.
- Expertise in SQL and a strong understanding of data warehouse concepts and modern data architectures.
- Familiarity with cloud-based platforms (e.g., AWS, Azure, GCP).
- Migrate legacy transformation code into modular DBT data models (a minimal model sketch follows this list).
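As a sketch of what "modular" can mean here (model, source, and column names are hypothetical, not a prescribed implementation), a legacy transformation step might become a small incremental DBT model on Snowflake:

    -- models/marts/dim_customer.sql (hypothetical model)
    -- Incremental materialization: on repeat runs, only rows newer than
    -- what is already in the target table are processed.
    {{ config(materialized='incremental', unique_key='customer_id') }}

    with staged as (
        -- ref() wires this model into the DBT dependency graph;
        -- stg_customers is an assumed upstream staging model
        select * from {{ ref('stg_customers') }}
    )

    select
        customer_id,
        customer_segment,
        load_ts
    from staged
    {% if is_incremental() %}
      -- only applied on incremental runs; {{ this }} is the existing table
      where load_ts > (select max(load_ts) from {{ this }})
    {% endif %}

Splitting staging and mart logic into small models like this keeps each transformation independently testable and documentable.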