Job Overview:
- Python proficiency with PySpark, pandas, and core Python objects
- Knowledge of Google Cloud Platform
- Google Cloud Professional Data Engineer certification
- Google Cloud services: Cloud Storage, Dataproc, BigQuery
- Strong SQL, including advanced SQL
- Spark: hands-on PySpark development skills
- Data warehousing concepts and dimensional modeling
- Git
- Team leadership experience
- Apache Spark
- Scheduling tools/ETL tools
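The "advanced SQL" skill above typically covers analytic constructs such as window functions. A minimal illustrative sketch, using Python's built-in sqlite3 in place of BigQuery, with a hypothetical sales table:

```python
import sqlite3

# In-memory database with a hypothetical fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
INSERT INTO sales VALUES
  ('EMEA', 'A', 100), ('EMEA', 'B', 250),
  ('APAC', 'A', 300), ('APAC', 'B', 150);
""")

# Window function: rank products by revenue within each region.
query = """
SELECT region, product, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM sales
ORDER BY region, rnk;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The same `RANK() OVER (PARTITION BY ...)` pattern carries over directly to BigQuery standard SQL.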
Roles & Responsibilities:
- Provide technical solutions
- Team management: assign tasks to team members and help/guide them on technical queries
- Excellent troubleshooting, attention to detail, and communication skills in a fast-paced setting
- Understand technical requirements and coordinate with business stakeholders
- Design and build the enterprise data warehouse and data marts, and deploy them in the cloud (GCP)
- Perform descriptive analytics and reporting
- Perform peer reviews of code, design documents, and test cases
- Support systems currently live and deployed for customers
- Build the knowledge repository and cloud capabilities
- Write clean, well-structured Python code
- Understand Agile practices, including user story creation