Senior Data Engineer - SAP BO
Location: Bangalore, India
The Data Engineer is part of a team responsible for supporting the Enterprise Data Warehouse (EDW), custom reporting, and the on-demand data needs of the organization. Under the direction of the Manager of EDW Systems, you will participate in planning, designing, developing, installing, testing, and supporting complete data management solutions that address ongoing business reporting needs and new opportunities.
Responsible for maintaining and enhancing our existing SAP BusinessObjects platform (Web Intelligence, Crystal Reports).
Responsible for maintaining and enhancing our existing data and analytics systems.
Building and supporting the data pipelines required for optimal extraction, transformation, and loading of data from various sources using cloud and SQL technologies.
Triaging support incidents, investigating and identifying root causes, and implementing fixes.
Analyzing business challenges and translating them into solutions.
Working with stakeholders, including cross-functional teams and business users, and assisting them with data-related technical issues.
Identifying, designing, and implementing internal process improvements, including redesigning applications for greater scalability, optimizing data delivery, and automating manual processes.
Communicating effectively across multiple departments and with stakeholders to review business requirements and propose solutions.
Performing other duties as assigned.
Due to the nature of this position, the applicant must be able to work from home or during off-hours as necessary. Candidates must have:
Knowledge of data warehouse (DWH) and BI environments and infrastructure, with 3 to 4 years of proven work experience
Strong experience in SAP BO and SAP BW; Oracle experience is an advantage
Good working knowledge of BO WebI and Dashboard Designer
Working experience with SAP BusinessObjects Design Studio
Knowledge in dimensional modeling, database structures and query optimization.
Experience with GCP big data products, including BigQuery, is a plus
Knowledge of different storage types (filesystem, relational, NoSQL) and experience working with various kinds of data (structured, unstructured, metrics, log files, etc.)
Experience with data pipelines (ETL, ELT) and data wrangling procedures using Python and SQL.
Experience with batch and stream processing (including GCP Dataflow/Kafka Streams) is a plus
Basic knowledge of Python and/or Java
Ability to learn new technologies quickly
Strong analytical and problem-solving skills
Strong verbal and written communication skills (English)
Understanding of ITIL principles
Bachelor's degree or equivalent experience in Computer Science, Information Systems or related disciplines
The ability to demonstrate work experience by providing report examples and source code.
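As an illustration of the kind of ETL and data-wrangling work with Python and SQL described above, here is a minimal, self-contained sketch. It uses the standard-library sqlite3 module in place of a production warehouse (in this role that would typically be BigQuery or Oracle), and the table and column names are hypothetical:

```python
# Minimal ETL sketch: extract rows from a staging table, transform them,
# and load the result into a target table. sqlite3 keeps the example
# self-contained; a real pipeline would target a warehouse instead.
import sqlite3


def run_etl(conn: sqlite3.Connection) -> int:
    """Copy cleaned rows from raw_orders into orders_clean.

    Returns the number of rows loaded.
    """
    cur = conn.cursor()
    # Extract: pull raw rows from the (hypothetical) staging table.
    rows = cur.execute("SELECT id, amount, currency FROM raw_orders").fetchall()
    # Transform: normalize currency codes and drop non-positive amounts.
    cleaned = [
        (rid, amount, currency.upper())
        for rid, amount, currency in rows
        if amount is not None and amount > 0
    ]
    # Load: insert into the target table in a single transaction.
    cur.executemany(
        "INSERT INTO orders_clean (id, amount, currency) VALUES (?, ?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE raw_orders (id INTEGER, amount REAL, currency TEXT);
        CREATE TABLE orders_clean (id INTEGER, amount REAL, currency TEXT);
        INSERT INTO raw_orders VALUES
            (1, 10.5, 'usd'), (2, -3.0, 'eur'), (3, 7.0, 'inr');
        """
    )
    print(run_etl(conn))
```

Candidate-provided samples along these lines (plus example reports) are exactly the kind of evidence the last requirement asks for.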