
We are looking for a Mid-level Data Engineer to join a small, focused team building and maintaining a production-grade financial data processing platform for the Abu Dhabi Securities Exchange (ADX).
The platform ingests daily stock trading data from ADX, processes it through a series of ETL pipelines, tracks shareholder balances and positions, and powers downstream analytics via PostgreSQL and Elasticsearch. The codebase is Python-first, well-structured, and held to high quality standards (strict typing, clean architecture, comprehensive tests).
You will own the data and application layer: designing pipelines, evolving the schema, writing business logic services, and keeping the system accurate and reliable. The CTO handles all infrastructure and deployment concerns. We are looking for someone extremely curious who likes to connect the dots with data and understand sentiment. The ideal candidate is a good storyteller with data and statistics, and brings problem-solving skills, adaptability in fast-paced environments, and familiarity with financial and trading platform databases.
Job Responsibilities:
Build and maintain ETL pipelines that ingest daily Excel/CSV trade files, transform and validate the data, and persist it to PostgreSQL
Design and evolve database schemas using SQLAlchemy ORM and Alembic migrations
Implement business logic services: shareholder balance tracking, sharebook reconciliation, movement detection, classification rules
Maintain and extend Elasticsearch sync services (incremental and full resync workflows)
Support business stakeholders with data questions using the ELK stack: writing and running queries, interpreting results, and surfacing relevant data from Kibana
Parse and process structured financial data files (Excel, CSV) using Pandas, including data validation and deduplication
Write robust, type-safe Python following project standards (mypy strict mode, Black, isort)
Write and maintain automated tests with pytest: unit tests, repository mocks, snapshot tests
Collaborate with the CTO on data flow design and review migration strategies before deployment
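To give a flavour of the pipeline work above, here is a minimal, stdlib-only sketch of the validate-and-deduplicate stage of a trade-file ingest (the production pipeline uses Pandas; the column names here are hypothetical, not the real ADX feed layout):

```python
import csv
import io

# Hypothetical trade-file columns; the real ADX feed will differ.
REQUIRED = ("trade_date", "symbol", "quantity", "price")

def clean_trades(raw_csv: str) -> list[dict[str, str]]:
    """Validate required fields and drop exact duplicate rows."""
    seen: set[tuple[str, ...]] = set()
    out: list[dict[str, str]] = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        # Validation: skip rows missing any required field.
        if any(not row.get(col) for col in REQUIRED):
            continue
        # Deduplication: keep only the first occurrence of each row.
        key = tuple(row[col] for col in REQUIRED)
        if key in seen:
            continue
        seen.add(key)
        out.append(row)
    return out

sample = (
    "trade_date,symbol,quantity,price\n"
    "2024-01-02,ADNOCDIST,100,3.50\n"
    "2024-01-02,ADNOCDIST,100,3.50\n"  # exact duplicate row
    "2024-01-02,FAB,,14.20\n"          # missing quantity
)
print(len(clean_trades(sample)))  # → 1
```

In the real system this step would sit between file parsing and the SQLAlchemy persistence layer, so that only validated, unique rows reach PostgreSQL.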
Requirements:
3-5 years of professional software/data engineering experience
At least 2 years working with Python in a production environment
Python 3.8+: 3+ years of production experience; comfortable with type hints, async/await, and clean OOP design
PostgreSQL: schema design, indexing strategies, query optimization, migrations
SQLAlchemy 2.0: declarative ORM models, session management, Alembic for migrations
Pandas: data transformation, Excel/CSV parsing, validation pipelines
FastAPI: building and maintaining REST APIs with dependency injection and async handlers
Pytest: writing unit and integration tests, using fixtures and mocking patterns
ELK Stack (Elasticsearch, Kibana): working knowledge of Elasticsearch indexing and querying; able to write queries and navigate Kibana to answer business questions (dashboard building not required; Logstash not used)
Code quality discipline: strong typing, Black formatting, readable and maintainable code
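As a taste of the Elasticsearch side of the role, the kind of business question you would answer ("which symbols traded the most value on a given day?") maps to a query body like this sketch. The index mapping and field names (trade_date, symbol.keyword, value) are hypothetical, chosen for illustration:

```python
import json

# Hypothetical field names; the real index mappings will differ.
def top_symbols_query(trade_date: str, size: int = 10) -> dict:
    """Build an Elasticsearch query body that filters one trading
    day, then aggregates traded value by symbol."""
    return {
        "size": 0,  # aggregation only; individual hits not needed
        "query": {
            "bool": {
                "filter": [{"term": {"trade_date": trade_date}}]
            }
        },
        "aggs": {
            "by_symbol": {
                "terms": {"field": "symbol.keyword", "size": size},
                "aggs": {"traded_value": {"sum": {"field": "value"}}},
            }
        },
    }

print(json.dumps(top_symbols_query("2024-01-02"), indent=2))
```

The same body works whether it is sent via the official Python client or pasted into Kibana's Dev Tools console.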
Nice to Have:
Web scraping: familiarity with Playwright or similar browser automation tools
Financial data domain: understanding of stock trading concepts (OHLCV data, shareholder registers, portfolio performance metrics such as XIRR or TWR)
Financial APIs: experience with finance or similar market data sources
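For candidates unfamiliar with XIRR, one of the metrics mentioned above: it is the annualized rate at which the net present value of a set of dated cashflows is zero. A stdlib-only sketch using bisection (not a production implementation) looks like:

```python
from datetime import date

def xirr(cashflows: list[tuple[date, float]],
         lo: float = -0.99, hi: float = 10.0) -> float:
    """Solve for the annualized rate r at which the NPV of dated
    cashflows is zero, by bisection on the NPV function."""
    t0 = min(d for d, _ in cashflows)

    def npv(rate: float) -> float:
        # Discount each cashflow by its year fraction from t0.
        return sum(cf / (1.0 + rate) ** ((d - t0).days / 365.0)
                   for d, cf in cashflows)

    # For an initial outflow followed by inflows, NPV falls as the
    # rate rises, so bisection brackets the root.
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Invest 1000, receive 1100 one year later: rate is 10%.
flows = [(date(2023, 1, 1), -1000.0), (date(2024, 1, 1), 1100.0)]
print(round(xirr(flows), 4))  # → 0.1
```

A production version would use a library solver and handle edge cases (multiple sign changes, no root in the bracket), but the sketch captures the idea the role touches on.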
Job ID: 144569273