About Us
WHY LINEDATA
Join us in shaping the fintech of tomorrow.
With more than 1,350 colleagues across 20 locations worldwide, you'll help build cutting-edge platforms and tailored services that power the daily operations of 700 leading financial institutions. As a global, multicultural company with over 45 nationalities represented and just as many languages spoken, we thrive on diverse perspectives and collaboration. Together, we tackle future-focused topics like AI and the digitalization of finance, delivering smart solutions for our customers.
Whether you're just beginning your career or are an experienced professional, Linedata offers exposure to large-scale client projects, internal mobility and opportunities to develop your skills alongside leading experts.
Job Summary
We are looking for a skilled Python developer with hands-on experience in data manipulation and analysis using Pandas. The ideal candidate has strong problem-solving skills and experience working with large datasets in real-world applications.
Key Responsibilities
- Write efficient, scalable Python code for data processing and transformation.
- Perform complex data manipulation using Pandas, NumPy, and other scientific libraries.
- Clean, transform, and analyze large volumes of structured and unstructured data.
- Develop scripts and tools to automate workflows.
- Collaborate with data engineers and analysts to optimize data pipelines.
- Participate in code reviews and follow best practices in version control (Git).
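To give a flavor of the day-to-day work described above, here is a minimal sketch of a typical Pandas cleaning and transformation step. The dataset, column names, and rules are purely illustrative, not part of the role definition.

```python
import pandas as pd

# Hypothetical trade data; column names and values are illustrative only.
raw = pd.DataFrame({
    "trade_id": [1, 2, 2, 3, 4],
    "symbol": [" AAPL", "msft ", "msft ", "GOOG", None],
    "price": ["101.5", "250.0", "250.0", "n/a", "99.9"],
})

# Typical cleaning steps: drop duplicates, normalize strings, coerce types,
# and discard rows that cannot be repaired.
clean = (
    raw.drop_duplicates(subset="trade_id")
       .assign(
           symbol=lambda d: d["symbol"].str.strip().str.upper(),
           price=lambda d: pd.to_numeric(d["price"], errors="coerce"),
       )
       .dropna(subset=["symbol", "price"])
       .reset_index(drop=True)
)
```

Chaining transformations like this keeps each step reviewable, which matters in the code reviews the role involves.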
Required Skills
- Strong proficiency in Python, especially with Pandas and NumPy.
- Experience working with data files in formats like CSV, Excel, JSON, and SQL.
- Familiarity with Jupyter Notebooks, API integrations, and data visualization tools (e.g., Matplotlib, Seaborn).
- Experience with SQL queries and relational databases.
- Good understanding of software development lifecycle and version control (Git).
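As a rough illustration of the multi-format skills listed above, the sketch below reads CSV and JSON sources, joins them, and round-trips the result through a relational database. The in-memory data, schema, and SQLite backend are assumptions made for a self-contained example.

```python
import io
import sqlite3
import pandas as pd

# Tiny in-memory stand-ins for the CSV/JSON sources; paths and schemas
# are hypothetical.
csv_data = io.StringIO("id,amount\n1,10.0\n2,20.5\n")
json_data = io.StringIO('[{"id": 1, "region": "EU"}, {"id": 2, "region": "US"}]')

orders = pd.read_csv(csv_data)     # the same call works on a real .csv path
regions = pd.read_json(json_data)  # likewise for a .json file

# Joining frames from different sources is everyday Pandas work.
merged = orders.merge(regions, on="id")

# Round-trip through a relational database using the standard sqlite3 driver.
conn = sqlite3.connect(":memory:")
merged.to_sql("orders", conn, index=False)
total = pd.read_sql_query("SELECT SUM(amount) AS total FROM orders", conn)
conn.close()
```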
Good To Have
- Exposure to machine learning libraries (e.g., scikit-learn).
- Experience with cloud platforms (AWS/Azure/GCP) or data lake frameworks.
- Familiarity with Airflow, Spark, or other data orchestration tools.