
Senior Software Developer (Quantitative Solutions)
Summary
The Senior Software Developer will be responsible for developing the next generation of quantitative solutions using a modern cloud-native technology stack built on Python and AWS. This is a rare opportunity to make a significant impact on both the team and the organization by taking part in the initial design and development of a new customer-facing application framework that will serve as the foundation for all future development. The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates who value collaboration with colleagues and having an immediate, tangible impact at a leading global independent financial insights and data company. The team uses a contemporary stack in the AWS cloud to design, build, and maintain robust data delivery pipelines via APIs and feeds.
Key Responsibilities
Model Development: Lead the design and development of quantitative data engineering models, including algorithms, data pipelines, and data processing systems, to support business requirements.
Data Processing: Develop and maintain data processing pipelines to ingest, clean, transform, and aggregate large volumes of data from various sources, ensuring data quality and reliability.
Algorithm Development: Design and implement algorithms for data analysis, machine learning, and statistical modeling, using techniques such as regression analysis, clustering, and predictive modeling.
Performance Optimization: Identify and implement optimizations to improve the performance and efficiency of data processing and modeling algorithms, considering factors such as scalability and resource utilization.
Data Visualization: Create visualizations of data and model outputs to communicate insights and findings to stakeholders.
Data Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of data used in models and analyses.
Model Evaluation: Evaluate the performance of data engineering models using metrics and validation techniques, and iterate on models to improve their accuracy and effectiveness.
Collaboration: Work with data scientists, analysts, and business stakeholders to understand requirements, develop models, and deliver insights that drive business decisions.
Documentation: Document the design, implementation, and evaluation of data engineering models, including assumptions, methodologies, and results, to ensure reproducibility and transparency.
Continuous Learning: Stay current with the latest trends, tools, and technologies in quantitative data engineering and data science, and continuously improve your skills and knowledge.
Desired Skills and Experience
Data Engineering: Strong background in data engineering principles, including data ingestion, processing, transformation, and storage, using tools and frameworks such as Apache Spark, Apache Flink, or AWS Glue.
Quantitative Analysis: Proficiency in quantitative analysis techniques, including statistical modeling, machine learning, and data mining, with experience implementing algorithms for regression analysis, clustering, classification, and predictive modeling.
Programming Languages: Proficiency in programming languages commonly used for data engineering and quantitative analysis, such as Python, R, Java, or Scala, as well as experience with SQL for data querying and manipulation.
Big Data Technologies: Familiarity with big data technologies and platforms, such as Hadoop, Apache Kafka, Apache Hive, or AWS EMR, for processing and analyzing large volumes of data.
Data Visualization: Experience with data visualization techniques and tools, such as Matplotlib, Seaborn, or Tableau, for creating visualizations of data and model outputs that communicate insights effectively.
Machine Learning Frameworks: Familiarity with machine learning frameworks and libraries, such as PyTorch, for implementing and deploying machine learning models.
Cloud Computing: Experience with cloud computing platforms, such as AWS, Azure, or Google Cloud Platform, and proficiency in using cloud services for data engineering and model deployment.
Software Development: Strong software development skills, including proficiency in software design patterns, version control systems (e.g., Git), and software testing frameworks, to develop robust and maintainable code.
Problem-solving Skills: Excellent problem-solving skills, with the ability to analyze complex data engineering and quantitative analysis problems, identify solutions, and implement them effectively.
Communication and Collaboration: Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand requirements and deliver solutions.
Domain Knowledge: Domain knowledge in areas such as finance, healthcare, or marketing, depending on the industry, to understand the context and requirements of data engineering models in specific domains.
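To make the responsibilities above concrete, here is a minimal Python sketch of the ingest, clean, model, and evaluate loop a candidate would be expected to build. It is purely illustrative: the function names (clean, run_pipeline) and the synthetic data are hypothetical, and a production pipeline would read from sources such as S3 or AWS Glue rather than an in-memory frame.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split


def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Basic data-quality pass: drop duplicate rows and rows with missing values."""
    return df.drop_duplicates().dropna()


def run_pipeline(df: pd.DataFrame) -> float:
    """Fit a regression model on cleaned data and return its out-of-sample R^2."""
    df = clean(df)
    X, y = df[["x1", "x2"]], df["y"]
    # Hold out 25% of rows so the evaluation metric is out-of-sample.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.25, random_state=0
    )
    model = LinearRegression().fit(X_tr, y_tr)
    return r2_score(y_te, model.predict(X_te))


# Synthetic data standing in for an ingested feed: y depends linearly on
# x1 and x2 plus a small amount of noise.
rng = np.random.default_rng(0)
raw = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
raw["y"] = 2.0 * raw["x1"] - raw["x2"] + rng.normal(scale=0.1, size=200)
score = run_pipeline(raw)
```

The same structure (cleaning step, train/test split, metric-based evaluation) scales up when the regression is replaced by a clustering or classification model, or when the pandas frame is swapped for a Spark or Flink dataset.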
Job ID: 147474873