
FedEx ACC

Data Engineer Advisor [T500-9126]

This job is no longer accepting applications

  • Posted 24 months ago

Job Description

Skills Required:

Must Haves:

  • Experience in the Data & Analytics (D&A) domain - 6 years
  • Azure-specific experience required (not AWS)
  • Pandas, Scikit-Learn, Matplotlib, TensorFlow, Jupyter and other Python data tools - 4 years
  • Spark (Scala and PySpark)
  • SQL and NoSQL storage tools
  • Detailed knowledge of the Microsoft Azure tooling for large-scale data processing
  • Azure Data Factory, Data Flow, Databricks, Python, Synapse, building data pipelines
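
As a rough sketch of the kind of Python/pandas pipeline work these requirements point to (the dataset, column names, and aggregation below are hypothetical illustrations, not taken from the posting):

```python
import pandas as pd

# Hypothetical shipment records; column names are illustrative only.
df = pd.DataFrame({
    "shipment_id": [1, 2, 3, 4],
    "origin": ["MEM", "MEM", "IND", "IND"],
    "weight_kg": [1.2, 3.4, 2.0, 5.5],
})

# A typical pipeline step: aggregate per-origin totals
# for a downstream analysis or reporting job.
totals = df.groupby("origin", as_index=False)["weight_kg"].sum()
print(totals)
```

In production this kind of transform would typically run at cluster scale via PySpark or an Azure Data Factory / Databricks job rather than in-memory pandas.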

Good to Have:

  • Kafka and other high-volume data tools
  • Event Hub / IoT, Azure Kubernetes Service (AKS), APIM, network design, Hadoop

Essential Job Duties & Responsibilities:

  • Understanding in depth both the business and technical problems Data Works aims to solve
  • Building tools, platforms and pipelines to enable teams to clearly and cleanly analyze data, build models and drive decisions
  • Scaling up from laptop-scale to cluster-scale problems, in terms of both infrastructure and problem structure and technique
  • Delivering tangible value very rapidly, collaborating with diverse teams of varying backgrounds and disciplines
  • Codifying best practices for future reuse in the form of accessible, reusable patterns, templates, and code bases
  • Interacting with senior technologists from the broader enterprise and outside of FedEx (partner ecosystems and customers) to create synergies and ensure smooth deployments to downstream operational systems

Skill/Knowledge Considered a plus:

  • Technical background in computer science, software engineering, database systems, distributed systems
  • Fluency with distributed and cloud environments and a deep understanding of how to balance computational considerations with theoretical properties
  • Detailed knowledge of the Microsoft Azure tooling for large-scale data engineering efforts and deployments is highly preferred
  • A compelling track record of designing and deploying large scale technical solutions, which deliver tangible, ongoing value
  • Direct experience having built and deployed robust, complex production systems that implement modern, data scientific methods at scale
  • Ability to context-switch, to provide support to dispersed teams which may need an expert hacker to unblock an especially challenging technical obstacle, and to work through problems as they are still being defined
  • Demonstrated ability to deliver technical projects with a team, often working under tight time constraints to deliver value
  • An engineering mindset, willing to make rapid, pragmatic decisions to improve performance, accelerate progress or magnify impact
  • Comfort with working with distributed teams on code-based deliverables, using version control systems and code reviews
  • Ability to conduct data analysis, investigation, and lineage studies to document and enhance data quality and access
  • Use of agile and DevOps practices for project and software management, including continuous integration and continuous delivery (CI/CD)
  • Demonstrated expertise working with some of the following common languages and tools:
      • Spark (Scala and PySpark), HDFS, Kafka and other high-volume data tools
      • SQL and NoSQL storage tools, such as MySQL, Postgres, Cassandra, MongoDB and Elasticsearch
      • Pandas, Scikit-Learn, Matplotlib, TensorFlow, Jupyter and other Python data tools
      • Azure/MARS
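
As a small illustration of the SQL storage skills named above, the sketch below uses an in-memory SQLite database as a stand-in for the stores the posting lists (MySQL, Postgres); the table and column names are hypothetical:

```python
import sqlite3

# In-memory SQLite stands in for a production SQL store;
# schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, topic TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "scan"), (2, "scan"), (3, "deliver")],
)

# Aggregate event counts per topic, a common pipeline query shape.
rows = conn.execute(
    "SELECT topic, COUNT(*) FROM events GROUP BY topic ORDER BY topic"
).fetchall()
# rows == [("deliver", 1), ("scan", 2)]
```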

Minimum Qualifications

  • Bachelor's degree in Information Systems, Computer Science, or a quantitative discipline such as Mathematics or Engineering, and/or equivalent formal training or work experience
  • Seven (7) years of equivalent work experience in measurement and analysis, quantitative business problem solving, simulation development and/or predictive analytics
  • Extensive knowledge of data engineering and machine learning frameworks, including design, development and implementation of highly complex systems and data pipelines
  • Extensive knowledge of information systems, including design, development and implementation of large batch or online transaction-based systems
  • Strong understanding of the transportation industry, competitors, and evolving technologies
  • Experience providing leadership in a general planning or consulting setting; experience as a leader or senior member of multi-function project teams
  • Strong oral and written communication skills
  • A related advanced degree may offset the related experience requirements

Job ID: 70352575