
In the role of Sr Software Engineer II, you will serve as an individual contributor for the GCP applications, which are critical in the Amex environment.
We develop strategic Engineering Development frameworks, processes, tools, and actionable insights.
As a Data Engineer, you will be responsible for designing, developing, and maintaining robust, scalable frameworks, services, applications, and pipelines for processing huge volumes of data. You will work closely with cross-functional teams to deliver high-quality software solutions that meet our organizational needs.
· Preferably a B.Tech or M.Tech degree in computer science, computer engineering, or another technical discipline
· 8+ years of software development experience
· Design and build solutions on GCP architecture; strong skills in SQL, PySpark, Python, and cloud technologies
· Design and develop solutions using Big Data tools and technologies such as MapReduce, Hive, and Spark
· Extensive hands-on experience with GCP and object-oriented programming using Python and PySpark
· Experience building data pipelines for huge volumes of data.
· Experience designing, implementing, and managing various ETL job execution flows.
· Experience implementing and maintaining data ingestion processes.
· Hands-on experience writing basic to advanced optimized queries using HQL, SQL, and Spark.
· Hands-on experience designing, implementing, and maintaining data transformation jobs using the most efficient tools and technologies.
· Ensure the performance, quality, and responsiveness of solutions.
· Participate in code reviews to maintain code quality.
· Should be able to write shell scripts.
· Utilize Git for source version control.
· Set up and maintain CI/CD pipelines.
· Troubleshoot, debug, and upgrade existing applications and ETL job chains.
· Ability to effectively interpret technical and business objectives and challenges, and to articulate solutions
· Experience managing teams and balancing multiple priorities.
· Willingness to learn new technologies and apply them to their full potential
· Strong experience with Data Engineering, Big Data Applications
· Strong background in Python, PySpark, Java, Spark, PL/SQL, and Airflow, including DAG development
· Cloud experience with GCP is a must
· Conduct IT requirements gathering.
· Define problems and provide solution alternatives.
· Create detailed computer system design documentation.
· Implement deployment plan.
· Conduct knowledge transfer with the objective of providing high-quality IT consulting solutions
· Support consulting team in different phases of the project including problem definition, effort estimation, diagnosis, solution generation, design and deployment.
· Under supervision participate in unit-level and organizational initiatives with the objective of providing high-quality and value adding consulting solutions.
· Understand issues and diagnose their root causes. Perform secondary research as instructed by a supervisor to assist in strategy and business planning.
· Excellent communication and analytical skills
· Excellent team player with the ability to work with a global team
Job ID: 142646033