
Job Description

TCS is hiring for AWS DevOps roles at its New Delhi and Bangalore locations

Experience: 6-12 years

Mode of interview: Virtual

Location: New Delhi, Bangalore

JOB DESCRIPTION:

Must have:

  • Develop and implement data virtualization solutions using the Denodo Platform
  • Stay up to date with the latest trends and technologies in data virtualization and related fields.
  • Strong understanding of data virtualization concepts and best practices
  • Proficient in SQL and database management systems
  • Understand the data delivery platform as a foundation for reporting, business intelligence, Data as a Service API exposures, and analytics.
  • Experience in designing and developing data virtualization views by connecting to Oracle, AWS Redshift, SAP HANA, SQL Server, and other data sources.
  • Install and upgrade the current Denodo environment.
  • Triage, debug, and fix technical issues related to Denodo.
  • Design and develop solutions for scale.
  • Optimize Denodo view and resource performance.
  • Design and evaluate data models (star, snowflake, and flattened, using data caching).
  • Coordinate with Business and Technical teams through all the phases in the software development life cycle.
  • Maintain and manage code repositories such as Git.
  • Experience working in Agile/Scrum methodology.
  • Participate in making major technical and architectural decisions.
  • Set up integrated environments for GitHub, Jira, New Relic, and AWS Cloud.
  • Implement and maintain monitoring and alerting for production and corporate servers/storage using AWS CloudWatch (see the CloudWatch sketch after this list).
  • Coordinate with or assist developers in establishing and applying appropriate branching and merging conventions using GitHub.
  • Install and configure Git, Jenkins/Concourse, deployment, and automation tooling.
  • Maintain source code in Git for various applications.
  • Create branches and tag code releases.
  • Support developers in the configuration management environment.
  • Experience in data analytics with Databricks, Python, and data management tools.
  • Experience in data extraction, PySpark transformations, migrations, and loads (see the PySpark sketch after this list).
  • Experience with the full software development life cycle (SDLC) and proficiency in Agile Scrum methodologies.
  • Experience in developing SQL scripts for automation.
  • Develop and maintain data models to support data analysis and reporting.
  • Develop and maintain data warehouses and data marts.
  • Develop and maintain ETL processes to move data between systems.
  • Develop and maintain data quality and governance processes.
  • Expertise and experience with the Linux operating system.
  • Strong communication and documentation skills
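
For reference, a minimal sketch of the kind of CloudWatch alerting work listed above, assuming boto3 and valid AWS credentials; the region, alarm name, instance ID, thresholds, and SNS topic are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: create (or update) a CloudWatch CPU alarm for one server.
# All names, IDs, thresholds, and the SNS topic ARN are hypothetical.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

cloudwatch.put_metric_alarm(
    AlarmName="prod-web-high-cpu",            # hypothetical alarm name
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # hypothetical instance
    Statistic="Average",
    Period=300,                               # evaluate 5-minute averages
    EvaluationPeriods=2,                      # two consecutive breaches before alarming
    Threshold=80.0,                           # percent CPU
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:ops-alerts"],      # hypothetical SNS topic
)
```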
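
Likewise, a minimal PySpark extract-transform-load sketch for the data engineering items; the file paths, column names, and filter values are assumptions for illustration only.

```python
# Minimal PySpark extract -> transform -> load sketch.
# Paths, columns, and values are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV data (a JDBC source such as Oracle or Redshift would be read similarly)
orders = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: cast types, filter completed orders, and aggregate daily totals
daily_totals = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Load: write partitioned Parquet for downstream reporting and BI
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_totals/"
)

spark.stop()
```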

Job ID: 135881295
