Hadoop Applications Support Lead

Location: Airoli, Navi Mumbai, Maharashtra, IN

Requisition Number: 210309

External Description:

Role: Hadoop Applications Support Lead

Location: Mumbai/Pune

Role Description:

Provide application support for Teradata customers on Hadoop platforms. These customers typically have 24/7 contracts, so the successful applicant must be prepared to work in shifts and to be on call to support customer sites per contractual obligations.

The Hadoop Applications Support Lead administers and manages jobs and applications in Hadoop ecosystem environments for Teradata customers. The role requires specific technical knowledge of the dataflow, integrated tools, and services of the Hadoop ecosystem, including the associated operating system, related tools, network, and hardware.

Minimum Requirements:

  • 6-8 years of experience managing and supporting large-scale production Hadoop environments (configuration management, monitoring, and application performance tuning) on any of the major Hadoop distributions (Apache, Hortonworks, Cloudera, MapR, IBM BigInsights, Pivotal HD)
  • Around 6-8 years of experience in applications support engagements on large-scale systems (Java/J2EE, any ETL tool, strong knowledge of SQL queries and Unix shell scripting, BI operations, analytics support).
  • Experience in core Hadoop components and key dataflow tools
  • Experience working independently and as part of a team to debug application issues using configuration files, databases, and application log files.
  • Perform root cause analysis for job failures and data quality issues, and provide solutions.
  • Have a working understanding of the software development lifecycle and be able to communicate incident and project status, issues, and resolutions
  • Experience with incident management (ServiceNow, JIRA) and the change management process.
  • Handle/lead deployment activities, including code and data movement between Dev, QA, and Prod environments.
  • 4+ years of experience with scripting languages and automation tools (Linux shell, SQL, Python, Ansible).
  • Should be proficient in shell scripting.
  • Proficiency in Spark and Kafka.
  • Experience with DevOps and integrated tools such as Jenkins, Control-M, Kubernetes, Docker, Git, and Slack.
  • Experience developing/supporting RESTful applications.
  • Working knowledge of Linux operating system required.
  • Hands-on experience with the native technologies of any major cloud platform (AWS/Azure/GCP).
  • Strong written and verbal communication skills.
  • ITIL Knowledge.
  • Must be willing to provide 24/7 on-call support on a rotational basis with the team.

Preferred:

  • Database support or application DBA – Oracle, DB2, MySQL, PostgreSQL
  • Knowledge of ETL tools – NiFi, CDF, Kylo.
  • Development, implementation or deployment experience in the Hadoop ecosystem
  • Working experience with one of the scheduling tools (Control-M, JCL, Unix/Linux cron, etc.)
  • Proficiency in Hive internals (including HCatalog), Sqoop, Pig, Oozie, and Flume/Kafka.
  • Proficiency with at least one of the following: Java, Python, Perl, Ruby, C, or web-related development
  • Development or operational knowledge of NoSQL technologies like HBase, MongoDB, Cassandra, Accumulo, etc.
  • Development or operational knowledge of web or cloud platforms like Amazon S3, EC2, Redshift, Rackspace, OpenShift, etc.
  • Development/scripting experience with configuration management and provisioning tools (e.g., Puppet, Chef)
  • Web/Application Server & SOA administration (Tomcat, JBoss, etc.)
  • Should be able to handle performance tuning on Hadoop
  • Develop and produce daily/weekly operations reports and metrics as required by IT management
  • Experience with any of the following will be an added advantage:
      • Cloudera Data Science Workbench
      • Cloudera Data Platform
      • Kubernetes, Docker, Terraform, Chef, Puppet
      • Hadoop integration with large-scale distributed DBMSs like Teradata, Teradata Aster, Vertica, Greenplum, Netezza, DB2, Oracle, etc.
  • Data Modeling or ability to understand data models
  • Knowledge of Business Intelligence and/or Data Integration (ETL) solution delivery techniques, models, processes, methodologies
  • Exposure to data acquisition, transformation, and integration tools like Talend and Informatica, and to BI tools like Tableau and Power BI.
  • Linux administrator certification.
  • Cloud certifications (AWS/Azure/GCP).

 

City: Airoli, Navi Mumbai

State: Maharashtra

Community / Marketing Title: Hadoop Applications Support Lead

Job Category: Consulting

Company Profile:

Teradata helps businesses unlock value by turning data into their greatest asset.  We’re the cloud data analytics platform company, built for a hybrid multi-cloud reality, solving the world's most complex data challenges at scale. Collectively, we endeavor to serve equal parts innovator and contributor. Because our mission isn’t just about the collection of data – it’s about revolutionizing the future of transportation to save lives, optimizing energy costs to make the planet a cleaner place, and using data to predict and identify cancer risks.
