

Specialist Hadoop Administrator

Location: Karnataka, India

Notice

This position is no longer open.

Requisition Number: 210309

External Description:

Role: Hadoop Applications Support Lead

Location: Mumbai/Pune

Role Description:

Provide application support for Teradata customers on Hadoop platforms. These customers typically have 24/7 contracts, so the successful applicant must be prepared to work in shifts and to be on call to support customer sites per contractual obligations.

The Hadoop application associate manages and administers jobs and applications in the Hadoop ecosystem environment for Teradata customers. The role requires specific technical knowledge of the dataflow, integrated tools, and services of the Hadoop ecosystem, including the associated operating system, related tools, network, and hardware.

Minimum Requirements:

  • 6-8 years of experience managing and supporting large-scale production Hadoop environments (configuration management, monitoring, and application performance tuning) on any of the Hadoop distributions (Apache, Hortonworks, Cloudera, MapR, IBM BigInsights, Pivotal HD).
  • Around 6-8 years of experience in application support engagements on large-scale systems (Java/J2EE, any ETL tool, strong knowledge of SQL queries and Unix shell scripting, BI operations, analytics support).
  • Experience with core Hadoop components and key dataflow tools.
  • Experience working independently and as part of a team to debug application issues using configuration files, databases, and application log files.
  • Root-cause analysis of job failures and data quality issues, and providing solutions (see the sketch after this list).
  • A working understanding of the software development lifecycle and the ability to communicate incident and project status, issues, and resolutions.
  • Experience with incident management, ServiceNow, JIRA, and the change management process.
  • Handle/lead deployment activities and code and data movement between Dev, QA, and Prod environments.
  • 4+ years of experience with scripting languages and tooling (Linux shell, SQL, Python, Ansible).
  • Proficiency in shell scripting.
  • Proficiency in Spark and Kafka.
  • Experience with DevOps and integration tools such as Jenkins, Control-M, Kubernetes, Docker, Git, and Slack.
  • Experience developing and supporting RESTful applications.
  • Working knowledge of the Linux operating system.
  • Hands-on experience with the native technologies of any cloud platform (AWS/Azure/GCP).
  • Strong written and verbal communication skills.
  • ITIL knowledge.
  • Must be willing to provide 24x7 on-call support on a rotational basis with the team.
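To illustrate the kind of root-cause triage this role involves, here is a minimal sketch (illustrative only, not part of the posting) that pulls recently failed applications from the YARN ResourceManager REST API and prints their diagnostics. The ResourceManager host/port and the look-back window are assumptions; a Kerberized cluster would additionally need SPNEGO authentication.

```python
#!/usr/bin/env python3
"""Hypothetical triage helper: list recently failed YARN applications.

Assumes the ResourceManager web service is reachable on port 8088
of an unsecured cluster (both assumptions; adjust for your site).
"""
import json
import time
import urllib.request

RM_URL = "http://resourcemanager.example.com:8088"  # assumption: RM host/port
WINDOW_MS = 24 * 60 * 60 * 1000                     # look back 24 hours

def failed_apps(since_ms: int) -> list:
    # /ws/v1/cluster/apps is the standard ResourceManager REST endpoint;
    # 'states' and 'startedTimeBegin' are documented query parameters.
    url = f"{RM_URL}/ws/v1/cluster/apps?states=FAILED&startedTimeBegin={since_ms}"
    with urllib.request.urlopen(url, timeout=30) as resp:
        body = json.load(resp)
    apps = body.get("apps") or {}       # "apps" is null when nothing matches
    return apps.get("app") or []

if __name__ == "__main__":
    since = int(time.time() * 1000) - WINDOW_MS
    for app in failed_apps(since):
        # 'diagnostics' usually carries the first actionable error message.
        print(f"{app['id']}  {app['name']}  queue={app['queue']}")
        print(f"  diagnostics: {app.get('diagnostics', '').strip()[:200]}")
```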

Preferred:

  • Database support or application DBA experience: Oracle, DB2, MySQL, PostgreSQL.
  • Knowledge of ETL tools: NiFi, CDF, Kylo.
  • Development, implementation, or deployment experience in the Hadoop ecosystem.
  • Working experience with one of the scheduling tools (Control-M, JCL, Unix/Linux cron, etc.).
  • Proficiency in Hive internals (including HCatalog), Sqoop, Pig, Oozie, and Flume/Kafka.
  • Proficiency with at least one of the following: Java, Python, Perl, Ruby, C, or web-related development.
  • Development or operational knowledge of NoSQL technologies like HBase, MongoDB, Cassandra, Accumulo, etc.
  • Development or operational knowledge of web or cloud platforms like Amazon S3, EC2, Redshift, Rackspace, OpenShift, etc.
  • Development/scripting experience with configuration management and provisioning tools, e.g. Puppet, Chef.
  • Web/application server and SOA administration (Tomcat, JBoss, etc.).
  • Ability to handle performance tuning on Hadoop.
  • Develop and produce daily/weekly operations reports and metrics as required by IT management (a minimal reporting sketch follows this list).
  • Experience with any of the following will be an added advantage:
      • Cloudera Data Science Workbench
      • Cloudera Data Platform
      • Kubernetes, Docker, Terraform, Chef, Puppet
      • Hadoop integration with large-scale distributed DBMSs like Teradata, Teradata Aster, Vertica, Greenplum, Netezza, DB2, Oracle, etc.
  • Data modeling, or the ability to understand data models.
  • Knowledge of Business Intelligence and/or Data Integration (ETL) solution delivery techniques, models, processes, and methodologies.
  • Exposure to data acquisition, transformation, and integration tools like Talend, Informatica, etc., and BI tools like Tableau, Power BI, etc.
  • Linux Administrator certification.
  • Leading cloud certifications (AWS/Azure/GCP).
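As a flavour of the daily operations reporting mentioned above, the following sketch (again illustrative, not from the posting) pulls per-directory usage from the WebHDFS GETCONTENTSUMMARY endpoint and prints a small metrics table. The NameNode address, the port (9870 is the Hadoop 3 default; Hadoop 2 used 50070), and the directory list are all assumptions.

```python
#!/usr/bin/env python3
"""Hypothetical daily HDFS usage report via WebHDFS.

Assumes an unsecured cluster with the NameNode HTTP server on
port 9870 (assumptions; adjust for your site).
"""
import json
import urllib.request

NAMENODE = "http://namenode.example.com:9870"   # assumption: NN host/port
PATHS = ["/data/raw", "/data/curated", "/tmp"]  # assumption: dirs to report on

def content_summary(path: str) -> dict:
    # GETCONTENTSUMMARY is a standard WebHDFS operation returning
    # file/directory counts and space consumed for a path.
    url = f"{NAMENODE}/webhdfs/v1{path}?op=GETCONTENTSUMMARY"
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)["ContentSummary"]

if __name__ == "__main__":
    print(f"{'path':<20}{'files':>10}{'dirs':>10}{'GB used':>12}")
    for path in PATHS:
        s = content_summary(path)
        gb = s["spaceConsumed"] / 1024**3  # spaceConsumed includes replication
        print(f"{path:<20}{s['fileCount']:>10}{s['directoryCount']:>10}{gb:>12.1f}")
```

In practice a report like this would be scheduled from cron or Control-M and mailed or pushed to a dashboard; the endpoint and fields used here are standard WebHDFS, but everything site-specific is invented for the example.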


CountryEEOText_Description:

Why We Think You’ll Love Teradata

We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

City: Airoli, Navi Mumbai

State: Maharashtra

Community / Marketing Title: Specialist Hadoop Administrator

Job Category: Consulting

Company Profile:

Our Company

At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.


Location_formattedLocationLong: Bangalore, Karnataka IN
