Hadoop Admin (Platform Support)
Location: Pune, Maharashtra IN
Requisition Number: 210306
Role: Hadoop Admin (Platform Support)
Provides administrative support for Teradata customers on Hadoop platforms. These customers typically have 24/7 contracts, so the successful applicant must be prepared to work in shifts and be on-call to support customer sites per contractual obligations.
The Hadoop Administrator manages and controls the Hadoop System environment for Teradata customers. The Hadoop Administrator requires specific technical knowledge about the administration and control of the Hadoop System, including the associated operating system, related tools, network, and hardware.
Minimum Requirements:
- 4-6 years of experience managing and supporting large-scale production Hadoop environments (configuration management, monitoring, and performance tuning) on any of the major Hadoop distributions (Apache, Hortonworks, Cloudera, MapR, IBM BigInsights, Pivotal HD).
- 2-3 years of experience with scripting languages (Linux shell, SQL, Python).
- Must be proficient in shell scripting.
- 4-6 years of experience in administrative activities such as administration, maintenance, control, and optimization of Hadoop capacity, performance tuning, security, configuration, process scheduling, and error handling.
- Management of data, users, and job execution on the Hadoop System
- Experience in Backup, Archival and Recovery (BAR) and High availability (HA)
- Plan for and support hardware and software installation and upgrades.
- 4-6 years of experience with Hadoop monitoring tools (Cloudera Manager, Ambari, Nagios, Ganglia, etc.).
- Experience may include (but is not limited to) build and support activities, including design, configuration, installation, upgrades, monitoring, and performance tuning of any of the Hadoop distributions.
- Hadoop software installation and upgrades
- Experience in workload and performance management.
- Implementing standards and best practices to manage and support data platforms in line with each distribution's recommendations.
- Proficiency in Hive internals (including HCatalog), Sqoop, Pig, Oozie, Spark, and Flume/Kafka.
- Experience in MySQL & PostgreSQL databases.
- Experience in Incident management, ServiceNow, JIRA, Change Management Process.
- Fundamental knowledge of cloud-native technologies on any major platform (AWS, Azure, or GCP).
- ITIL Knowledge.
- Must be willing to provide 24x7 on-call support on a rotational basis with the team.
- Experience with DR (Disaster Recovery) strategies and principles.
- Development or administration experience with NoSQL technologies like HBase, MongoDB, Cassandra, Accumulo, etc.
- Development or administration experience with web or cloud platforms like Amazon S3, EC2, Redshift, Rackspace, OpenShift, etc.
- Development/scripting experience with configuration management and provisioning tools (e.g., Puppet, Chef).
- Web/Application Server & SOA administration (Tomcat, JBoss, etc.)
- Development, implementation, or deployment experience on the Hadoop ecosystem (HDFS, MapReduce, Hive, HBase).
- Experience with any one of the following will be an added advantage:
  - Cloudera Data Science Workbench
  - Cloudera Data Platform
  - Kubernetes, Docker, Terraform
- Hadoop integration with large scale distributed data platforms like Teradata, Teradata Aster, Vertica, Greenplum, Netezza, DB2, Oracle, etc.
- Proficiency with at least one of the following: Java, Python, Perl, Ruby, C or Web-related development
- Knowledge of Business Intelligence and/or Data Integration (ETL) operations delivery techniques, processes, methodologies
- Exposure to data acquisition, transformation, and integration tools like Talend, Informatica, etc., and BI tools like Tableau, Pentaho, etc.
- Linux Administrator certification.
Job Category: Consulting
Teradata helps businesses unlock value by turning data into their greatest asset. We’re the cloud data analytics platform company, built for a hybrid multi-cloud reality, solving the world's most complex data challenges at scale. Collectively, we endeavor to serve equal parts innovator and contributor. Because our mission isn’t just about the collection of data – it’s about revolutionizing the future of transportation to save lives, optimizing energy costs to make the planet a cleaner place, and using data to predict and identify cancer risks.