Hadoop Admin Architect

Location: Powai Mumbai, Maharashtra IN

Notice

This position is no longer open.

Requisition Number: 175676

Position Title: Global Delivery Center Consultant (IV)

External Description:

Role Description:

This position is for a Big Data Admin Architect who will be involved in Think Big data projects. This is a client-facing role with accountability for managing client expectations while delivering Hadoop platform services and solutions.

The Hadoop Administrator Architect should be able to assist clients with all aspects of the Hadoop platform, including distribution selection, cluster sizing, optimization, and security implementation. The role requires specific technical knowledge of the administration and control of the Hadoop system, including the associated operating system, related tools, network, and hardware.

Minimum Requirements:

  • 10+ years of total industry experience with a strong technical background, including 4+ years in big data platform administration.
  • Minimum 3 years managing and supporting large-scale production Hadoop environments on any of the major Hadoop distributions (Apache, Teradata, Hortonworks, Cloudera, MapR, IBM BigInsights, Pivotal HD).
  • 5+ years of experience with scripting languages (Linux shell, SQL, Python); should be proficient in shell scripting.
  • 4+ years of experience with administrative activities such as:
    • Management of data, users, and job execution on the Hadoop system
    • Periodic backups of the system
    • Security implementation
    • Cluster optimization
    • High availability, backup/archive/restore (BAR), and disaster recovery (DR) strategies and principles
    • Planning for and supporting hardware and software installations and upgrades
  • 2+ years of experience with Hadoop monitoring tools (Nagios, Ganglia, Cloudera Manager, Ambari, etc.).
  • In-depth knowledge of big data distributions such as Cloudera, Hortonworks, Pivotal Greenplum, and MapR.
  • Hadoop administration and maintenance: control and optimization of cluster capacity, security, configuration, process scheduling, and error handling.
  • Define standards and develop and implement best practices to manage and support data platforms.
  • Experience with operations methodologies such as ITIL.

Nice-to-Have Experience:

    • Experience with any one of the following:
      • Proficiency in Hive internals (including HCatalog), Sqoop, Pig, Oozie, and Flume/Kafka.
      • Development or administration experience with NoSQL technologies like HBase, MongoDB, Cassandra, Accumulo, etc.
      • Development or administration experience on web or cloud platforms like Amazon S3, EC2, Redshift, Rackspace, OpenShift, etc.
      • Development/scripting experience with configuration management and provisioning tools, e.g., Puppet, Chef.
      • Web/application server and SOA administration (Tomcat, JBoss, etc.)
    • Development, implementation, or deployment experience on the Hadoop ecosystem (HDFS, MapReduce, Hive, HBase)
    • Analysis and optimization of workloads, performance monitoring and tuning, and automation.
    • Addressing challenges of query execution across a distributed database platform on modern hardware architectures
    • Experience with any one of the following will be an added advantage:
      • Hadoop integration with large-scale distributed data platforms like Teradata, Teradata Aster, Vertica, Greenplum, Netezza, DB2, Oracle, etc.
      • Proficiency in at least one of the following: Java, Python, Perl, Ruby, C, or web-related development
      • Knowledge of Business Intelligence and/or Data Integration (ETL) operations delivery techniques, processes, and methodologies
      • Exposure to data acquisition, transformation, and integration tools like Talend, Informatica, etc., and BI tools like Tableau, Pentaho, etc.

City: Powai Mumbai

State: Maharashtra

Community / Marketing Title: Hadoop Admin Architect

Job Category: Services

Company Profile:

With all the investments made in analytics, it’s time to stop buying into partial solutions that overpromise and underdeliver. It’s time to invest in answers. Only Teradata leverages all of the data, all of the time, so that customers can analyze anything, deploy anywhere, and deliver analytics that matter most to them. And we do it at scale, on-premises, in the Cloud, or anywhere in between.

We call this Pervasive Data Intelligence. It's the answer to the complexity, cost, and inadequacy of today's analytics. And it's the way Teradata uses the power of data to transform how businesses work and people live throughout the world. Join us and help create the era of Pervasive Data Intelligence.

Teradata is an Equal Opportunity Employer.