Senior Technical Consultant

Location: Pune, Maharashtra, India

Notice

This position is no longer open.

Requisition Number: 200998

External Description:

Hadoop Applications Support

Role Description:

Providing Applications Support for Think Big customers on Hadoop platforms. These customers typically have 24/7 contracts, and the successful applicant must be prepared to work in shifts and be on call to support customer sites per contractual obligations.

 

Minimum Requirements:

  • 3-6 years of experience managing and supporting large-scale production Hadoop environments (configuration management, monitoring, and application performance tuning) on any of the Hadoop distributions (Apache, Hortonworks, Cloudera, MapR, IBM BigInsights, Pivotal HD)
  • Around 3-6 years of experience in Applications Support engagements on large-scale systems (Java/J2EE, any ETL tool, strong knowledge of SQL queries and Unix shell scripting, BI operations, analytics support)
  • Experience in Hadoop components such as:
    • YARN
    • Spark
    • Hive
    • Impala
    • HBase
    • Sqoop
    • Pig
    • Python
    • Anaconda
    • Shiny
    • RStudio
    • TensorFlow
    • Keras
    • Kafka
    • Streaming tools – NiFi
    • IPython/Jupyter
    • GitLab
    • Scala
    • Flume
    • Solr
    • Ranger
    • Atlas
    • Mahout
    • Data pipeline tools
    • Zookeeper
    • Oozie

 

  • Experience working independently and as part of a team to debug application issues using configuration files, databases, and application log files.
  • Root-cause analysis of job failures and data quality issues, and providing solutions.
  • Working understanding of the software development lifecycle, with the ability to communicate incident and project status, issues, and resolutions.
  • Experience with incident management (ServiceNow, JIRA) and the change management process.
  • 3+ years of experience with scripting languages (Linux shell, SQL, Python); should be proficient in shell scripting.
  • Experience developing and supporting RESTful applications.
  • Working knowledge of Linux operating system required.
  • Strong written and verbal communication skills.
  • ITIL Knowledge.

 

Preferred:

  • Database support or application DBA – Oracle, DB2, MySQL, PostgreSQL
  • Knowledge of Storm, Accumulo.
  • Knowledge of ETL tools – Talend, Informatica, DataStage.
  • Development, implementation, or deployment experience in the Hadoop ecosystem.
  • Working experience with one of the scheduling tools (Control-M, JCL, Unix/Linux cron, etc.).
  • Proficiency in Hive internals (including HCatalog), Sqoop, Pig, Oozie, and Flume/Kafka.
  • Proficiency with at least one of the following: Java, Python, Perl, Ruby, C or Web-related development
  • Development or operational knowledge of NoSQL technologies like HBase, MongoDB, Cassandra, Accumulo, etc.
  • Development or operational knowledge of web or cloud platforms like Amazon S3, EC2, Redshift, Rackspace, OpenShift, etc.
  • Development/scripting experience with configuration management and provisioning tools, e.g. Puppet, Chef.
  • Web/application server and SOA administration (Tomcat, JBoss, etc.).
  • Ability to handle deployment methodologies and code/data movement between Dev, QA, and Prod environments (deployment groups, folder copy, data copy, etc.).
  • Should be able to articulate and discuss the principles of performance tuning on Hadoop.
  • Develop and produce daily/weekly operations reports and metrics as required by IT management.
  • Experience with any of the following will be an added advantage:
    • Hadoop integration with large-scale distributed DBMSs like Teradata, Teradata Aster, Vertica, Greenplum, Netezza, DB2, Oracle, etc.
    • Data modeling, or the ability to understand data models
    • Knowledge of Business Intelligence and/or Data Integration (ETL) solution delivery techniques, models, processes, and methodologies
    • Exposure to data acquisition, transformation, and integration tools like Talend and Informatica, and BI tools like Tableau and Pentaho
  • Linux Administrator certification.

CountryEEOText_Description: Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

City: Pune

State: Maharashtra

Community / Marketing Title: Senior Technical Consultant

Job Category: Services

Company Profile:

Our Company

At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

LinkedIn Remote:

Location_formattedLocationLong: Pune, Maharashtra IN
