Hadoop Applications Support
Location: Pune, Maharashtra IN
Requisition Number: 210126
Role: Hadoop Applications Support
Job Location: Mumbai/Pune
Providing applications support for Teradata customers on Hadoop platforms. These customers typically have 24/7 contracts, so the successful applicant must be prepared to work in shifts and be on call to support customer site(s) per contractual obligations.
The Hadoop Applications Support associate administers and manages jobs and applications in the Hadoop ecosystem environment for Teradata customers. The role requires specific technical knowledge of the dataflow, integrated tools, and services of the Hadoop ecosystem, including the associated operating system, related tools, network, and hardware.
• 3-10 years of experience managing and supporting large-scale production Hadoop environments (configuration management, monitoring, and application performance tuning) in any of the Hadoop distributions (Apache, Hortonworks, Cloudera, MapR, IBM BigInsights, Pivotal HD)
• Around 4-8 years of experience in applications support engagements on large-scale systems (Java/J2EE, any ETL tool, strong knowledge of SQL queries and Unix shell scripting, BI operations, analytics support)
• Experience in core Hadoop components and key dataflow tools
• Experience working independently and as part of a team to debug application issues using configuration files, databases, and application log files.
• Root cause analysis of job failures and data quality issues, and providing solutions.
• Have a working understanding of the software development lifecycle and be able to communicate incident and project status, issues, and resolutions
• Experience with incident management, ServiceNow, JIRA, and change management processes.
• Handle/lead deployment activities, including code and data movement between Dev, QA, and Prod environments.
• 3+ years of scripting experience (Linux shell, SQL, Python, Ansible).
• Should be proficient in shell scripting.
• Proficiency in Spark and Kafka.
• Experience with DevOps and integrated tools such as Jenkins, Control-M, Kubernetes, Docker, Git, and Slack
• Experience developing/supporting RESTful applications
• Working knowledge of Linux operating system required.
• Hands-on experience with the native technologies of any cloud platform (AWS/Azure/GCP)
• Strong written and verbal communication skills.
• ITIL Knowledge.
• Must be willing to provide 24x7 on-call support on a rotational basis with the team.
• Database support or application DBA experience – Oracle, DB2, MySQL, PostgreSQL
• Knowledge of ETL tools – NiFi, CDF, Kylo.
• Development, implementation or deployment experience in the Hadoop ecosystem
• Working experience with one of the Scheduling tools (Control-M, JCL, Unix/Linux-cron etc.)
• Proficiency in Hive internals (including HCatalog), Sqoop, Pig, Oozie, and Flume/Kafka.
• Proficiency with at least one of the following: Java, Python, Perl, Ruby, C or Web-related development
• Development or operational knowledge of NoSQL technologies like HBase, MongoDB, Cassandra, Accumulo, etc.
• Development or operational knowledge of web or cloud platforms like Amazon S3, EC2, Redshift, Rackspace, OpenShift, etc.
• Development/scripting experience with configuration management and provisioning tools, e.g. Puppet, Chef
• Web/Application Server & SOA administration (Tomcat, JBoss, etc.)
• Should be able to handle performance tuning on Hadoop
• Develop and produce daily/weekly operations reports and metrics as required by IT management
• Experience on any of the following will be an added advantage:
• Cloudera data science workbench
• Cloudera data platform
• Kubernetes, Docker, Terraform, Chef, Puppet
• Hadoop integration with large-scale distributed DBMSs like Teradata, Teradata Aster, Vertica, Greenplum, Netezza, DB2, Oracle, etc.
• Data Modeling or ability to understand data models
• Knowledge of Business Intelligence and/or Data Integration (ETL) solution delivery techniques, models, processes, methodologies
• Exposure to data acquisition, transformation, and integration tools like Talend, Informatica, etc., and BI tools like Tableau, Power BI, etc.
• Linux Administrator certification.
• Leading cloud certifications (AWS/Azure/GCP).
Job Category: Consulting
Teradata (NYSE: TDC) is the leading multi-cloud data platform company for enterprise analytics, transforming how businesses work and people live through the power of data.
At Teradata, we are leading the data era. As enterprises address today’s digital economy, they are faced with new competition and consumer expectations and are turning to data to power their future. Teradata has worked with the largest companies in the world for 40+ years, bringing our experience and expertise to support global enterprises with their most demanding, mission-critical, complex, and large-scale data needs. Teradata is recognized as a leader in the cloud, data, and analytics spaces by top analyst firms Gartner and Forrester, as well as by Fortune Magazine.
Our connected multi-cloud data platform for enterprise analytics, Teradata Vantage™, is an extremely scalable, secure, and resilient offering that simplifies ecosystems by connecting data and making it easier to uncover insights from across the organization, regardless of where that data resides. With Vantage, we enable companies to modernize their data management, from start to scale. Every day, millions of users benefit from our open data platform. Empowering customers and partners to develop and build how they like, we enable hundreds of business outcomes and solutions, including improving customer experience and profitability, realizing financial transformation, and achieving operational efficiency.
As the world of data grows, we are the leader in enabling the future of connected businesses, powered by data intelligence. We are committed to delivering on this vision by following sustainable business practices and with a strong focus on diversity, equity, and inclusion. We believe that only by embracing diversity of identity, thought, background, expression, and perspective can we solve today’s challenges and reimagine tomorrow’s world.