
Navigate your cloud native world with training that matures your DevOps practices

Learn how to put the latest open source technology into practice with hands-on training, delivered by industry experts, aligned to your desired business outcomes

[Partner logos: CNCF, Kubernetes Training Partner (KTP), Kubernetes Certified Service Provider (KCSP), Linux Foundation Authorized Training Partner, Apache bronze sponsor]

Hadoop Foundation

2 Days

Available On-Site

Available Virtually

Open Enrollments Available

Customizable


This hands-on Hadoop training course teaches Data Analysts, BI Analysts, BI Developers, SAS Developers and other analysts who need to answer questions about Big Data stored in a Hadoop cluster how to develop applications and analyze that data in Apache Hadoop using Hive. Students will learn the details of Hadoop, YARN and the Hadoop Distributed File System (HDFS), get an overview of MapReduce, and take a deep dive into using Hive to perform data analytics on Big Data.

Students will work through lab exercises using the Hortonworks Data Platform for Windows: issuing HDFS commands to add and remove files and folders, running and monitoring MapReduce jobs, retrieving HCatalog schemas from within a Pig script, joining datasets, and using advanced Hive features such as windowing, views and multi-file inserts.
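
For illustration only, here is a minimal sketch of the kind of commands these labs cover, assuming a hypothetical local file named stocks.csv and a Hive table named stocks; the actual datasets, paths and lab environment are defined by the course materials.

    # Add and remove files and folders in HDFS with the Hadoop client
    hdfs dfs -mkdir -p /user/student/stocks          # create a working folder
    hdfs dfs -put stocks.csv /user/student/stocks/   # upload a local file into HDFS
    hdfs dfs -ls /user/student/stocks                # list the folder contents
    hdfs dfs -rm -r /user/student/stocks             # remove the folder and its files

    # Run a Hive query that ranks each symbol's rows with a window function
    hive -e "SELECT symbol, trade_date, close_price,
                    RANK() OVER (PARTITION BY symbol ORDER BY close_price DESC) AS price_rank
             FROM stocks;"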

Delivery

Available as Instructor-Led Training (ILT) delivered in person/onsite or as Virtual Instructor-Led Training (VILT); Open Enrollment options may also be available.

Who Should Attend?

Data Analysts, Business Intelligence (BI) Analysts, Business Intelligence (BI) Developers, Statistical Analysis System (SAS) Developers and other types of analysts who need to answer questions and analyze Big Data stored in a Hadoop cluster.

What Attendees Will Learn

At the completion of the course, students will be able to:

  • Understand the architecture of the Hadoop Distributed File System (HDFS) and how HDFS Federation works in Hadoop
  • Use the Hadoop client to input data into HDFS
  • Understand the various tools and frameworks in the Hadoop 2.0 ecosystem
  • Use Sqoop to transfer data between Hadoop and a relational database (see the sketch after this list)
  • Understand the architecture of MapReduce and run a MapReduce job on Hadoop 2.0
  • Understand how Hive tables are defined and implemented
  • Write efficient Hive queries and use Hive to run SQL-like queries to perform data analysis
  • Perform data analytics on Big Data using Hive
  • Use HCatalog with Hive
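
As a rough illustration of the Sqoop objective above, the following is a minimal import sketch assuming a hypothetical MySQL database named sales containing an orders table; the JDBC URL, credentials and HDFS paths are placeholders and will differ in the lab environment.

    # Import the "orders" table from a relational database into HDFS as delimited files
    # (all connection details below are hypothetical placeholders)
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/sales \
      --username analyst \
      --password-file /user/student/.dbpass \
      --table orders \
      --target-dir /user/student/orders \
      --num-mappers 4

Once the data has been analyzed in Hive, the same tool can move results back to the relational database with sqoop export.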

Prerequisites

Students should be familiar with SQL and have a basic understanding of programming principles. No prior Hadoop knowledge is required.


Contact us to request more information about enrolling in the Hadoop Foundation course or to inquire about booking a custom in-house course for your team.

Other Open Enrollments from RX-M


Frequently Asked Questions about Open Enrollment Courses

RX-M's Cloud Native & DevOps enablement philosophy

Bring a neutral perspective

We bring a market-neutral perspective to every engagement, taking no stake in any of the competing cloud native platforms, components or solutions, so we can offer unbiased insights to our clients

Practice what we teach

We are a multi-cloud company consisting of prominent open source contributors with large-scale software engineering experience, actively contributing to the evolution of next-gen software architectures, application management, and platforms

Be solution focused

RX-M has the unique ability to deliver purpose-built, solution-based training in the form of a custom curriculum that aligns with each client's specific desired outcomes, so your team has the skills needed to accelerate the business

Our team has been trusted to work alongside Cloud Native and DevOps teams at some of the most exciting companies
