You should have some programming background and be familiar with a Unix-based operating system, as well as with Windows and Linux. You should also understand how to work with databases and database systems. The course assumes basic comfort with these environments; for example, it might say simply “Open PuTTY” without explaining how to do that, and it expects you to be able to navigate the folder hierarchy using Windows Explorer. No prior experience with the Java programming language or Hadoop is required.
Take a deeper look into Hadoop—what goes into it and what it does. This path breaks down Hadoop and Big Data so you can understand where they fit into your organization and what they can do for you. After traveling back in time to understand the history of Hadoop, fast forward to where Hadoop is today and take a look at some of its major components and its architecture. Examine some of the pieces that make up Hadoop and become familiar with their functionality. Explore how your organization can incorporate Hadoop into its existing IT framework as well as how it can advance your organization’s competitive edge.
It is common for users to take all the courses in this path in the order shown below.
15+ hours covering 53 topics
| Course | Topics | Duration | Level |
| --- | --- | --- | --- |
| Hadoop: Introduction | 4 | 2h 1m | Intermediate |
| Hadoop, Part 1: Introduction and HDFS | 7 | 1h 42m | Intermediate |
| Hadoop, Part 2: ETL and MapReduce | 6 | 1h 40m | Intermediate |
| Hadoop, Part 3: YARN and NiFi | 9 | 2h 12m | Intermediate |
| Hadoop, Part 4: HBase and MapReduce | 8 | 2h 19m | Intermediate |

Introduction to R

| Course | Topics | Duration | Level |
| --- | --- | --- | --- |
| Introduction to R, Part 1: Workspaces and Types | 8 | 1h 55m | Beginner |
| Introduction to R, Part 2: Advanced Types and Operators | 6 | 1h 40m | Beginner |
| Introduction to R, Part 3: Working with Data | 5 | 1h 40m | Beginner |