Hadoop Session

Revision as of 09:11, 25 August 2013

Hadoop: Quickstart

Requirements:

You need a notebook with VMware Player or VMware Fusion (if you work on a Mac). In case this does not work for you, we will have a local Hadoop cluster available. The development environment is based on the Oracle JDK (version 1.6), Eclipse, Git and Maven.
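A quick way to verify the command-line tools before the session is a small shell check. This is only a sketch: it just looks for the tool names on your PATH and does not check versions.

```shell
# Minimal sketch: verify that the tools needed for the session are installed.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found ($(command -v "$1"))"
  else
    echo "$1: MISSING - please install it before the session"
  fi
}

check_tool java   # Oracle JDK, version 1.6
check_tool git
check_tool mvn    # Maven
```

For the JDK version specifically, `java -version` should report something in the 1.6 line.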


Please download the following two VM images:

Quickstart VM http://www.cloudera.com/content/support/en/downloads/download-components/download-products.html?productID=F6mO278Rvo

Cloud-Connector VM http://training.cloudera.com/cloudera/VMs/Cloudera-Training-Get2EC2-VM-1.0-vmware.zip

Do you have a connection to the internet from inside this VM? It should work out of the box, but sometimes it does not ;-(
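To answer that question quickly, a probe like the following can help. This is a sketch under assumptions: the host is just an example (any reliable site works) and curl must be available inside the VM.

```shell
# Hypothetical connectivity probe: succeeds if the host answers an HTTP
# HEAD request within 5 seconds.
net_ok() {
  curl -s -I --max-time 5 "http://$1/" >/dev/null 2>&1
}

if net_ok www.cloudera.com; then
  echo "internet connection OK"
else
  echo "no connectivity - check the VM network settings (e.g. try NAT mode)"
fi
```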

So please prepare your VM image on your computer before the session starts. Everything related to the cloud setup will be prepared for you.

If you have any trouble, please send your feedback or help request to mirko@cloudera.com. Thanks, and see you soon.


Objectives

1. Build your own Hadoop cluster in the Amazon Cloud.

2. Collect Twitter data with Flume.

3. Create, deploy and execute a MapReduce Program for the Java-API (MRv1).

4. Learn how to work with your data using the Hadoop-GUI HUE.
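For objective 2, a Flume agent is typically described in a properties file with a source, a channel and a sink. The fragment below is only a hedged sketch along the lines of Cloudera's well-known Twitter example: the agent name, the TwitterSource class, the HDFS path and the keywords are assumptions, and the OAuth values are placeholders for your own credentials.

```properties
# Sketch of a Flume agent (names assumed): Twitter source -> memory channel -> HDFS sink
TwitterAgent.sources  = Twitter
TwitterAgent.channels = MemChannel
TwitterAgent.sinks    = HDFS

# Source: the TwitterSource class from Cloudera's Twitter example (assumption)
TwitterAgent.sources.Twitter.type = com.cloudera.flume.source.TwitterSource
TwitterAgent.sources.Twitter.channels = MemChannel
TwitterAgent.sources.Twitter.consumerKey = <your consumer key>
TwitterAgent.sources.Twitter.consumerSecret = <your consumer secret>
TwitterAgent.sources.Twitter.accessToken = <your access token>
TwitterAgent.sources.Twitter.accessTokenSecret = <your access token secret>
TwitterAgent.sources.Twitter.keywords = hadoop, flume

# Channel: in-memory buffering between source and sink
TwitterAgent.channels.MemChannel.type = memory
TwitterAgent.channels.MemChannel.capacity = 10000

# Sink: write the collected events to HDFS (path is an example)
TwitterAgent.sinks.HDFS.type = hdfs
TwitterAgent.sinks.HDFS.channel = MemChannel
TwitterAgent.sinks.HDFS.hdfs.path = /user/flume/tweets/
TwitterAgent.sinks.HDFS.hdfs.fileType = DataStream
```

The agent would then be started with Flume's `flume-ng agent` command, pointing it at this file and at the agent name used in the properties.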



Resources

... more content coming soon!