RHIPE: Using Hadoop to analyze very large data with R

Hadoop is an Apache project that provides an environment for managing very large amounts of data. Such systems are typically used to handle the volumes of information generated by large social networks, especially when relational databases begin to run into scalability problems or the cost of growth becomes too high.

Another good application of these Big Data management systems is supporting the analysis of very large datasets. For this reason, the Pentaho BI suite integrates with Hadoop and allows it to be used as a data source.
Another good application of Hadoop is found in the RHIPE project, an interface between Hadoop and R, the open-source tool for statistical computing, which lets you use Hadoop to efficiently run statistical analyses with R over very large datasets.
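RHIPE exposes Hadoop's MapReduce model to the analyst: you write a map step and a reduce step, and the framework distributes the work across the cluster. As a purely conceptual sketch of that model (written in Python rather than R, and not RHIPE's actual API), a word count might look like:

```python
from collections import defaultdict

def map_phase(records):
    # Map step: emit (word, 1) pairs for every word in every record,
    # analogous to the map expression an analyst would write in R.
    for record in records:
        for word in record.split():
            yield (word, 1)

def reduce_phase(pairs):
    # Reduce step: group intermediate pairs by key and combine values,
    # analogous to the reduce expression. Hadoop performs the grouping
    # (the "shuffle") between the two phases.
    grouped = defaultdict(int)
    for key, value in pairs:
        grouped[key] += value
    return dict(grouped)

records = ["big data with R", "R and Hadoop", "big data"]
counts = reduce_phase(map_phase(records))
print(counts["big"], counts["R"])  # 2 2
```

In a real RHIPE job the map and reduce steps are R expressions executed on the Hadoop nodes, so the same statistical code that runs on a laptop can be applied to data far larger than a single machine's memory.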
The following video, from the article RHIPE: An Interface Between Hadoop and R for Large and Complex Data Analysis, presents this interesting interface:

RHIPE: An Interface Between Hadoop and R
Presented by Saptarshi Guha

Video Link

 
