What Is Hadoop?

The Apache Hadoop project develops open-source software for reliable, scalable, distributed computing. Hadoop includes these subprojects:

    * Hadoop Common: The common utilities that support the other Hadoop subprojects.
    * HDFS: A distributed file system that provides high throughput access to application data.
    * MapReduce: A software framework for distributed processing of large data sets on compute clusters (a minimal job sketch follows this list).

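The MapReduce model is easiest to see in code. What follows is a minimal sketch of the canonical word-count job, written against Hadoop's org.apache.hadoop.mapreduce Java API (Hadoop 2.x style): the mapper emits a (word, 1) pair for every token in its input split, and the reducer sums those counts per word. The input and output paths are hypothetical command-line arguments.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

      // Mapper: emits (word, 1) for every token in the input split.
      public static class TokenizerMapper
          extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
          StringTokenizer itr = new StringTokenizer(value.toString());
          while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, ONE);
          }
        }
      }

      // Reducer: sums the counts emitted for each word across all mappers.
      public static class IntSumReducer
          extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable val : values) {
            sum += val.get();
          }
          result.set(sum);
          context.write(key, result);
        }
      }

      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // pre-aggregate locally before the shuffle
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // hypothetical input dir
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // must not already exist
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }

Packaged into a jar, a job like this is typically launched with the hadoop CLI, e.g. hadoop jar wordcount.jar WordCount /input /output, where both paths would normally live on HDFS.
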
Other Hadoop-related projects at Apache include:

    * Avro: A data serialization system.
    * Chukwa: A data collection system for managing large distributed systems.
    * HBase: A scalable, distributed database that supports structured data storage for large tables.
    * Hive: A data warehouse infrastructure that provides data summarization and ad hoc querying.
    * Mahout: A scalable machine learning and data mining library.
    * Pig: A high-level data-flow language and execution framework for parallel computation.
    * ZooKeeper: A high-performance coordination service for distributed applications (a minimal client sketch follows this list).

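To give a flavor of how these projects are used from application code, here is a minimal sketch of a ZooKeeper client session using ZooKeeper's Java API. It connects to a hypothetical local ensemble at localhost:2181, creates an ephemeral znode (the building block ZooKeeper provides for locks and leader election), and reads it back; the path and data are illustrative only, and the lambda watcher assumes Java 8+.

    import java.nio.charset.StandardCharsets;
    import java.util.concurrent.CountDownLatch;

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.Watcher;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    public class ZkExample {
      public static void main(String[] args) throws Exception {
        CountDownLatch connected = new CountDownLatch(1);

        // Connect to a (hypothetical) local ensemble; the watcher is
        // notified of session events such as the initial connection.
        ZooKeeper zk = new ZooKeeper("localhost:2181", 10_000, event -> {
          if (event.getState() == Watcher.Event.KeeperState.SyncConnected) {
            connected.countDown();
          }
        });
        connected.await();

        // Create an ephemeral znode: it is deleted automatically when this
        // client's session ends, which is why ephemeral nodes underpin
        // distributed locks and leader election.
        String path = zk.create("/demo-lock",
            "owner-1".getBytes(StandardCharsets.UTF_8),
            ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.EPHEMERAL);

        // Read the data back (no watch, no Stat needed for this sketch).
        byte[] data = zk.getData(path, false, null);
        System.out.println(path + " -> " + new String(data, StandardCharsets.UTF_8));

        zk.close();
      }
    }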