Apache Hadoop* Community Spotlight:
The Apache Hadoop* Distributed File System


Konstantin Shvachko, a Project Management Committee member for the Apache Hadoop* framework and founder of AltoScale, demystifies the Apache* Hadoop* Distributed File System (HDFS*) and talks about where its development is headed. HDFS is the primary distributed storage component used by applications under the Apache Hadoop open-source project. In this overview, an expert from the Apache Hadoop community explains the four design principles that drive development, how HDFS works, and why it is so well suited to handling large unstructured data sets. Part of the Intel® IT Center’s Hadoop Community Spotlight series. Also listen to the podcast of the interview.