Tsinghua University's Institute for Interdisciplinary Information Sciences set out to build a world-class interdisciplinary information research center and talent-fostering base to advance theoretical computer science and quantum information science and to cultivate creative top talent with international competitiveness. Since its founding, the institute has planned to build a Hadoop big data research platform to keep pace with cutting-edge international technologies.
Big data refers to technologies for capturing, mining, analyzing, and organizing huge volumes of data that cannot be processed by traditional applications and tools within a reasonable time, ultimately making the data useful for enterprises' business decisions. In big data, what matters is not amassing massive data but mining its significance. In other words, if big data is compared to an industry, the key to making this industry profitable is improving data processing capability so as to increase data value.
Technically, big data must be processed by a distributed computing system rather than a standalone computer: massive data is mined by a distributed processing system that combines a distributed database with virtualization technologies.
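The distributed pattern that Hadoop implements at scale can be sketched in a few lines: the data is split into chunks, a map function processes each chunk independently (across many servers in a real cluster), and a reduce function aggregates the intermediate results. The single-process word-count sketch below is purely illustrative, not the institute's actual workload:

```python
from collections import defaultdict

def map_phase(chunk):
    # Emit a (word, 1) pair for every word in this chunk.
    # In Hadoop, each chunk would be handled by a separate mapper task.
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    # Group intermediate values by key, as the Hadoop shuffle stage does.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values; here, a simple sum of the counts.
    return {key: sum(values) for key, values in groups.items()}

# Hypothetical input split into two chunks (two "nodes" of work).
chunks = ["big data big", "data value"]
pairs = [pair for chunk in chunks for pair in map_phase(chunk)]
counts = reduce_phase(shuffle(pairs))
# counts == {"big": 2, "data": 2, "value": 1}
```

Because each map call touches only its own chunk, the work parallelizes naturally; this independence is what lets a Hadoop cluster spread massive data across many servers.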
The institute planned to build the largest and most advanced big data research platform in China based on the open-source Hadoop project. This platform requires numerous servers, storage devices, and network devices to work together. As the computing core of the big data research platform, servers must offer high computing performance and I/O performance, large storage capacity, and low power consumption.
After an in-depth analysis of the customer's requirements, Huawei provided a big data platform solution based on Huawei RH2288 V2 servers for the institute to test. The institute's test results show that Huawei's solution offers high computing and I/O performance as well as large storage capacity, meeting the requirements of the big data research platform.
The Huawei RH2288 V2 server uses the latest Intel Xeon E5-2600 series CPUs, supports up to 24 RDIMMs for a maximum of 768 GB of memory, provides up to 26 hard disks, and uses a dual-port 10GE NIC, meeting the platform's requirements.
Beyond providing high-performance hardware for the big data research platform, Huawei designed a distributed equipment room solution, planning device deployment and air conditioning installation around the space and power supply conditions of the old buildings. This enabled distributed deployment with centralized device management, cutting initial deployment costs. Huawei servers operate stably at high temperatures, which reduces the cooling costs of the air conditioning system and saves operating expenses by over 40%.
The Huawei RH2288 V2 and the customized equipment room solution helped the institute rapidly build its Hadoop big data research platform. The RH2288 V2 provides industry-leading storage capacity and throughput bandwidth, improving performance by 30% over competing products.
Huawei's solution enables distributed deployment, unified management, and simplified O&M, which helps slash construction and O&M costs by 40%.