HPC REVIEW #2, GLOBAL EDITION
The reference media in high-performance IT solutions | www.hpcreview.com
HPC | BIG DATA | CLOUD | STORAGE | VISUALIZATION | VIRTUALIZATION | NETWORKS | WORKSTATIONS

FIRST REVIEW: NVIDIA'S FLAGSHIP QUADRO M6000
EXCLUSIVE INTERVIEW: Ralph Schmitt, CEO, OCZ
LAB REVIEW: Dell PowerEdge FX2 | Infortrend EonStor 3000 DS | Dell Precision T5810 | Toshiba W50
STORAGE VIRTUALIZATION: A whole new world of growth opportunities
HOW TO: Create your Raspberry Pi supercomputing cluster

EDITOR'S NOTE

Welcome! This issue focuses on one of today's hottest topics: storage virtualization. Being able to get the most out of an existing storage infrastructure, and to extend it while gaining capacity and features, has become key within tight budget constraints. And unifying the storage pools and islands that already exist within the enterprise is only possible through virtualization. If you have a storage problem, chances are good you will find the solution in this issue!

This issue is also packed with product reviews, starting with NVIDIA's new flagship, the Quadro M6000, outstanding or not depending on the use case: our benchmarks show it certainly shines on the performance side, but at a price. Our reviews also cover infrastructure elements such as Dell's modular and scalable PowerEdge FX2, powerful enough to make one rethink an enterprise equipment strategy, not to mention Infortrend's innovative and performance-oriented storage arrays.

As always, we welcome your suggestions and comments at [email protected]. Happy reading!

CONTENTS

COVER STORY: Storage virtualization, a whole new world of growth opportunities
NEWSFEED: Intel celebrates 50 years of Moore's Law | The first 3D-printed car
LAB REVIEW: NVIDIA Quadro M6000 | Toshiba W50 | Dell FX2 | Infortrend EonStor 3000 DS | Dell Precision Tower 5810 | Qlik Sense Desktop | HGST Active Archive
HOW TO: A Raspberry Pi-based computing cluster
VIEWPOINT: The best benchmark is you
INTERVIEW: Ralph Schmitt, CEO, OCZ
TECH ZONE: CD-adapco, 35 years of innovation | Books | MOOCs | The HPC observatory | HPC Labs: how we test
Editor's note | Opinion | Next issue

OPINION

When Data Needs More Firepower: The HPC, Analytics Convergence
By Steve Conway, IDC Research Vice President, HPC

Big data arguably originated in the global high-performance computing (HPC) community in the 1950s for government applications such as cryptography, weather forecasting, and space exploration. High-performance data analysis (HPDA), that is, big data using HPC technology, moved into the private sector in the 1980s, especially for data-intensive modeling and simulation to develop physical products such as cars and airplanes. In the late 1980s, the financial services industry (FSI) became the first commercial market to use HPC technology for advanced data analytics (as opposed to modeling and simulation). Investment banks began to use HPC systems for daunting analytics tasks such as optimizing portfolios of mortgage-backed securities, pricing exotic financial instruments, and managing firm-wide risk. More recently, high-frequency trading joined the list of HPC-enabled FSI applications.

The invention of the cluster by two NASA HPC experts in 1994 made HPC technology far more affordable and helped propel HPC market growth from about $2 billion in 1990 to more than $20 billion in 2013. More than 100,000 HPC systems are now sold each year at starting prices below $50,000, and many of them head into the private sector.

WHAT'S NEW?
It's widely known that industrial firms of all sizes have adopted HPC to speed the development of products ranging from cars and planes to golf clubs and potato chips. But lately, something new is happening. Leading commercial companies in a variety of market segments are turning to HPC-born parallel and distributed computing technologies (clusters, grids, and clouds) for challenging big data analytics workloads that enterprise IT technology alone cannot handle effectively. IDC estimates that the move to HPC has already saved PayPal more than $700 million and is saving tens of millions of dollars per year for some others.

The commercial trend isn't totally surprising when you realize that some of the key technologies underpinning business analytics (BA) originated in the world of HPC. The evolution of these HPC-born technologies for business analytics has taken two major leaps and is in the midst of a third. The advances have followed this sequence (a sketch of the workload pattern common to all three follows the list):

• Phase 1 was the advance from the mainframe mentality of running single applications on traditional SMP servers to modern clusters, i.e., systems that lash together homogeneous Linux or Windows blades to exploit the attractive economics of commodity hardware. The cluster was invented by two NASA HPC experts in 1994.
• Phase 2 was the move to grids, with the goal of supporting multiple applications across business units coherently. This enables enterprise-wide management of the applications and workloads.
• Phase 3 is the emerging move to cloud computing, which focuses on delivering generic computing resources to applications and business units on an on-demand, pay-as-you-go basis. Clouds can be hosted within a company, by an external provider, or as a hybrid combination of both.
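Whether the hardware is a cluster, a grid, or a cloud, the workload pattern behind all three phases is the same: scatter the data across workers, compute in parallel, gather the results. Below is a minimal single-machine sketch of that pattern in Python. The fraud-screening rule, the threshold, and names such as score_partition are invented for illustration; on a real cluster the process pool would be replaced by workers on separate commodity nodes (for example, MPI ranks or Spark executors).

# A minimal scatter/gather sketch (illustrative only). A process pool
# stands in for the separate commodity nodes of a real cluster.
from concurrent.futures import ProcessPoolExecutor

def score_partition(transactions):
    # Worker step: screen one shard of the data (toy rule, not a real model).
    return [t for t in transactions if t["amount"] > 10_000]

def scatter(data, n_workers):
    # Cut the dataset into roughly equal shards, one per worker.
    size = max(1, len(data) // n_workers)
    return [data[i:i + size] for i in range(0, len(data), size)]

if __name__ == "__main__":
    transactions = [{"id": i, "amount": (i * 7919) % 20_000}
                    for i in range(100_000)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        shard_results = list(pool.map(score_partition, scatter(transactions, 4)))
    flagged = [t for shard in shard_results for t in shard]  # gather
    print(f"{len(flagged)} of {len(transactions)} transactions flagged")

The economics Phase 1 describes follow from this shape: every worker runs the same code on commodity hardware, so scaling up means adding shards and nodes rather than buying a bigger SMP machine.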
WHY BUSINESSES TURN TO HPC FOR ADVANCED DATA ANALYTICS
High-performance data analysis is the term IDC coined to describe the formative market for big data workloads that exploit HPC resources. The HPDA market represents the convergence of the long-standing, data-intensive modeling and simulation (M&S) methods in the HPC industry/application segments that IDC has tracked for more than 25 years, and the newer high-performance analytics methods that are increasingly employed in these segments as well as by commercial organizations adopting HPC for the first time. HPDA may employ long-standing numerical M&S methods, newer methods such as large-scale graph analytics, semantic technologies, and knowledge discovery algorithms, or some combination of the two.

The factors driving businesses to adopt HPC for big data analytics (i.e., HPDA) fall into a few main categories:

• High complexity. HPC technology allows companies to aim more complex, intelligent questions at their data infrastructures. This ability can provide important advantages in today's increasingly competitive markets. HPC technology is especially useful when there is a need to go beyond query-driven searches in order to discover unknown patterns and relationships in data: for fraud detection, to reveal hidden commonalities within millions of archived medical records, or to track buying behaviors through wide networks of relatives and acquaintances. IDC believes that HPC technology will play a crucial role in the transition from today's static searches to the emerging era of higher-value, dynamic pattern discovery.
• High time criticality. Information that is not available quickly enough may be of little value. The weather report for tomorrow is useless if it's unavailable until the day after tomorrow. At PayPal, enterprise technology was unable to detect fraudulent transactions until after the charges had hit consumers' credit cards; the move to high-performance data analysis using HPC technology corrected this problem. For financial services companies engaged in high-frequency trading, HPC technology enables proprietary algorithms to exploit market movements in minute fractions of a second, before the opportunities disappear.
• High variability. People generally assume that big data is "deep," meaning that it involves large amounts of data. They recognize less often that it may also be "wide," meaning that it can include many variables. Think of "deep" as corresponding to lots of spreadsheet rows and "wide" as referring to lots of columns (although a growing number of high-performance data analysis problems don't fit neatly into traditional row-and-column spreadsheets). A "deep" query might request a prioritized listing of last quarter's 500 top customers in Europe. A "wide" query might go on to analyze their buying preferences and behaviors in relation to dozens of criteria. An even "wider" analysis might employ graph analytics to identify any fraudulent behavior within the customer base (see the sketch below).
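To make the deep/wide distinction concrete, here is a minimal sketch in Python with pandas, under invented assumptions: the customer table, its column names, and the revenue figures are illustrative stand-ins, not data from the article.

# Illustrative only: "deep" vs. "wide" queries on a toy customer table.
import pandas as pd

n = 10_000
customers = pd.DataFrame({
    "customer":   [f"C{i}" for i in range(n)],
    "region":     ["Europe" if i % 3 else "Americas" for i in range(n)],
    "q_revenue":  [(i * 37) % 5_000 for i in range(n)],  # last quarter's revenue
    "channel":    ["web" if i % 2 else "store" for i in range(n)],
    "avg_basket": [(i * 13) % 300 for i in range(n)],
})

# "Deep" query: many rows, few columns. The article's example: a prioritized
# listing of last quarter's 500 top customers in Europe.
top500 = (customers[customers["region"] == "Europe"]
          .nlargest(500, "q_revenue")[["customer", "q_revenue"]])

# "Wide" query: the same customers, but across many more variables
# (dozens of criteria in practice; two suffice to show the shape).
profile = (customers.merge(top500[["customer"]], on="customer")
           .groupby("channel")[["q_revenue", "avg_basket"]].mean())
print(profile)

The article's "even wider" step, graph analytics for fraud detection, would typically leave the row-and-column world entirely and relate customers through shared addresses, devices, or acquaintances, which is exactly the kind of workload that pushes analytics onto HPC-class systems.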
NEWSFEED

50 YEARS OF MOORE'S LAW
Fifty years ago, Intel co-founder Gordon Moore predicted that the power of microprocessors would double every 24 months. Five decades later, the prediction still holds: a transistor today costs roughly a billion times less than it did 50 years ago. Moore himself has offered a comical consequence, noting that the number of transistors produced has now exceeded the number of ants on Earth!

« The number of transistors in a processor will double approximately every 24 months. »
— Gordon Moore, Intel co-founder

Through sustained investment in semiconductor manufacturing technology, Intel has made Moore's Law a reality. Relying on his own observations and calculations, Gordon Moore had no idea that his law would last this long.

1969: AN INNOVATIVE RESPONSE TO AN UNUSUAL DEMAND
In 1969, the Japanese company Nippon Calculating Machine Corporation approached Intel to design a set of 12 chips for its new Busicom 141-PF calculator. Intel's engineers responded by suggesting a family of only four chips, one of which would be programmable. This set of four chips, known as the MCS-4, included the 4004 central processing unit (CPU), a read-only memory (ROM) for custom application programs, a random-access memory (RAM) chip for data processing, and an input/output (I/O) circuit.
Photo: Paul Sakuma (AP)

1971: THE BEGINNING OF THE ERA OF INTEGRATED ELECTRONICS
With its 2,300 transistors, the Intel 4004 processor could be programmed for use in a variety of products; in this it differed from its predecessors and peers. After buying back the rights from Nippon Calculating Machine Corporation, Intel launched the 4004 on the open market.
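As a back-of-the-envelope check on the doubling claim, the following sketch compounds the 24-month rule forward from the 4004's 2,300 transistors. The 1971 baseline and the doubling period come from the article; the comparison chip at the end is our own addition.

# Compound Moore's Law forward from the Intel 4004 (2,300 transistors, 1971).
BASE_YEAR, BASE_TRANSISTORS = 1971, 2_300
DOUBLING_PERIOD_YEARS = 2  # "every 24 months"

def projected_transistors(year):
    # Transistor count the 24-month doubling rule predicts for a given year.
    doublings = (year - BASE_YEAR) / DOUBLING_PERIOD_YEARS
    return round(BASE_TRANSISTORS * 2 ** doublings)

for year in (1971, 1981, 1991, 2001, 2011, 2015):
    print(f"{year}: ~{projected_transistors(year):,} transistors")

For 2015 the rule predicts roughly 9 to 10 billion transistors, which is the right order of magnitude: the GM200 GPU inside the Quadro M6000 reviewed in this issue carries about 8 billion.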