Cloud Computing and Big Data Is There a Relation Between the Two: a Study


International Journal of Applied Engineering Research ISSN 0973-4562 Volume 12, Number 17 (2017) pp. 6970-6982 © Research India Publications. http://www.ripublication.com

Nabeel Zanoon (1), Abdullah Al-Haj (2), Sufian M Khwaldeh (3)
1 Department of Applied Science, Al-Balqa Applied University/Aqaba, Jordan.
2 Faculty of Information Technology, University of Jordan/Aqaba, Jordan.
3 Business Information Technology (BIT) Department, The University of Jordan/Aqaba, Jordan.
1 Orcid ID: 0000-0003-0581-206X

Abstract

Communicating by using information technology in various ways produces big amounts of data. Such data requires processing and storage. The cloud is an online storage model where data is stored on multiple virtual servers. Big data processing represents a new challenge in computing, especially in cloud computing. Data processing involves data acquisition, storage and analysis. In this respect, many questions arise, including: what is the relationship between big data and cloud computing, and how is big data processed in cloud computing? These questions are discussed in this paper, where big data and cloud computing are studied, together with the relationship between them in terms of security and challenges. We suggest a new term for big data, and a model that illustrates the relationship between big data and cloud computing.

Keywords: big data, Hadoop, Cloud, MapReduce, resources, five Vs.

INTRODUCTION

Data is the raw material of information before sorting, arranging and processing; it cannot be used in its primary form prior to processing. Information represents data after processing and analysis [1]. As technology has developed and spread into all aspects of life, the demand for storing and processing more data has increased. As a result, several systems that support big data have been developed, including cloud computing. While big data is responsible for data storage and processing, the cloud provides a reliable, accessible, and scalable environment for big data systems to function [2].

Big data is defined as the quantity of digital data produced from different sources of technology, for example sensors, digitizers, scanners, numerical modeling, mobile phones, the Internet, videos, e-mails and social networks. The data types include texts, geometries, images, videos, sounds and combinations of each. Such data can be directly or indirectly related to geospatial information [3].

Cloud computing refers to on-demand computer resources and systems available across the network that can provide a number of integrated computing services, without local resources, to facilitate user access. These resources include data storage capacity, backup and self-synchronization [4]. Most IT infrastructure computing consists of services that are provided and delivered through public centers and the servers based on them. Here, clouds appear as single access points for the computing needs of the consumer. Commercial offerings are generally expected to meet the QoS requirements of customers or consumers, and typically include service level agreements (SLAs) [5]. The cloud is an online storage model where data are stored on multiple virtual servers, rather than being hosted on a specific server, and are usually provided by a third party. Hosting companies, which operate advanced data centers, rent cloud storage space to their customers in line with their needs [6].

The expert Erik Brynjolfsson likened big data to the microscope, by which scientists in earlier times were able to identify and measure things at the cell level that they had never imagined before. Big data is similarly a modern-day microscope, by which you can see and measure things in data that you never expected [7]. The statistics shown in [8] indicate that data growth in cloud environments is increasing exponentially and rapidly with the growing number of Internet users around the world. With this rapid growth, the question that comes to mind is: how can these vast amounts of data be stored in cloud environments? We need storage technology that meets the needs of rapid data growth on the cloud, with low cost, high reliability and high capacity.

The relationship between big data and cloud computing is based on integration: the cloud represents the storehouse, and big data represents the product that will be stored in it, since it is not possible to create storehouses without storing any product in them. Traditional databases, known as 'relational', are no longer sufficient to process multiple-source data; for example, how can these traditional methods deal with data such as transaction records, customer behavior, mobile phone and GPS navigation data, and others? Here comes the role of cloud computing, and at this point a relationship between big data and the cloud arises. In this paper, this relationship is discussed, together with the obstacles and challenges it may encounter.

BIG DATA

Big data is composed through electronic operations from multiple sources. It requires proper processing power and high capabilities for analysis [9]. The importance of big data lies in its analytical use, which can help generate informed decisions to provide better and faster services [10]. The term big data is applied to huge amounts of high-speed data of different types, which cannot be processed and stored on regular computers. The main characteristics of big data, called the five V's as in Figure 1, can be summed up in the fact that the issue is not only about the volume of data; the dimensions of big data, known as the 'five Vs', are as follows:

1. Volume: the amount of data produced from multiple sources, which reaches the scale of zettabytes. Volume is the most evident dimension of big data.
2. Variety: the types of data. With the increasing number of Internet users, smartphone users and social network users everywhere, the familiar form of data has changed from structured data in databases to unstructured data that includes a large number of formats such as images, audio and video clips, SMS, and GPS data [11].
3. Velocity: the frequency with which data arrives from different sources, that is, the speed of data production, for example on Twitter and Facebook. The huge increase in data volume and frequency dictates the need for systems that ensure very fast data analysis.
4. Veracity: the quality of the data, i.e. the accuracy of the data and the confidence in its content. The quality of captured data can vary greatly, which affects the accuracy of analysis. Although there is wide agreement on the potential value of big data, the data is almost worthless if it is not accurate [12].
5. Value: the value of big data, i.e. the importance of the data after analysis, given that data on its own is almost worthless. The value lies in careful analysis of accurate data and in the information and ideas it provides. Value is the final stage, which comes after processing volume, velocity, variety, veracity and visualization [13].

Figure 1: Characteristics of big data.

There have been numerous revisions of the big data characteristics, which have reached as many as seven V's [14]. In this paper, based on the relationship between cloud computing and big data, we suggest a new term, virtualization, which represents the data structure virtually by default. The virtualization of big data is a process that focuses on creating virtual structures for big data systems. Virtualization is the key technology used to help cloud computing handle large amounts of data flexibly and to facilitate the management of big data. Virtual storage technology is studied in section 6.2.

The type and nature of the data

Data in general is a set of values in the form of numbers, letters, symbols and other forms that concern a particular idea or subject. Data does not make sense without analysis and is therefore compiled for use. Data represents input, while information is output after processing: data is entered into the system first, then processed until it comes out in the form of useful information that has a clear meaning and against which decisions are made.

Big data comes from multiple sources, including sensors, free text such as social media, unstructured data, metadata and other geospatial data collected from web logs, GPS, medical devices, etc. [15]. Because big data is gathered from different sources, it takes several forms, including:

1. Structured data: data organized in the form of tables or databases so that it can be processed.
2. Unstructured data: the biggest proportion of data; the data that people generate daily as texts, images, videos, messages, log records, click-streams, etc.
3. Semi-structured data (or multi-structured data): it is regarded as a kind of structured data; with the evolution of multitasking technology tools, the data has
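The three forms above can be illustrated with a short Python sketch. The records, field names and sample text are invented for illustration only: structured data fits a fixed schema, semi-structured data carries its own self-describing tags, and unstructured text yields information only after processing (here, a trivial word count in the spirit of the MapReduce model named in the keywords).

```python
import csv
import io
import json

# Structured: fixed schema, rows in a table (e.g. a relational database or CSV).
structured = io.StringIO("id,name,age\n1,Alice,30\n2,Bob,25\n")
rows = list(csv.DictReader(structured))

# Semi-structured: self-describing records whose fields may vary (e.g. JSON).
semi_structured = json.loads('{"user": "alice", "tags": ["gps", "photo"]}')

# Unstructured: free text; structure is imposed at analysis time,
# turning raw data (input) into information (output).
text = "big data needs big storage and big processing"
counts = {}
for word in text.split():
    counts[word] = counts.get(word, 0) + 1

print(rows[0]["name"])          # access via a fixed schema
print(semi_structured["tags"])  # access via self-describing keys
print(counts["big"])            # information derived from raw text
```

The same distinction drives the storage argument in this paper: the first form suits traditional relational databases, while the latter two are what push processing toward cloud platforms.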