International Journal of Advanced Research in Computer Science and Software Engineering
Research Article, August 2017, ISSN: 2277-128X (Volume 7, Issue 8)

Data Integrity Techniques in Cloud Computing: An Analysis

Neha Thakur, Research Scholar, Himachal Pradesh University, Shimla, Himachal Pradesh, India
Aman Kumar Sharma, Professor, Himachal Pradesh University, Shimla, Himachal Pradesh, India

DOI: 10.23956/ijarcsse/V7I8/0141

Abstract: Cloud computing has been envisioned as a promising solution to the rising storage costs of IT enterprises, and there are many cloud computing initiatives from IT giants such as Google, Amazon, Microsoft and IBM. Integrity monitoring is essential in cloud storage for the same reasons that data integrity is critical for any data centre. Data integrity is defined as the accuracy and consistency of stored data, i.e. the absence of any alteration to the data between two updates of a file or record. To ensure the integrity and availability of data in the cloud and to enforce the quality of cloud storage services, efficient methods that enable on-demand data-correctness verification on behalf of cloud users have to be designed. To address the data integrity problem, many techniques have been proposed under different systems and security models. This paper examines some of these integrity-proving techniques in detail, along with their advantages and disadvantages.

Keywords: PDP, MAC, CSP, POR

I. INTRODUCTION
Cloud computing has been envisioned as a promising solution to the rising storage costs of IT enterprises and has been acknowledged as one of the prevailing models for providing IT capacities. Clouds have emerged as a computing infrastructure that enables rapid delivery of computing resources as a utility in a dynamically scalable, virtualized manner. There are many cloud computing initiatives from IT giants such as Google, Amazon, Microsoft and IBM [1].

Data outsourcing [2] to cloud storage servers is a rising trend among many firms and users owing to its economic advantages. In essence, the owner (client) of the data moves its data to a third-party cloud storage server, which is expected, presumably for a fee, to faithfully store the data and provide it back to the owner whenever required. As data generation far outpaces data storage capacity, it proves costly for small firms to upgrade their hardware whenever additional data is created; maintaining the storage can also be a difficult task. Outsourcing data to cloud storage helps such firms by reducing the costs of storage, maintenance and personnel. It can also provide reliable storage of important data by keeping multiple copies, thereby reducing the chance of losing data to hardware failures. Despite these advantages, storing user data in the cloud raises many security concerns, such as data authentication and integrity, which need to be investigated extensively before the cloud can be a reliable alternative to local storage.

II. DATA INTEGRITY
In terms of a database, data integrity [3] refers to the process of ensuring that the database remains an accurate reflection of the universe of discourse it is modeling or representing; in other words, there is a close correspondence between the facts stored in the database and the real world it models.
Integrity, in terms of data security, is the guarantee that data can be accessed or modified only by those authorized to do so; in simple words, it is the process of verifying data. Data integrity is important among the other cloud challenges, as it guarantees that data is of high quality, correct and unmodified. After storing data in the cloud, users depend on the cloud to provide reliable services and hope that their data and applications are secure. That hope may fail: the user's data may be altered or deleted. At times, cloud service providers may be dishonest; they may discard data that has not been accessed, or is rarely accessed, to save storage space, or keep fewer replicas than promised [4]. Moreover, cloud service providers may choose to hide data loss and claim that the data is still correctly stored in the cloud. As a result, data owners need to be convinced that their data is correctly stored in the cloud. One of the biggest concerns with cloud data storage is therefore data integrity verification at untrusted servers. To solve the problem of data integrity checking, many researchers have proposed different systems and security models.

2.1 Data Integrity Issues
A. Data Loss or Manipulation
Users have huge numbers of files, so cloud providers offer storage as a service. These files may be accessed every day or only rarely, yet there is a strong need to keep them correct. This need arises from the nature of cloud computing: the data is outsourced to a remote cloud, which is unsecured and unreliable. Since the cloud is untrustworthy, the data might be lost or modified by unauthorized users; in many cases, data could be altered intentionally or accidentally. There are also administrative errors that can cause data loss, such as taking or restoring incorrect backups. An attacker could exploit the users' outsourced data, since the users have lost control over it.

B. Untrusted Remote Server Performing Computation
Cloud computing is not just about storage; there are also intensive computations that need cloud processing power, so users outsource their computations as well. Since the cloud provider is outside the security boundary and is not transparent to the owner, no one can prove whether computation integrity is intact. The cloud provider may behave in such a way that no one discovers a deviation of the computation from normal execution: because resources have a cost to the provider, it might not execute the task properly. Even if the cloud provider is considered secure, many issues remain, such as those arising from the provider's underlying systems, vulnerable code or misconfiguration.
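To make the storage-verification problem concrete before surveying dedicated schemes, consider the naive baseline a data owner might use: keep a cryptographic hash locally and later re-download the entire file to check it. The following minimal Python sketch illustrates this; the file name and chunk size are illustrative assumptions.

```python
import hashlib

def file_digest(path: str) -> str:
    """SHA-256 digest of a file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Owner: record the digest locally before outsourcing the file.
stored_digest = file_digest("outsourced.dat")

# Verifier: later, download the ENTIRE file back from the cloud and compare.
def naive_verify(downloaded_path: str) -> bool:
    return file_digest(downloaded_path) == stored_digest
```

The obvious drawback is that every check costs a full download of the file, which is exactly the bandwidth and I/O burden that the PDP and PoR schemes surveyed next are designed to avoid.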
III. DATA INTEGRITY AUTHENTICATION TECHNIQUES AND THEIR CHALLENGES
The issue of data integrity in cloud computing is still being investigated, and much research is ongoing in this field to provide secure and efficient data integrity in the cloud. Researchers have proposed many solutions focused on resolving the issues of data integrity.

This paper provides a survey of the different techniques for data integrity. The basic existing schemes for data integrity in the cloud are Provable Data Possession (PDP) and Proof of Retrievability (PoR). The following subsections describe these integrity-proving techniques [5]; illustrative code sketches for them follow at the end of this section.

3.1 Provable Data Possession (PDP)
Provable Data Possession (PDP) is a technique for assuring data integrity over remote servers. In PDP, a client that has stored data at an untrusted server can verify that the server possesses the original data without retrieving it. Ateniese et al. [6] were the first to consider public auditability in their "provable data possession" model for ensuring possession of files on untrusted storage. The working principle of PDP, shown in Fig. 1, has two stages: a pre-process-and-store stage and a verify-file-possession stage.
Pre-process and store: the client generates a pair of matching public and secret keys using a probabilistic key-generation algorithm, then sends the public key along with the file to the server for storage.
Verify file possession: the client challenges the server for a proof of possession of a subset of the blocks in the file and checks the server's response.

Fig. 1: Protocol for provable data possession [8]

3.2 Basic PDP Scheme Based on MAC
In [9], the author proposed a Message Authentication Code (MAC) based PDP to ensure the integrity of a file F stored in the cloud in a very simple way. The data owner computes MACs of the whole file with a set of secret keys and stores them locally before outsourcing the file: the owner keeps only the computed MACs in local storage and sends the file itself to the Cloud Service Provider (CSP). Whenever a verifier needs to check the integrity of file F, he or she reveals one of the secret keys to the cloud server, asks it to recompute the MAC of the whole file, and compares the recomputed MAC with the previously stored value.

3.3 Scalable PDP
The author in [10] proposed Scalable PDP, an improved version of the original PDP. The main difference is that Scalable PDP uses symmetric-key cryptography, whereas the original PDP uses public-key cryptography, which reduces the computation overhead. Scalable PDP supports dynamic operations on remote data; however, all challenges and answers are pre-computed, and only a limited number of updates is possible. Scalable PDP does not require bulk encryption and relies on symmetric keys, which are more efficient than public-key encryption, but as a consequence it does not offer public verifiability.

3.4 Dynamic PDP
The author in [11] proposed Dynamic PDP (DPDP), which is a collection of seven polynomial-time algorithms (KeyGen, PrepareUpdate, PerformUpdate, VerifyUpdate, GenChallenge, Prove, Verify).
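The sketches below are illustrative only. First, the two-stage PDP flow of Section 3.1 and Fig. 1: the actual construction of Ateniese et al. [6] uses RSA-based homomorphic verifiable tags, which are beyond a short example, so this simplified stand-in uses per-block HMAC tags kept by the client. The block size, tag format and function names are assumptions.

```python
import hashlib
import hmac
import secrets

BLOCK = 4096  # illustrative block size

def split_blocks(data: bytes):
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

# Stage 1 -- pre-process and store: tag each block; tags stay with the
# client, while the raw blocks are handed to the server.
def preprocess(data: bytes, key: bytes):
    blks = split_blocks(data)
    tags = [hmac.new(key, b"%d|" % i + blk, hashlib.sha256).digest()
            for i, blk in enumerate(blks)]
    return blks, tags  # blks -> server, tags -> client

# Stage 2 -- verify possession: challenge a random subset of block indices;
# the server answers with those blocks and the client re-checks their tags.
def gen_challenge(n_blocks: int, spot: int = 3):
    return secrets.SystemRandom().sample(range(n_blocks), spot)

def verify(key: bytes, indices, answered_blocks, tags) -> bool:
    return all(
        hmac.compare_digest(
            hmac.new(key, b"%d|" % i + blk, hashlib.sha256).digest(), tags[i])
        for i, blk in zip(indices, answered_blocks))
```

Unlike the naive baseline at the end of Section II, a challenge touches only a few blocks, so the verification cost is independent of the total file size.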
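The MAC-based scheme of Section 3.2 can be sketched more directly, since it is simply a whole-file MAC under a set of one-time secret keys. The names, key-batch size and file name below are illustrative; csp_recompute_mac stands in for the round trip to the CSP.

```python
import hashlib
import hmac
import secrets

def whole_file_mac(key: bytes, path: str) -> bytes:
    """MAC over the entire file F, streamed in chunks."""
    m = hmac.new(key, digestmod=hashlib.sha256)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            m.update(chunk)
    return m.digest()

# Setup: the owner draws a batch of one-time keys and stores only the MACs
# locally before sending the file to the CSP.
keys = [secrets.token_bytes(32) for _ in range(16)]
local_macs = {k: whole_file_mac(k, "outsourced.dat") for k in keys}

# One audit round: reveal a fresh key and have the CSP recompute the MAC
# of the whole file with it.
def audit(csp_recompute_mac, key: bytes) -> bool:
    expected = local_macs.pop(key)  # each revealed key is burned
    return hmac.compare_digest(csp_recompute_mac(key), expected)
```

Each audit burns one key, because a revealed key can never be reused (a cheating server could simply cache the MAC), and every audit forces the server to read the entire file; these are the limitations that Scalable PDP targets.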
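For Scalable PDP (Section 3.3), the defining idea is that all challenges and their answers are pre-computed with symmetric primitives before upload, giving a fixed audit budget. A minimal sketch follows, with the block size, budget and token format as assumptions rather than the actual construction of [10].

```python
import hashlib
import secrets

BLOCK = 4096  # illustrative block size

# Owner: before upload, pre-compute a fixed budget of challenge tokens,
# each covering a few randomly chosen blocks mixed with a fresh nonce.
def precompute_tokens(data: bytes, budget: int = 8, spot: int = 3):
    n = (len(data) + BLOCK - 1) // BLOCK
    rng = secrets.SystemRandom()
    tokens = []
    for _ in range(budget):
        nonce = secrets.token_bytes(16)
        idx = sorted(rng.sample(range(n), spot))
        h = hashlib.sha256(nonce)
        for i in idx:
            h.update(data[i * BLOCK:(i + 1) * BLOCK])
        tokens.append({"nonce": nonce, "idx": idx, "answer": h.digest()})
    return tokens

# One audit: send (nonce, idx) to the server, which must return the same
# digest computed over the stored blocks. Once the budget of tokens is
# spent, no further audits are possible.
def audit(server_digest: bytes, token) -> bool:
    return server_digest == token["answer"]
```

Because only the holder of the pre-computed tokens can run an audit, the scheme offers no public verifiability, as noted above.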
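Finally, the seven DPDP algorithms of Section 3.4 split naturally between client and server. The interface skeleton below maps each algorithm to its role; the signatures are assumptions made for exposition and are not taken from [11].

```python
class DPDPClient:
    """Client side of DPDP: key setup, update encoding, and verification."""

    def key_gen(self, security_param):           # -> (public_key, secret_key)
        ...

    def prepare_update(self, sk, block, op):     # encode insert/modify/delete
        ...

    def verify_update(self, sk, update, proof):  # accept or reject the update
        ...

    def gen_challenge(self, sk, metadata):       # fresh random challenge
        ...

    def verify(self, sk, challenge, proof):      # check proof of possession
        ...


class DPDPServer:
    """Server side of DPDP: applying updates and answering challenges."""

    def perform_update(self, pk, store, update):  # apply update, return proof
        ...

    def prove(self, pk, store, challenge):        # answer possession challenge
        ...
```

The update triple (PrepareUpdate, PerformUpdate, VerifyUpdate) is what distinguishes DPDP from the static schemes above: it lets the client insert, modify or delete blocks of the outsourced file and still verify possession afterwards.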