Improving the security in cloud using CSA by varying the key length
S. Raja Shree1*, A. Chilambu Chelvan2, M. Rajesh3
1Research Scholar, Faculty of CSE, Department of Computer Science and Engineering,
Sathyabama Institute of Science and Technology, Chennai, India & Assistant Professor /CSE,
Jerusalem College of Engineering, Chennai
2Department of Electronics and Instrumentation Engineering,
R.M.D Engineering College, Chennai, India
3School of Maritime Studies, Vels University, Chennai, India
*Corresponding author. Tel.: +91-9486838080
E-mail address: [email protected]
Abstract:
Data outsourcing is a fundamental component of cloud computing that centralizes users’ data on the cloud server (CS). Even though the cloud service provider (CSP) offers a powerful and reliable infrastructure for users, the huge mass of data stored in the CS makes it more vulnerable to active attacks. The CSP manages the cloud servers, which provide the required storage space for user data, while the third-party auditor (TPA) is an entity with the capability to access the cloud storage servers and assess the risks they pose to users. Therefore, the proposed work presents an efficient technique to resolve the data integrity problem in the cloud by employing the Cuckoo Search Algorithm (CSA) and encrypting data through the third-party auditor.
1 Introduction
Cloud computing is emerging as the next evolution of computing, with a number of essential characteristics such as on-demand self-service, ubiquitous network access, location-independent resource pooling, rapid elasticity, and measured service. There is a high possibility of the data being deleted or updated by the service providers on the cloud server. Thus, it is necessary that cloud services develop appropriate mechanisms to ensure the privacy and security of the stored data and so earn the trust of users. It is important to offer an auditing service for cloud data storage. It is also necessary that the cloud permit an audit by a single party on behalf of the owner who outsources the data, both to ensure the security of the data and to save the owner’s computation and data storage. For this, the owner trusts an independent TPA (Bhagat and Sahu, 2013) [1].

(International Journal of Pure and Applied Mathematics, Volume 119, No. 18, 2018, 2133-2145. ISSN: 1314-3395 (on-line version). URL: http://www.acadpubl.eu/hub/, Special Issue)
The major goal of the cloud framework is to use both simple and sophisticated methods to attain a suitable, highly secure environment for data sharing in the cloud. The owner no longer retains control over the data, which makes it extremely challenging to provide secure data sharing and to protect the confidentiality of the data. The Java Archive (JAR) file is a compressed file format that may be utilized for data sharing in the cloud environment (Mani, Jose and Stephen, 2014) [2].
According to Rathod and Sapkal (2014) [3], the cloud offers service-oriented applications such as IaaS (Infrastructure as a Service), SaaS (Software as a Service) and PaaS (Platform as a Service) over the internet. Most owners store their information or data in the cloud database, which may be accessed remotely over the internet. Although the benefits of cloud computing are enormous, there are various security concerns; one of the most significant is the integrity of the data stored on the cloud.
Data integrity means the consistency and accuracy of the stored data, with no modification to the file/record between two updates. Cloud services must ensure data integrity and preserve user privacy. This is usually governed by a Service Level Agreement (SLA), which in common terms states the mutual obligations and expectations of the user and the provider. One method for validating the accuracy of the data is to retrieve the whole file from the server and then validate the data integrity by verifying the hash value of the whole file. The data user does not store or hold any data; the cloud admin sends a warning to the data user if the TPA detects any change or modification to the data. Without valid knowledge of the user’s secret key or metadata, any change to the data can thus be clearly detected.
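The whole-file hash check described above can be sketched as follows (SHA-256 is an assumption here for illustration; the paper does not fix a particular hash function):

```python
import hashlib

def file_digest(data: bytes) -> str:
    """Return the SHA-256 digest of the file contents as hex."""
    return hashlib.sha256(data).hexdigest()

def verify_integrity(data: bytes, stored_digest: str) -> bool:
    """Re-hash the retrieved file and compare against the stored digest."""
    return file_digest(data) == stored_digest

original = b"user record v1"
digest = file_digest(original)        # computed before upload
assert verify_integrity(original, digest)          # untouched file passes
assert not verify_integrity(b"user record v2", digest)  # modified file fails
```

Any modification to the file changes its digest, so a mismatch signals that the data was altered without the secret key or metadata.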
2 Related research
Patil and Chandel (2014) [4] state that a major dispute faced by cloud computing on its way to global recognition is how to protect and secure the data and practices that are the customers’ property. Data security in cloud computing is a great concern for several reasons. One is that, in the cloud environment, there is a monetary contract between the cloud supplier and the clients; the cloud suppliers must honour it and must recompense the clients for any failure that results from not satisfying the SLA.
Figure 1: Design of cloud data storage using TPA
Source: Ren et al. (2012)
Figure 1 above shows the architecture of cloud data storage with the help of a third-party auditor. The TPA is optional for auditing delegation and public auditing. Data auditing between the cloud server and the clients is implemented using a service level agreement (SLA).
According to Srilatha Ch et al. (2014) [5], cloud computing is altering how companies use information technology (IT). The overhead of utilizing cloud storage must be decreased as much as possible, so that the data user does not need to perform many operations beyond retrieving the data. In an enterprise setting, more than one data user may have access to the same cloud storage. For simplicity of management, it is required that the cloud entertains verification requests only from a particular designated party.
Cloud computing entrusts remote services with a user’s data, software and computation. It comprises hardware and software resources made accessible over the network as managed third-party services. An entity stores a large quantity of data files in the cloud database and relies on that database for the computation and maintenance of the data of individual customers or clients. The database is managed by the cloud service provider, which possesses large storage space and computational resources in order to preserve the clients’ data (Rao et al., 2016) [6].
Nupoor, Yawale and Gadicha (2014) [7] used third-party auditing to secure data storage in the cloud through a trusted third-party auditor using RC5. The TPA is automated and fully able to monitor the confidentiality and integrity of the data. Further, it uniquely integrates the data with a random-mask technique to achieve privacy-preserving public auditing, which is also suitable for securing cloud data storage. The TPA encrypts and decrypts all user data, and data integrity validation is done through challenge verification. Finally, the authors point out that the TPA can perform multiple auditing tasks simultaneously.
3 Proposed System:
The proposed design is implemented to remain robust with large-scale data and hence to motivate users to adopt cloud storage services more confidently. A fully developed model of the system on a commercial public cloud is considered a significant future extension. The future work is expected to strengthen robustness against intruders on large-scale data. The public auditing scheme offers solutions for complete data outsourcing and for validating data integrity. To maintain robustness on large datasets, the scheme uses a privacy-preserving public auditing system and also supports data dynamics.
4 Implementation plan:
The major problem that exists in the cloud concerns the integrity and privacy of user data. To overcome the data integrity problem, we propose an efficient technique that employs an optimization technique and artificial intelligence using the third-party auditor. The user provides the data, together with an encryption key, for storage in the cloud through the cloud server. In our proposed technique, encryption is carried out through identity-based encryption (IBE), and a hash function is used to generate the secret key, which is shared with the third-party auditor. The auditor can access the cloud server to audit the user data, as per the user’s request, through the shared secret key.

The entire data content cannot be viewed, as the user has encrypted the data with IBE. The auditor can audit the storage area along with the data file size details and share these details with the users. To provide a secure key to the auditor and the user, the secret key generation is done with the aid of a hybrid optimization technique: we use the Genetic Cuckoo Algorithm (GCA) for key generation. An artificial neural network is employed in our proposed system to classify the data integrity based on the data provided. The data integrity is compared on the basis of classification accuracy and error rate using a modified neural network.
Fig. 1 Encryption of data using the cloud
The process flow implemented is as follows:
Step 1: The data that has to be stored in the cloud is collected.
Step 2: The next stage is to encrypt the data for security purposes. In our proposed method we have utilized the identity-based encryption technique, and we incorporate a hash function along with the encryption technique.
[Fig. 1 depicts the flow for securing user data: the user’s input data for storage is encrypted using the identity-based technique, the secret key is selected in the hash function using GCA, and a modified neural network processes the data before the secured data is sent to the cloud server, which the third-party auditor can access.]
The identity-based encryption technique is formulated as below.
Identity-based encryption algorithms consist of three phases:
• Key Generation (Ex)
• Encryption (E)
• Decryption (D)

Key Generation:
The first stage of the IBE algorithm is key generation. It is carried out between the cloud service provider and the cloud consumer.
Here we use a hash function along with GCA for key management. GCA is an optimization technique we designed for the key selection process: it is a combination of the genetic and cuckoo search algorithms, which can provide improved optimized key pairs.
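One way to picture the key pool the optimizer searches over is to derive candidate keys by hashing a shared seed with an index; the indices are then what the GCA chromosomes encode. The hash-based construction below is an illustrative assumption, not a scheme specified in this paper:

```python
import hashlib

def candidate_keys(seed: bytes, n: int, key_len: int = 32) -> list:
    """Derive a pool of n candidate secret keys by hashing seed || index.

    The optimizer then searches over the *indices* of this pool to pick
    the key actually used.  (Illustrative assumption only.)
    """
    return [hashlib.sha256(seed + i.to_bytes(4, "big")).digest()[:key_len]
            for i in range(n)]

pool = candidate_keys(b"shared-seed", 8)
# same seed -> same pool, so auditor and user derive identical candidates
assert pool == candidate_keys(b"shared-seed", 8)
```

Because the derivation is deterministic, both parties only need to agree on the seed and the selected index, not on the key bytes themselves.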
The various steps involved in our proposed GA are given below.

Generation of Chromosomes
Initially, generate Ns random chromosomes; the number of genes in each chromosome depends on the number of different keys that are employed. As discussed earlier, the generated genes are the indices of the keys to be selected:

G(i) = {G0(i), G1(i), G2(i), …, Gn-1(i)},  0 ≤ j ≤ Ns − 1,  0 ≤ m ≤ n − 1

where n is the number of keys used.
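The chromosome generation step above can be sketched as follows (population size and gene counts are example values):

```python
import random

def generate_population(ns: int, n_genes: int, n_keys: int) -> list:
    """Create Ns random chromosomes; each gene G_m is an index into the key pool."""
    return [[random.randrange(n_keys) for _ in range(n_genes)]
            for _ in range(ns)]

# Ns = 10 chromosomes, 5 genes each, indices drawn from a pool of 16 keys
population = generate_population(ns=10, n_genes=5, n_keys=16)
```

Each chromosome is simply a list of key indices, so crossover and mutation later operate on plain integer lists.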
Fitness Function
Evaluate the fitness function (it indicates how well an input satisfies the desired solution); a higher fitness value means a better solution. The purposes of the fitness function are:
• The fitness function is used for parent selection
• The fitness function serves as a measure of convergence
• For steady state: selection of individuals to die
• It should reflect the value of the chromosome in some “real” way
Selection of Optimal Solution
Here, the best chromosomes are those with maximum fitness, and the best chromosome obtained is used to retrieve the keys. Various selection methods exist; in our proposed method we use Roulette-Wheel Selection.

Roulette-Wheel Selection
The i-th string in the population is chosen with a probability proportional to fi. The probability of selecting the i-th string is

pi = fi / Σ(i=1..n) fi

where n is the population size. The average fitness of the population is calculated as

f̄ = (1/n) Σ(i=1..n) fi
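A minimal sketch of roulette-wheel selection and the average-fitness computation, following the two formulas above:

```python
import random

def roulette_select(population, fitness):
    """Pick one chromosome with probability p_i = f_i / sum_j f_j."""
    total = sum(fitness)
    r = random.uniform(0.0, total)
    acc = 0.0
    for chrom, f in zip(population, fitness):
        acc += f          # walk the "wheel" until the spin lands in a slot
        if r <= acc:
            return chrom
    return population[-1]  # guard against floating-point rounding

def average_fitness(fitness):
    """f_bar = (1/n) * sum_i f_i."""
    return sum(fitness) / len(fitness)
```

Fitter chromosomes occupy wider slices of the wheel, so they are selected more often while low-fitness ones still retain a small chance.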
Here we check whether our constraints are satisfied. If they are, we select this output; if not, we apply the genetic algorithm operators described below to obtain the actual output.
Crossover and Mutation
Among the different types of crossover, two-point crossover is selected, with a crossover rate of RC. In two-point crossover, two points are selected on the parent chromosomes; the genes between the two points c1 and c2 are interchanged between the parent chromosomes, and so Ns/2 children chromosomes are obtained. The crossover points c1 and c2 are determined as follows:

c1 = Gm(i) / 3,  c2 = c1 + Gm(i) / 2
Then, the mutation is accomplished by replacing NM genes in every chromosome with new genes. The replaced genes are randomly generated without any repetition within the chromosome. The chromosomes selected for the crossover operation and the chromosomes obtained from the mutation are then combined, so the population pool is filled up with Ns chromosomes. The process is repeated iteratively until it reaches a maximum of Imax iterations.
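The two-point crossover and mutation operators described above can be sketched as follows (the crossover points are passed in explicitly here; how they are derived follows the formulas in the text):

```python
import random

def two_point_crossover(p1, p2, c1, c2):
    """Swap the gene segment between points c1 and c2 of the two parents."""
    child1 = p1[:c1] + p2[c1:c2] + p1[c2:]
    child2 = p2[:c1] + p1[c1:c2] + p2[c2:]
    return child1, child2

def mutate(chrom, n_mut, n_keys):
    """Replace n_mut randomly chosen genes with fresh key indices."""
    out = chrom[:]
    for pos in random.sample(range(len(out)), n_mut):
        out[pos] = random.randrange(n_keys)
    return out

a, b = [0, 0, 0, 0], [1, 1, 1, 1]
kids = two_point_crossover(a, b, 1, 3)
assert kids == ([0, 1, 1, 0], [1, 0, 0, 1])  # middle segment swapped
```

The slicing keeps genes outside [c1, c2) intact, so each child inherits the outer segments of one parent and the middle segment of the other.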
In our technique we have incorporated the cuckoo search algorithm with the GA for better optimization. The process is as follows: the optimized results of the GA are applied to the cuckoo search algorithm.

Step 1: Initialization Phase
The population mi (i = 1, 2, …, n) of host nests is initialized arbitrarily.
Step 2: Generating New Cuckoo Phase
With the help of Levy flights, a cuckoo is selected randomly to generate new solutions. Subsequently, the generated cuckoo is evaluated by employing the objective function to ascertain the quality of the solutions.
Step 3: Fitness Evaluation Phase
The fitness function is evaluated as shown below, followed by the selection of the best solution:

Pmax = PS / PT,  fitness = maximum popularity = Pmax

where PS signifies the selected population and PT represents the total population.
Step 4: Updation Phase
At the outset, the solution is optimized by the Levy flights, employing the cosine transform. The Levy flight employed in the general cuckoo search algorithm is expressed by the equation shown below:

mi(t+1) = mi(t) + α ⊕ Levy(n)
Step 5: Reject Worst Nest Phase
In this phase, the worst nests are discarded in accordance with their probability values, and new ones are constructed. The solutions are then ranked by their fitness, and the best solutions are identified and marked as optimal.

Step 6: Stopping Criterion Phase
The procedure continues until the maximum iteration count is reached.
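Steps 1-6 can be sketched as a generic cuckoo search minimizer. This is a standard textbook form, not the paper's exact variant: the Levy step uses the common Mantegna approximation, the objective and search bounds are placeholders, and the abandonment fraction pa follows the usual default of 0.25:

```python
import math
import random

def levy_step(beta=1.5):
    """Draw a Levy-distributed step via the Mantegna algorithm."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(objective, dim, n_nests=15, pa=0.25, iters=100, alpha=0.01):
    """Minimize `objective`; the worst pa fraction of nests is rebuilt each round."""
    nests = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_nests)]
    fit = [objective(x) for x in nests]
    for _ in range(iters):
        # Step 2: generate a new cuckoo by a Levy-flight move from a random nest
        i = random.randrange(n_nests)
        new = [x + alpha * levy_step() for x in nests[i]]
        j = random.randrange(n_nests)
        if objective(new) < fit[j]:          # Step 3: keep it if it is fitter
            nests[j], fit[j] = new, objective(new)
        # Step 5: abandon the worst pa fraction and rebuild those nests randomly
        n_bad = max(1, int(pa * n_nests))
        worst = sorted(range(n_nests), key=lambda k: fit[k])[-n_bad:]
        for k in worst:
            nests[k] = [random.uniform(-5, 5) for _ in range(dim)]
            fit[k] = objective(nests[k])
    best = min(range(n_nests), key=lambda k: fit[k])   # Step 6 reached: return best
    return nests[best], fit[best]
```

For example, minimizing the sphere function `sum(t*t for t in v)` in two dimensions drives the best fitness toward zero within a few hundred iterations.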
Step 7:
The next stage in our proposed method is to classify the data integrity based on the data provided. Here we employ a modified neural network, which provides better classification accuracy than ordinary techniques.

Modified neural network:
A multilayer feed-forward neural network is utilized in our methodology. The back-propagation algorithm, described below, is used to train the neural network.

Step 1: Generate arbitrary weights within the interval [0, 1] and assign them to the hidden layer neurons as well as the output layer neurons. Maintain a unity weight for all neurons of the input layer.
Step 2: Input the training dataset I to the classifier and determine the BP error.
Step 3: Adjust the weights of all neurons. Here, we incorporate an optimization algorithm for the weight update using the error values; for this optimization we use PSO, which can aid in improving the classification.

The secured data from the neural network is then stored in the cloud server, where it can be accessed by the third-party auditor for integrity verification as well.
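As an illustration of Steps 1-3, here is a minimal back-propagation sketch for a one-hidden-layer feed-forward network. The training task (a toy AND classification), the layer sizes, and the learning rate are assumptions for illustration; the PSO-based weight optimization and the actual integrity dataset are omitted:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyMLP:
    """One-hidden-layer feed-forward network trained by backpropagation."""
    def __init__(self, n_in, n_hid, seed=1):
        rng = random.Random(seed)
        # Step 1: weights drawn from [0, 1]; the last entry of each row is a bias
        self.w1 = [[rng.random() for _ in range(n_in + 1)] for _ in range(n_hid)]
        self.w2 = [rng.random() for _ in range(n_hid + 1)]

    def forward(self, x):
        self.h = [sigmoid(w[-1] + sum(wi * xi for wi, xi in zip(w, x)))
                  for w in self.w1]
        self.o = sigmoid(self.w2[-1] + sum(w * h for w, h in zip(self.w2, self.h)))
        return self.o

    def train_step(self, x, target, lr=0.5):
        o = self.forward(x)
        d_o = (o - target) * o * (1 - o)          # Step 2: BP error at the output
        for j, h in enumerate(self.h):
            d_h = d_o * self.w2[j] * h * (1 - h)  # error propagated to hidden unit j
            self.w2[j] -= lr * d_o * h            # Step 3: adjust all weights
            for k, xk in enumerate(x):
                self.w1[j][k] -= lr * d_h * xk
            self.w1[j][-1] -= lr * d_h
        self.w2[-1] -= lr * d_o

net = TinyMLP(2, 4)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # toy AND task
for _ in range(5000):
    for x, t in data:
        net.train_step(x, t)
```

After training, the network outputs above 0.5 only for the (1, 1) input, i.e. it has learned the separable toy task; the real classifier would be trained on integrity features instead.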
5 Results and Discussions
This section describes in detail the performance of the proposed algorithm. In the simulation environment, the data integrity in the cloud using a third-party auditor is evaluated with RSA combined with the Cuckoo Search Algorithm (CSA), varying the size of the private keys.

In communication systems, throughput is the rate of successful data delivery over a noisy communication channel. In general, throughput is estimated in bits per second; it can also be expressed in packets per second. In this work, the throughput is evaluated by dividing the total data in bits by the encryption time. Table 1 shows the throughput for various input file sizes and different private key lengths.
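The throughput computation described above (total bits divided by encryption time) can be sketched as follows; the repeated-hashing "cipher" is only a stand-in so the measurement runs, not the RSA/IBE scheme evaluated in the paper:

```python
import time
from hashlib import sha256

def throughput_bps(data: bytes, encrypt) -> float:
    """Throughput = total data in bits / encryption time, in bits per second."""
    start = time.perf_counter()
    encrypt(data)
    elapsed = time.perf_counter() - start
    return (len(data) * 8) / elapsed

def toy_encrypt(data: bytes) -> bytes:
    """Placeholder workload: repeated hashing stands in for real encryption."""
    out = data
    for _ in range(100):
        out = sha256(out).digest()
    return out

rate = throughput_bps(b"x" * 10 * 1024, toy_encrypt)  # a 10 KB input file
```

To reproduce a table like Table 1, one would repeat the measurement for each input file size and each private key length and record the resulting rates.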
Table 1 Throughput for various private key lengths

Input file size (KB)   128-bit key   256-bit key   512-bit key
10                     0.128         0.20          0.30
20                     0.136         0.23          0.33
30                     0.145         0.28          0.36
40                     0.154         0.32          0.38
50                     0.180         0.36          0.52
Fig. 2 Throughput for various private key lengths

Fig. 2 demonstrates the throughput for different sizes of input text files and different private key lengths: 128 bits, 256 bits and 512 bits. From the graph, we observe that there is a trade-off between throughput and private key length: if we need to increase the throughput, we have to decrease the private key length and thereby compromise on the security of the data.
References:
1. Bhagat A and Sahu R K (2013), “Cloud Data Security while using Third Party
Auditor”, International Journal of Computer Applications (IJCA), Vol.70, No.16.
2. Mani P, Jose T and Stephen J (2014), “Cloud Security and Data Integrity with Client
Accountability Framework”, ACEEE International Journal on Network Security, Vol.
5, No. 1.
3. Rathod P and Sapkal S (2014), “Audit Service for Data Integrity in Cloud”,
International Journal of Advanced Research in Computer Science and Software
Engineering (IJARCSSE), Vol.4.
4. Patil V T and Chandel G S (2014), “Implementation of TPA and Data Integrity in Cloud Computing using RSA Algorithm”, International Journal of Engineering Trends and Technology (IJETT), Vol.12, No. 2.
5. Srilatha Ch et al. (2014), “Confidentiality Protective Public Appraising of Safe Cloud
Storage”, International Journal of Engineering & Science Research (IJESR), Vol. 4.
6. Rao et al. (2016), “Secure User Data Using Encryption for Preserving Private Data in
Cloud”, International Journal of Innovative Science, Engineering & Technology
(IJISET), Vol.3.
7. Wu, L., Garg, S. K., & Buyya, R. (2015, December). Service level agreement (SLA)
based SaaS cloud management system. In Parallel and Distributed Systems
(ICPADS), 2015 IEEE 21st International Conference on (pp. 440-447). IEEE.
8. Puthal, D., Sahoo, B. P. S., Mishra, S., & Swain, S. (2015, January). Cloud computing
features, issues, and challenges: a big picture. In Computational Intelligence and
Networks (CINE), 2015 International Conference on (pp. 116-123). IEEE.
9. Di Spaltro, D., Polvi, A., & Welliver, L. (2016). U.S. Patent No. 9,501,329.
Washington, DC: U.S. Patent and Trademark Office.
10. Khan, K. M., & Malluhi, Q. (2013). Trust in cloud services: providing more controls
to clients. Computer, 46(7), 94-96.
11. Sunyaev, A., & Schneider, S. (2013). Cloud services certification. Communications of
the ACM, 56(2), 33-36.
12. Yu, Y., Au, M. H., Ateniese, G., Huang, X., Susilo, W., Dai, Y., & Min, G. (2017).
Identity-based remote data integrity checking with perfect data privacy preserving for
cloud storage. IEEE Transactions on Information Forensics and Security, 12(4), 767-
778.
13. Al-Saffar, A. M. H. (2015). Identity Based Approach for Cloud Data Integrity in
Multi-Cloud Environment. Identity, 4(8).
14. Bachhav, S., Chaudhari, C., Shinde, N., & Kaloge, P. Secure Multi-Cloud data
sharing using Key Aggregate Cryptosystem for scalable data sharing. International
Journal of Computer Science and Information Technologies (IJCSIT), ISSN, 0975-
9646.
15. Kumar, R. S., & Saxena, A. (2011, January). Data integrity proofs in cloud storage.
In Communication Systems and Networks (COMSNETS), 2011 Third International
Conference on (pp. 1-4). IEEE.
16. Kumar, P. S., & Subramanian, R. (2011). Homomorphic distributed verification protocol for ensuring data storage security in cloud computing. Information-An International Interdisciplinary Journal, 14(10), 3465-3476.
17. Hao, Z., Zhong, S., & Yu, N. (2011). A privacy-preserving remote data integrity
checking protocol with data dynamics and public verifiability. IEEE transactions on
Knowledge and Data Engineering, 23(9), 1432-1437.
18. Yu, Y., Xue, L., Au, M. H., Susilo, W., Ni, J., Zhang, Y., ... & Shen, J. (2016). Cloud
data integrity checking with an identity-based auditing mechanism from RSA. Future
Generation Computer Systems, 62, 85-91.
19. Imran, M., Hlavacs, H., Haq, I. U., Jan, B., Khan, F. A., & Ahmad, A. (2017).
Provenance based data integrity checking and verification in cloud
environments. PloS one, 12(5), e0177576.
20. Tan, S., Jia, Y., & Han, W. H. (2015). Research and development of provable data
integrity in cloud storage. Chinese Journal of Computers, 38(1), 164-177.