Hybrid Chaotic Magic Transformation with Advanced LZW based Encryption-then-Compression of Images

1N. Mahendiran, 2Dr. C. Deepa
1Research Scholar, Department of Computer Science, Sri Ramakrishna College of Arts & Science (Formerly SNR Sons College), Coimbatore, Tamilnadu, India
2Associate Professor, Department of Information Technology, Sri Ramakrishna College of Arts & Science (Formerly SNR Sons College), Coimbatore, Tamilnadu, India
[email protected], [email protected]
Abstract: Image encryption plays a vital role in the field of information security. Encrypting images has proved a successful method for communicating confidential information, and countless procedures have been devised for it. In a few practical cases, images have to be encrypted prior to compression. In this work, a hybrid Chaotic Magic Transformation (CMT) with Advanced LZW (ALZW) based encryption-then-compression scheme is introduced to hide and safeguard image content. With the help of a median filter, the input images are first preprocessed to eliminate unwanted noise and yield a quality image. The de-noised image is then divided into several segments so that it can be encrypted block by block, and these blocks are encrypted through hybrid CMT. To compress the encrypted image, we use an ALZW-based lossless compression scheme, which minimizes the space the image occupies. Experimental analysis confirms that the suggested hybrid CMT with ALZW obtains better results than the existing Improved LZW (ILZW) and LZW compression schemes in terms of Compression Ratio (CR) and Peak Signal-to-Noise Ratio (PSNR).
Keywords: encryption, compression, median filter, segmentation, chaotic magic transformation, Huffman coding.
I. INTRODUCTION
Interest in the fields of secure transmission and compression of images has grown greatly because of the rapidly increasing demand for image transmission through public networks such as social networking sites. Owing to inherent features of digital images, such as high correlation among neighboring pixels and bulk data capacity, the size of images must be minimized before they are broadcast. At the same time, the need for secure image transmission has made it necessary to combine image encryption and compression. Consider a practical scenario in which a content owner (say, Alice) needs to efficiently broadcast an
International Journal of Pure and Applied Mathematics, Volume 119, No. 18, 2018, 3133-3147, ISSN: 1314-3395 (on-line version), url: http://www.acadpubl.eu/hub/ (Special Issue)
image I to a recipient (say, Bob) via an insecure, bandwidth-constrained communication channel provider (say, Charlie). The conventional way to achieve data reduction and protection is to compress the data and then encrypt it with a secret key: Alice first compresses the original image I into Ic and then encrypts Ic with the secret key. The encrypted data is forwarded to Charlie, who simply passes it on to Bob. Being an authorized user, Bob performs the successive decryption and decompression to obtain the reconstructed image.
Various joint compression–encryption techniques based on Compressive Sensing (CS), which possesses the property of sampling and compression at the same time, have been reported in the literature. Attention later turned to whether CS can be incorporated into image encryption algorithms. Y. Zhang et al. [1] gave an assessment of CS-based techniques for information security, examining three main security factors: image encryption based on chaos and CS, encryption based on CS and optics, and encryption based on CS, optics, and chaos. Huang et al. [2] described a parallel image encryption technique based on CS, in which a cipher structure comprising scrambling, mixing, an S-box, and chaotic lattice XOR is established to further encrypt the quantized data. A hybrid compression and encryption technique with a key-controlled measurement matrix in CS was established by Zhou et al. [3]. Additionally, a new hybrid compression–encryption algorithm based on CS is described in [4], in which the measurement matrix is built as a partial Hadamard matrix and managed by a chaotic index sequence. An image compression and encryption technique based on 2D CS and the fractional Mellin transform (FrMT) was suggested by Zhou et al. [5]; it exploits the nonlinearity of the FrMT to resist common attacks and to control the compression capability of CS.
An image compression–encryption scheme based on a hyper-chaotic system and 2D CS was proposed by Zhou et al. [6], which achieves compression and encryption at the same time: CS minimizes the data amount, and the compressed encrypted image is then re-encrypted through a cyclic shift operation controlled by the hyper-chaotic system. A new technique was suggested by Alfalou et al. [7], where compression and encryption are executed simultaneously and in a dependent way, namely through a spectral fusion based on the properties of the Discrete Cosine Transform (DCT). Zhu et al. [8] provided an effective technique that utilizes hyper-chaos and the Chinese remainder theorem: the 2D hyper-chaos shuffles the original image, and the Chinese remainder theorem is then enforced to diffuse and compress the shuffled image at the same time. In [9], an optical algorithm for simultaneous compression and encryption of phase-shifting digital holograms for 3D object reconstruction is described, which also works according to spectral fusion.
The compression–encryption schemes addressed above satisfy various secure-transmission cases, but in new applications the conventional order has to be revisited, as the following example shows. Assume that Alice needs to transmit information to Bob, and Charlie is the network provider. Alice wants to safeguard the privacy of her data from everyone, including Charlie, through encryption. Because she has restricted computational resources, she will not spend them on minimizing the size of the image before encryption; this is especially true when Alice uses a resource-deprived mobile device. Charlie, on the other hand, has an overriding interest in maximizing network utilization by compressing all network traffic. Since she does not trust Charlie, Alice hides from him the secret key used to encrypt the image data; Charlie performs compression on the fully encrypted data, which is finally passed to Bob. Bob carries out joint decompression and decryption to rebuild the image from the encrypted, compressed data. Compressing images in the fully encrypted domain is the great challenge within such an Encryption-then-Compression (ETC) system.
Lossy compression methods are used for general-purpose digital images, where a minor loss of data is not an obstacle, whereas lossless compression represents an image in the smallest number of bits without losing any information. For image compression, we concentrate on a lossless compression scheme. Zhang designed an image encryption scheme through pixel-domain permutation and showed that the encrypted file can be effectively compressed by eliminating the excessively rough and fine information of coefficients in the transform domain [10]. A new compression approach for encrypted images, based on multi-layer decomposition, was proposed by Zhang et al. [11]. Different techniques have been developed for effective ETC methods [12-18].
In spite of extensive efforts in recent years, current ETC systems still fall considerably short in compression performance compared with state-of-the-art lossless image coders that demand unencrypted inputs. Designing a pair of image encryption and compression schemes is the main focus of our work. The scheme concentrates on hybrid CMT encryption with Advanced LZW compression for images. Experimental analysis confirms that the proposed system achieves good results compared with the existing LZW and Improved LZW (ILZW) compression schemes. The rest of this paper is organized as follows: Section 2 reviews existing ETC schemes for images. Section 3 provides the details of our proposed ETC system, where lossless compression is considered. Section 4 presents the experimental analysis that validates our findings. Section 5 presents the conclusion.
II. RELATED WORK
Different existing ETC schemes and their limitations are explained here. A scalable coding technique for compression of encrypted images via a multi-resolution construction was suggested by Zhang et al. [12]; the original pixel values are masked by addition modulo 256 with pseudo-random numbers. This work is scalable through the number of iterations and performs well. Apart from the CS- and quantization-based techniques, down-sampling based techniques have been reported in the literature [13-15] for minimizing the size of encrypted images to a preferred level; for example, a compression technique for encrypted images via multilayer decomposition was again suggested by Zhang et al. [13].
A novel compression technique for encrypted images was suggested by Zhang et al. [14], which generates a little auxiliary information with optimized quantization parameters to enhance the compression performance. Zhou et al. [15] proposed an efficient ETC technique in which encryption is achieved through prediction-error clustering and random permutation, and the compression performance is examined for both lossless and lossy compression. A new encryption-then-compression technique using rate-distortion optimization, which distinguishes it from the others, was given by Wang et al. [16].
Vaish et al. [17] suggested a prediction-error based ETC technique, where the prediction errors are computed with the help of a sub-image and efficient compression is accomplished through quantization and Huffman coding. Kumar and Vaish [18] explained that efficient compression of an encrypted image can be acquired with the help of wavelet difference coding.
Zhang and Fang [19] suggested a cipher-image scheme based on a Tangent-Delay Ellipse Reflecting Cavity-Map System (TD-ERCS), wavelet neural networks (WNN), and an XOR operation on binary data. The aspects addressed there are as follows: a key size of 10^195, histogram analysis, correlation analysis, and differential analysis. The proposed system can be enforced to secure the information.
A symmetric chaotic economic map (CEM) was introduced by Askar et al. [20], with a key space of 10^84, entropy close to the ideal value of 8, and a low correlation coefficient close to 0. A chaotic sequence is created by the CEM, with its fractional decimal values mapped to integers. The attacks examined are among those mentioned earlier: key sensitivity analysis, correlation analysis, and information-entropy analysis.
The chaotic cat-map algorithm was chosen by Kanso and Ghebleh [21] for medical-image security applications; it runs in rounds, and every round has two phases, shuffling and masking, enforced at the block level and also on the full image. In the masking phase of every round, a pseudo-random matrix of the same size as the input image is utilized to maximize the processing speed. For medical-image robustness, statistical cryptanalytic attacks such as key search and differential attacks were examined; the same encryption and decryption technique is enforced for the ROI and for the full image, and it accomplishes the same level of security in both. The brute-force attack is examined under the assumption that the key space is huge, but the authors did not report the decrypted image quality or the information entropy.
The new 2D Sine Logistic Modulation Map (2D-SLMM), based on the logistic and sine maps, together with an effective image pixel shuffling algorithm called the Chaotic Magic Transform (CMT), was suggested by Hua et al. [22] to derive an encrypted image with random pixel properties. Digital images generally contain highly redundant data because of the high correlation of their pixels; CMT helps to break these correlations by moving pixel values to random positions. 2D chaotic maps perform better than 1D chaotic maps with respect to generating chaotic sequences, but they require comparatively complex and costly hardware structures. Compared with earlier chaotic maps, CMT's shuffling performance is good. Chaotic performance is examined with the help of the following parameters: trajectory, Lyapunov exponent, Lyapunov dimension, and Kolmogorov entropy. Existing chaotic maps are generally classified into 1D chaotic maps and high-dimensional maps.
Chaotic Map Lattices (CML) have a weakness in the conversion of floating-point values into pixel values, which leads to data loss in the image, as explained by Jastrzebski and Kotulski [23]. They suggested an improved CML, which works according to the CBC method but lacks protection against different security threats such as noise attacks, differential attacks, and statistical attacks. Image encryption raises a few particular issues, for instance the huge number of image pixels and their redundancy. In a few scenarios, the value of a pixel in the encryption process depends on the neighboring pixel values, that is, on pixel blocks. Brute-force attacks succeed because of small key sizes. Time complexity, space complexity, noise attacks, differential attacks, statistical attacks, and so forth are considered here. An encryption scheme for medical images based on a modular arithmetic operator is explained by J. B. Lima [24]. Despite many attempts in the past few years, current ETC techniques still fall short in compression performance when compared with state-of-the-art lossless or lossy image compression techniques [32].
III. PROPOSED METHODOLOGY
The proposed hybrid CMT encryption with Advanced LZW compression scheme is explained in this section.
A. System Overview
Figure 1 shows the workflow of the proposed ETC scheme. Initially, the Digital Imaging and Communications in Medicine (DICOM) images are preprocessed to discard noise. Then, with the help of vertical and horizontal segmentation, the de-noised image is segmented, or decomposed, into 4 blocks. Encryption then takes place through a new block-image encryption scheme based on the hybrid chaotic magic transformation approach. Lossless compression is performed using the ALZW method with Improved Huffman Coding (IHC). At last, the recovered image is displayed.
Fig.1 Work flow of Proposed ETC Scheme
B. Preprocessing
Here, DICOM brain images are considered as input. A preprocessing technique is utilized to remove unwanted distortion in the image, and a noise-filtering method is used to enhance the input image quality; it can remove the noise without reducing the sharpness of the image. In image compression and decompression, the median filter's edge-preserving property is considered one of its best features. The median filter examines every pixel in the image and looks at its nearby neighbors to decide whether or not it is representative of its surroundings. Rather than replacing the pixel value with the mean of the neighboring pixel values, this filter replaces it with their median. The median value is computed by sorting all pixel values from the surrounding neighborhood into numerical order and then replacing the pixel under consideration with the middle pixel value. (If the neighborhood under consideration has an even number of pixels, the average of the two middle pixel values is used.) A 3×3 square neighborhood is utilized here; larger neighborhoods would produce more severe smoothing.
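The median filtering described above can be sketched as follows; this is a minimal NumPy illustration, not the paper's MATLAB implementation.

```python
import numpy as np

def median_filter(img, k=3):
    """Replace each pixel with the median of its k x k neighborhood
    (k assumed odd). Borders are handled by reflecting the image edge;
    np.median already averages the two middle values if a window ever
    contains an even number of pixels."""
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out
```

Because the output is a neighborhood median rather than a mean, an isolated impulse-noise pixel is removed without blurring nearby edges.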
C. Segmentation
With the help of vertical and horizontal segmentation, the de-noised image is segmented, or decomposed, into 4 blocks. Consider an input image of size N × N. Classify the input image into 4 blocks, block 1 through block 4, where each block covers one quarter of the image; since the image size is 256 × 256, each block is a 128 × 128 quadrant produced by one vertical and one horizontal cut.
D. Hybrid CMT Based Encryption
Encryption is done through a new block-image encryption scheme based on the hybrid chaotic magic transformation approach. Chaotic research is important for image encryption because of its sensitive dependence on initial conditions and system parameters, its random behavior, its non-periodicity and topological transitivity, and so forth; chaotic systems help encrypt the image so that it cannot be identified by malicious users. Even if an attacker intercepts it, the image will not be recognized, so it can be transferred successfully over the Internet, which assures the security of image communication. Various encryption methods have not addressed security services such as pixel correlation, chosen-plaintext attack, cipher attack, histogram analysis, and entropy [25-27]. So a hybrid Chaotic Magic Transformation (CMT) approach is suggested to give more robustness in safeguarding images from different attacks, covering key-space analysis, key sensitivity, pixel correlation, histogram analysis, chosen-plaintext attack, cipher attack, entropy, and noise analysis. The Lanczos algorithm is utilized in CMT to create the characteristic roots and eigenvectors, so we name it hybrid CMT.
Consider a plain image P, which is provided as input to the hybrid chaotic magic transformation encryption process. It has four steps: the image's column pixel values are sorted in ascending order, followed by row sorting. The pixel confusion phase then achieves the
confusion property by randomly shuffling all the pixel positions, obtaining the confused image matrix M. The overall process of hybrid CMT is illustrated in figure 2.
A chaotic sequence generator is utilized to generate a key (K) with the size of the host image. This key K is provided to the Lanczos algorithm to obtain the characteristic vectors, which enlarges the key space and strengthens the security against potential attacks. The cipher-image matrix (Z) is acquired by multiplying the key vectors (K) with the confusion matrix (M).
Fig.2 Workflow of Hybrid CMT Approach for Encryption
1) Hybrid Chaotic Magic Transform for Encryption
The target of the encryption algorithm is to confound the positions of the pixels in every block of the image according to the following steps. The hybrid CMT algorithm shuffles a matrix as follows [28]:
Algorithm 1: Hybrid CMT Algorithm
Step 1: Sort each column of the chaotic matrix C in ascending order to obtain the sorted matrix S.
Step 2: Generate the shuffled index matrix I by connecting the pixels in S with their locations in C.
Step 3: The pixel shuffling process is done by shuffling the pixel positions to the right in the clockwise direction. The computation speed of the encryption process is increased by shuffling row by row and column by column directly, instead of pixel by pixel.
Step 4: The resultant shuffled matrix is T. The shuffling process is done using the hybrid CMT algorithm: a random chaotic matrix C of size M × N is used to produce the shuffled index matrix I of size M × N. Let P be the original image of size M × N and T the resultant shuffled image; the pixel shuffling of the original image is defined by T = HCMT(P, I), where I is the shuffled index matrix generated from the chaotic matrix C, and the sorted matrix S is generated by sorting each column of C in ascending order. The index matrix I records the positions to which the data of C are permuted; P is the original image matrix and T is the resultant shuffled matrix obtained from HCMT.
Step 5: A linear congruential generator (LCG) is used to generate M × N pseudorandom numbers by X(n+1) = (a·X(n) + c) mod m, where a, c, and m are integers and X(0) is the start value.
Step 6: Lanczos algorithm [29]. The Lanczos algorithm, invented by Cornelius Lanczos, is applied here to normalize the large eigenvalues and eigenvectors. We use 1 as the random vector; the algorithm yields the characteristic roots (eigenvalues) and characteristic vectors (eigenvectors), with loops used to calculate them.
Step 7: Finally, the cipher image is computed and handed on to image compression. Lossless compression is the focus here, and it is explained in the next segment.
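The column-sort shuffling at the heart of Algorithm 1 can be sketched as below. As assumptions: a plain 1D logistic map stands in for the 2D-SLMM chaotic source of [22], and the Lanczos key-mixing stage is omitted, so this illustrates only the permutation step.

```python
import numpy as np

def logistic_sequence(x0, r, n):
    """1D logistic map used here as a stand-in chaotic source; the paper
    pairs CMT with the 2D-SLMM, so the exact map is an assumption."""
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq

def cmt_shuffle(img, key=(0.37, 3.99)):
    """Column-sort pixel shuffling: a chaotic matrix C the size of the
    image is argsorted column-wise, and the resulting index matrix
    permutes the pixels of each column."""
    h, w = img.shape
    chaotic = logistic_sequence(*key, h * w).reshape(h, w)
    idx = np.argsort(chaotic, axis=0)              # shuffled index matrix
    shuffled = np.take_along_axis(img, idx, axis=0)
    return shuffled, idx

def cmt_unshuffle(shuffled, idx):
    """Invert the permutation using the same index matrix (i.e. the key)."""
    out = np.empty_like(shuffled)
    np.put_along_axis(out, idx, shuffled, axis=0)
    return out
```

Because each column of the index matrix is a permutation, decryption with the same key recovers the image exactly, as required for a lossless ETC pipeline.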
E. Image Compression
Lossless image compression is executed on the encrypted image. With the help of the Advanced LZW method, this
compression process is executed, together with Improved Huffman Coding (IHC). Hybrid compression provides a better compression ratio than single compression: we get a better compression ratio if the image data is compressed by Huffman coding and then by LZW, so we call it "data compression using Huffman-based LZW encoding."
A compression technique using two lossless methodologies, IHC and LZW coding, is suggested here to compress the image. In the initial stage, the image is compressed with Huffman coding, producing the Huffman tree and the Huffman code words. In the next stage, all the Huffman code words are concatenated together and then compressed with Lempel-Ziv-Welch coding. In the final stage, the Retinex algorithm is applied to the compressed image to improve its contrast and enhance its quality.
Image compression coding is used to keep the image bit stream as compact as possible and to display the decoded image on the monitor as exactly as possible. Consider an encoder and a decoder: when the encoder receives the original image file, it converts the file into a series of binary data called the bit stream; when the decoder receives the encoded file, it decodes the bit stream to create the decoded image. Image compression takes place when the total data quantity of the bit stream is less than the total data quantity of the original image. Image compression algorithms are utilized in the encoding process, and the reverse process takes place for decompression.
1) Huffman Coding and Decoding Process
The Huffman encoding algorithm uses the probability distribution of the source alphabet to establish the code words for the symbols. To compute the probability distribution, the frequency distribution of all characters of the source is computed. Code words are assigned based on these probabilities: shorter code words for higher probabilities and longer code words for smaller probabilities. A binary tree is built based on the probabilities, in which the leaves are the symbols and the paths are the code words. Static Huffman algorithms and adaptive Huffman algorithms are the two families of Huffman encoding considered here. A static Huffman algorithm first computes the frequencies and then creates a common tree for both the compression and decompression processes. The compression program stores the count of every symbol that has appeared so far in the source text. The symbol counts are utilized as estimates of the relative probabilities of the symbols, and a table of Huffman codes based on these frequencies is built in order to encode the next symbol; for encoding, the Huffman codes in the table are used. From the decompressed image pixels, the decoding algorithm can re-create the same set of symbol frequencies and use them to rebuild the same table of Huffman codes. Hence, it can uniquely decode one symbol, update the frequency count of that symbol, update its table of Huffman codes, and then decode the next symbol, and so on.
Here, Avg denotes the average of the probabilities of all symbols, and P denotes the probability of the input symbols. L1 is the leaf with the lowest symbol probability and L2 is the leaf with the second-lowest; L1 + L2 generates their parent node. This process is iterated until every symbol has been processed, after which the tree is complete. In a Huffman tree, the two leaf nodes with the lowest probabilities are added to produce their parent, and this process is repeated until the entire tree is built. The Huffman algorithm is utilized for encoding, and the reverse process serves for decoding.
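The merge-the-two-smallest construction just described can be sketched as follows; this is a generic binary Huffman coder, not the paper's exact implementation.

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a prefix-free binary code: repeatedly merge the two
    lowest-weight nodes (L1 and L2 above) until one tree remains."""
    freq = Counter(data)
    if len(freq) == 1:                        # degenerate one-symbol source
        return {next(iter(freq)): "0"}
    # heap entries: [weight, tiebreaker, {symbol: code-so-far}]
    heap = [[f, i, {s: ""}] for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    uid = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)              # lowest weight -> bit "0"
        hi = heapq.heappop(heap)              # second lowest -> bit "1"
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], uid, merged])
        uid += 1
    return heap[0][2]

def huffman_encode(data, codes):
    return "".join(codes[s] for s in data)

def huffman_decode(bits, codes):
    """Walk the bit stream, emitting a symbol whenever a code matches;
    prefix-freeness makes the decoding unambiguous."""
    inverse = {c: s for s, c in codes.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in inverse:
            out.append(inverse[cur])
            cur = ""
    return out
```

For pixel data, `data` would be the sequence of pixel values of the encrypted image rather than a string.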
The conventional HC cannot guarantee that all nodes with greater weight are placed at the higher levels of the tree. If fewer than four uncoded nodes remain at the end, the number of child nodes at the root will be deficient, which means that the nodes with minimum weight go unused while the nodes with maximum weight are used. The average code length is then raised and the coding efficiency is reduced, resulting in waste.
The Improved Huffman Coding (IHC) algorithm is introduced to minimize the issue mentioned above; it gives priority to the nodes with greater weight. These nodes are shifted up, closest to the root, so as to make sure that both the tree root and every high-level node possess 4 child nodes, with any vacancy only at the lowest level. The algorithm proceeds as follows:
Algorithm 2: Improved Huffman Coding (IHC) Algorithm
Step 1: Compute the total number of nodes based on the J characters.
Step 2: Compute the number of child nodes which cannot form a collection of four; set it as k.
Step 3: If k = 0, execute Step 5; otherwise execute Step 4.
Step 4: First, the k child nodes with the minimum weight are used to generate their father node K. Then the k nodes are deleted from the node collection, and the father node K is added to constitute the node collection J*.
Step 5: Construct the rest of the Huffman tree from J* with the traditional quaternary algorithm.
Compared with the traditional algorithm, the improved Huffman algorithm shows a significant improvement in compression ratio. To reduce the execution time of IHC, the LZW scheme is introduced: it reduces the code words of the IHC algorithm to increase the processing speed, so this LZW with the help of IHC is named ALZW.
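One way to realize Steps 2-4 is to pre-merge the short group of lowest-weight leaves before running the usual 4-ary merges. The sketch below computes code lengths only, and the group-size rule (a first group of 3 when J mod 3 = 0, of 2 when J mod 3 = 2, no short group otherwise) is our reading of Step 2; it is what makes the merge count work out so that any vacancy sits only at the lowest level.

```python
import heapq

def ihc_code_lengths(freqs):
    """4-ary Huffman code lengths with the IHC-style padding fix: the
    few lowest-weight leaves that cannot complete a group of four are
    merged first, so the deficiency sits only at the lowest tree level."""
    j = len(freqs)
    if j == 1:
        return [1]
    depths = [0] * j
    heap = [[f, i, [i]] for i, f in enumerate(freqs)]  # [weight, uid, leaf ids]
    heapq.heapify(heap)
    uid = j
    r = j % 3
    take = 3 if r == 0 else (2 if r == 2 else 4)  # first (possibly short) group
    while len(heap) > 1:
        group = [heapq.heappop(heap) for _ in range(take)]
        take = 4                                  # all later groups are full
        weight, leaves = 0, []
        for w, _, ids in group:
            weight += w
            leaves.extend(ids)
        for s in leaves:
            depths[s] += 1                        # every leaf below moves down a level
        heapq.heappush(heap, [weight, uid, leaves])
        uid += 1
    return depths
```

Each merge reduces the node count by take - 1, so choosing the first group size this way guarantees the process ends with exactly one root whose every internal node has four children.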
2) LZW Coding and Decoding
The Lempel-Ziv-Welch (LZW) compression algorithm is a simple, lossless, dictionary-based compression algorithm [30]. The compression and extraction dictionaries are both fixed-length, with an initial length of 256. Looking up codes in the dictionary uses sequential traversal; however, the time spent on each sequential search is long, which increases the compression time. A forward-moving concept based on frequently utilized entries is suggested to rectify this issue. In our work, the Huffman code words are compressed again with the help of ALZW. A practical realization adds a new counter variable to every node to maintain a count of its usage. Furthermore, the list is made bidirectional so that nodes can be navigated easily. When a code's usage count reaches the specified amount, which marks it for moving, the node is moved behind the head node. Hence, frequently used codes end up at the front of the chained lists, which speeds up the code lookup and thus saves compression time.
Algorithm 3: Advanced LZW (ALZW)
Step 1: Initialize the dictionary. The dictionary holds every single character in the data stream.
Step 2: Set the particular usage amount that marks a node for moving.
Step 3: Set the prefix P = Null.
Step 4: Read the subsequent character in the data stream as the current character C.
Step 5: Evaluate whether the string P + C is in the current dictionary.
1) Yes: set P = P + C, that is, extend P with C.
2) No:
Output P's corresponding code to the encoded data stream, and add 1 to the counter of P's corresponding code. Judge whether the counter now meets the mark:
a. Yes: move this node behind the last non-expandable code.
b. No: do nothing.
Judge whether the dictionary has reached its maximum capacity: if it has not, add the string P + C to the dictionary; otherwise do not.
Set P = C. (P only contains C right now.)
Step 6: Judge whether there are characters left in the data stream:
1) Yes: return to Step 4 to continue the encoding process.
2) No: output P's corresponding code to the encoded data stream.
End.
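Stripped of the usage counters (which only accelerate the dictionary lookup), the core encode/decode loop of Steps 3-6 looks like this:

```python
def lzw_encode(data):
    """Plain LZW encoding over a character stream. The dictionary starts
    with all 256 single characters and grows by one entry per miss."""
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    p, out = "", []
    for c in data:
        if p + c in dictionary:
            p = p + c                      # extend the prefix (Step 5, case 1)
        else:
            out.append(dictionary[p])      # emit P's code (Step 5, case 2)
            dictionary[p + c] = next_code
            next_code += 1
            p = c
    if p:
        out.append(dictionary[p])          # flush the final prefix (Step 6)
    return out

def lzw_decode(codes):
    """Rebuild the same dictionary on the fly; no side information needed."""
    dictionary = {i: chr(i) for i in range(256)}
    next_code = 256
    prev = dictionary[codes[0]]
    out = [prev]
    for code in codes[1:]:
        entry = dictionary.get(code, prev + prev[0])  # KwKwK special case
        out.append(entry)
        dictionary[next_code] = prev + entry[0]
        next_code += 1
        prev = entry
    return "".join(out)
```

In the proposed pipeline the input `data` would be the concatenated Huffman code words rather than raw text.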
ALZW is utilized to minimize the file size by removing repeated similarities, and this optimized ALZW accomplishes superior compression performance on the provided input. We thereby retain the advantage of LZW compression, which is typically greatest when the file includes lots of repetitive data, as in monochrome images [31]. The entire workflow of the proposed ALZW algorithm is shown in figure 3. The step-by-step process of the proposed scheme is given below.
Algorithm 4: Overall Compression Process
Step 1: Read the image into the MATLAB workspace.
Step 2: Call a function which finds the symbols (i.e., the non-repeated pixel values).
Step 3: Call a function which calculates the probability of each symbol.
Step 4: Arrange the symbol probabilities in decreasing order and merge the lowest probabilities; continue this step until only two probabilities are left, and assign codes according to the rule that the most probable symbol gets the shortest code.
Step 5: Perform Huffman encoding, i.e., mapping the code words to the corresponding symbols, which results in the Huffman code words.
Step 6: Concatenate all the Huffman code words and apply LZW encoding, which results in the LZW dictionary and the final encoded values (compressed data).
Step 7: Apply the LZW decoding process to the final encoded values and output the Huffman code words.
Step 8: Apply Huffman decoding to the output of the LZW decoding process.
Step 9: Finally, apply the multiscale Retinex algorithm to the decompressed image to enhance its quality and color.
Step 10: At last, the recovered image is generated.
Fig.3 Workflow of Proposed LZW with IHC
F. Image Enhancement Using Retinex Algorithm
The Retinex algorithm is utilized to improve the image contrast after dynamic compression. Retinex theory was introduced by Land to explain the human visual model and to develop an illumination-invariance model, with which perceived color has nothing to do. The target of the Retinex model is image reconstruction: the reconstructed image should look the same as the observer saw the scene. The Retinex model works according to the reflection-illumination imaging model and is similar to homomorphic filtering: since the irradiating light varies more smoothly than the reflected light, a low-pass filter applied to the input image can estimate it through fuzzy computing, and the reflected light is then separated using the input image and the smoothed image. The Retinex algorithm steps are listed below.
Algorithm 5: Retinex Algorithm
Step 1: Perform smoothing.
Step 2: Increase the brightness.
Step 3: The incidence component L should be as close as possible to the output brightness of the image.
(Fig.3 shows the workflow: cipher image Z -> compression using Improved Huffman coding -> reduce Huffman codewords using ALZW -> compressed image; decoder: decode using ALZW coding -> decode using Improved Huffman coding -> Retinex algorithm -> recovered image.)
International Journal of Pure and Applied Mathematics Special Issue
3141
Step 4: The incident light should vary smoothly, remaining approximately constant at the image borders.
Step 5: Filter the similar contents.
Step 6: Enhance the image quality.
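The steps above amount to estimating the smooth illumination with a low-pass filter and removing it in the log domain. Below is a minimal NumPy sketch of multiscale Retinex; the function names and sigma values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian low-pass filter: estimates the smooth
    incident-illumination component (Steps 3-4)."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2.0 * sigma**2))
    kernel /= kernel.sum()

    def blur_1d(row):
        # edge-pad so the valid convolution keeps the original length
        return np.convolve(np.pad(row, radius, mode="edge"),
                           kernel, mode="valid")

    out = np.apply_along_axis(blur_1d, 0, img)   # filter columns
    return np.apply_along_axis(blur_1d, 1, out)  # then rows

def multiscale_retinex(image, sigmas=(15, 80, 250)):
    """R = mean over scales of [log I - log(G_sigma * I)]: subtracting
    the blurred (illumination) image in the log domain keeps the
    reflectance component (Steps 1-3)."""
    img = image.astype(np.float64) + 1.0  # avoid log(0)
    acc = np.zeros_like(img)
    for s in sigmas:
        acc += np.log(img) - np.log(gaussian_blur(img, s))
    acc /= len(sigmas)
    # stretch the result back to the displayable 0-255 range (Step 6)
    acc = (acc - acc.min()) / (np.ptp(acc) + 1e-12)
    return np.rint(acc * 255).astype(np.uint8)
```

The three sigma values trade local contrast (small sigma) against global tonal rendition (large sigma); averaging them is what makes the method "multiscale".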
IV. RESULTS AND DISCUSSION
The suggested image compression was evaluated on DICOM brain images; 760 images were used for the computation. Digital Imaging and Communications in Medicine (DICOM) is a standard for handling, storing, printing, and transmitting information in medical imaging. Every image has a size of 256 x 256, i.e., 65,536 pixels in total, with a resolution of 96 dpi. Some of the samples are shown in figure 4.
The performance of the suggested hybrid CMT with ALZW system was evaluated on different DICOM images and compared with the current ILZW [32] and LZW based lossless image compression algorithms. The proposed image compression and encryption technique has been implemented in MATLAB.
Fig.4 Input DICOM image
Fig.5 Preprocessed image
The DICOM image considered as input is shown in figure 4. Median filtering is utilized to eliminate the noise from the input image. The preprocessed image is shown in figure 5.
Fig.6 Segmented image
In figure 6, the preprocessed image is divided into 4 blocks with the help of vertical and horizontal segmentation.
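The preprocessing and segmentation just described can be sketched with NumPy; `median_filter3` and `split_into_blocks` are illustrative names, not the paper's MATLAB routines.

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter for impulse-noise removal (preprocessing step)."""
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    # stack the nine 3x3 neighbours of every pixel, then take the median
    stack = np.stack([padded[r:r + h, c:c + w]
                      for r in range(3) for c in range(3)])
    return np.median(stack, axis=0).astype(img.dtype)

def split_into_blocks(img):
    """One vertical + one horizontal cut -> four equal blocks."""
    h, w = img.shape[0] // 2, img.shape[1] // 2
    return [img[:h, :w], img[:h, w:], img[h:, :w], img[h:, w:]]
```

The median filter replaces isolated noise spikes with the local median, which preserves edges better than linear smoothing; the four blocks are then encrypted independently.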
Fig.7 Encryption process
The hybrid chaotic magic transformation based image encryption is illustrated in figure 7. In the encryption process, the image column pixel values are sorted in ascending order, followed by row sorting. The pixel-confusion phase accomplishes the confusion property by randomly shuffling all pixel positions to obtain a confused image matrix.
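The confusion phase can be illustrated with a sorting-based permutation driven by a logistic map. This is a hedged sketch of the general chaotic-confusion idea only; the paper's actual hybrid CMT (with its Lanczos-based component) is not reproduced here, and all names and parameters below are assumptions.

```python
import numpy as np

def chaotic_sequence(length, x0=0.7, r=3.99):
    """Logistic-map key stream x_{n+1} = r x_n (1 - x_n); an
    illustrative stand-in for the paper's chaotic transformation."""
    seq = np.empty(length)
    x = x0
    for i in range(length):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq

def confuse(block, x0=0.7):
    """Confusion phase: permute pixel positions using the ranking
    (argsort) of the chaotic sequence."""
    flat = block.ravel()
    perm = np.argsort(chaotic_sequence(flat.size, x0))
    return flat[perm].reshape(block.shape), perm

def deconfuse(cipher, perm):
    """Decryption: scatter the cipher pixels back to their positions."""
    flat = np.empty_like(cipher.ravel())
    flat[perm] = cipher.ravel()
    return flat.reshape(cipher.shape)
```

Because confusion only moves pixels, the histogram is unchanged; a receiver holding the key (`x0`, `r`) regenerates the same permutation and inverts it exactly.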
Fig.8 Lossless compression process using ALZW with IHC
The lossless compression is performed with the help of the Advanced LZW method together with Improved Huffman Coding (IHC), as illustrated in figure 8. The Huffman codes are reduced through the advanced LZW approach.
Fig.9 Decompressed image
The above compression processes were reversed to acquire the decompressed image, which is shown in figure 9.
Performance Measure
The performance of the suggested hybrid CMT with ALZW based compression algorithm is compared with the current ILZW and LZW based compression algorithms with respect to PSNR, compression ratio, execution time and MSE. The proposed system accomplishes a compression ratio of 1.98, which is 0.4 and 0.77 higher than the ILZW and LZW methods.
The MSE of the proposed system is 2.2 x 10^-3, which is 0.94 x 10^-3 and 1.72 x 10^-3 lower than the ILZW and LZW methods. The proposed hybrid CMT with ALZW method gives the highest PSNR result of 46 dB, which is 1 dB and 8 dB higher than the ILZW and LZW algorithms respectively. The execution time of the proposed hybrid CMT with ALZW system is 6.5 sec, which is 0.3 sec and 0.7 sec higher than that of the ILZW and LZW methods.
Compression Ratio (CR)
The compression ratio is calculated with the help of the following equation:
CR = (size of the original image) / (size of the compressed image)   (1)
The above equation determines the ratio between the size of the original image and the size of the compressed (encrypted) image.
Peak Signal to Noise Ratio (PSNR)
PSNR stands for Peak Signal to Noise Ratio and is computed as PSNR = 10 log10(255^2 / MSE). PSNR varies inversely with MSE: a small PSNR value means that the noise removal in the image has not given a good result.
Mean Square Error (MSE)
MSE is the cumulative squared error between the compressed and the real image. The formula is as follows:
MSE = (1 / MN) * sum_{x=1..M} sum_{y=1..N} [I(x, y) - I'(x, y)]^2   (2)
where I(x, y) is the real image, I'(x, y) is its estimated version (i.e., the decompressed image) and M, N are the dimensions of the images. A lower MSE value signifies less error and, by the inverse relation between MSE and PSNR, corresponds to a higher PSNR.
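The three measures above can be computed directly; the following NumPy sketch uses illustrative function names, with the standard 255 peak value for 8-bit images.

```python
import numpy as np

def compression_ratio(original_bytes, compressed_bytes):
    """CR = size of the original / size of the compressed data (Eq. 1)."""
    return original_bytes / compressed_bytes

def mse(original, reconstructed):
    """Mean squared error over an M x N image (Eq. 2)."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    return np.mean(diff ** 2)

def psnr(original, reconstructed, peak=255.0):
    """PSNR = 10 log10(peak^2 / MSE); higher means closer to the original."""
    err = mse(original, reconstructed)
    return float("inf") if err == 0 else 10 * np.log10(peak ** 2 / err)
```

For a truly lossless scheme the MSE is zero and the PSNR is infinite, which is why lossless pipelines are usually compared on CR and execution time instead.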
Fig.10 CR Comparison among all ETC Schemes
Figure 10 shows the comparison of CR for all ETC schemes; the number of images is plotted on the x-axis and CR on the y-axis. The proposed hybrid CMT with ALZW and the current methods were evaluated.
The proposed scheme acquires a higher CR than the ILZW and LZW schemes, because of the effectual preprocessing and encryption schemes.
Fig.11 MSE Comparison among all ETC Schemes
Figure 11 shows the comparison of MSE for all ETC schemes; the number of images is plotted on the x-axis and MSE on the y-axis. The proposed hybrid CMT with ALZW and the current methods were evaluated. The proposed scheme reaches a lower MSE than the ILZW and LZW schemes, because of the effectual preprocessing scheme.
Fig.12. PSNR Comparison among all ETC Schemes
Figure 12 shows the comparison of PSNR for all ETC schemes; the number of images is plotted on the x-axis and PSNR on the y-axis. The proposed hybrid CMT with ALZW and the current methods were evaluated. The proposed scheme attained a higher PSNR than the ILZW and LZW schemes, because of the effectual preprocessing and block segmentation schemes.
Fig.13 Execution Time Comparison among all ETC Schemes
Figure 13 shows the comparison of execution time for all ETC schemes; the number of images is plotted on the x-axis and execution time on the y-axis. The proposed hybrid CMT with ALZW and the current methods were evaluated. The proposed scheme attained a higher execution time when compared with the ILZW and LZW schemes.
V. CONCLUSION
A hybrid CMT with ALZW based ETC scheme has been suggested for images to enhance security in public networks. The input images were preprocessed through a median filter. With the help of vertical and horizontal segmentation, the de-noised image is segmented, or decomposed, into 4 blocks. Encryption is executed through a new block image encryption scheme based on the hybrid chaotic magic transformation approach.
Within the hybrid CMT, the Lanczos algorithm has been utilized to find eigenvectors and eigenvalues with low time complexity. The pixel shuffling process is executed by shifting the pixel positions to the right in the clockwise direction. The hybrid CMT has enhanced the image security.
At last, the ALZW using IHC has been enforced to compress the image to a great extent with respect to CR and PSNR. The proposed hybrid CMT with ALZW scheme achieves up to 50 percent compression relative to Huffman coding. The hybrid CMT with ALZW scheme
also results in an algorithm with significant running time, and its merits become most obvious for larger images. Since the reproduced image and the original image are equal, the ALZW scheme is a lossless compression scheme. Future work will concentrate on enhancing the compression ratio with the help of new techniques. The proposed technique can also be experimented on other varieties of data sets, such as audio, video and text, as until now it has been restricted to images.
REFERENCES
1. Y. Zhang, L.Y. Zhang, J. Zhou, L. Liu, F. Chen, X. He, A review of
compressive sensing in information security field, IEEE Access. 4
(2016) 2507–2519.
2. R. Huang, K.H. Rhee, S. Uchida, A parallel image encryption
method based on compressive sensing, Multimed. Tools Appl.
72(1) (2012) 71–93.
3. N. Zhou, A. Zhang, F. Zheng, L. Gong, Novel image compression–
encryption hy-brid algorithm based on key-controlled
measurement matrix in compressive sensing, Opt. Laser Technol.
62 (2014) 152–160.
4. N. Zhou, A. Zhang, J. Wu, D. Pei, Y. Yang, Novel hybrid image
compression—encryption algorithm based on compressive sensing,
Optik 125 (2014) 5075–5080.
5. N. Zhou, H. Li, D. Wang, S. Pan, Z. Zhou, Image compression and
encryption scheme based on 2D compressive sensing and fractional
Mellin transform, Opt. Commun. 343 (2015) 10–21.
6. N. Zhou, S. Pan, S. Cheng, Z. Zhou, Image compression–
encryption scheme based on hyper-chaotic system and 2D
compressive sensing, Opt. Laser Tech-nol. 82 (2016) 121–133.
7. A. Alfalou, C. Brosseau, N. Abdallah, M. Jridi, Assessing the
performance of a method of simultaneous compression and
encryption of multiple images and its resistance against various
attacks, IEEE Trans. Inf. Theory (2013) 167–175, submitted for
publication.
8. H.G. Zhu, C. Zhao, X.D. Zhang, A novel image encryption –
compression scheme using hyper-chaos and Chineseremainder
theorem, Signal Process., Image Commun. 28 (2013) 670–680.
9. A. Alfalou, C. Brosseau, Implementing compression and
encryption of phase-shifting digital holograms for three-
dimensional object reconstruction, Opt. Commun. 307 (2013) 67–
72.
10. X. Zhang, ―Lossy compression and iterative reconstruction for encrypted image,‖ IEEE Trans. Inf. Forensics Security, vol. 6, no. 1, pp. 53–58, Mar. 2011.
11. X. Zhang, G. Sun, L. Shen, and C. Qin, ―Compression of encrypted
images with multilayer decomposition,‖ Multimed. Tools Appl.,
vol. 78,no. 3, pp. 1–13, Feb. 2013.
12. X. Zhang, G. Feng, Y. Ren, Z. Qian, Scalable coding of encrypted
images, IEEE Trans. Image Process. 21(6) (2012) 3108–3114.
13. X. Zhang, G. Sun, L. Shen, C. Qin, Compression of encrypted
images with mul-tilayer decomposition, Multimed. Tools Appl. 72
(2014) 489–502.
14. X. Zhang, Y. Ren, L. Shen, Z. Qian, G. Feng, Compressing
encrypted images with auxiliary information, IEEE Trans.
Multimed. 16(5) (2014) 1327–1336.
15. J. Zhou, X. Liu, Oscar C. Au, Yuan Yan Tang, Design an efficient
encryption then compression system via prediction error clustering
and random permutation, IEEE Trans. Inf. Forensics Secur. 9(1)
(2014) 39–50.
16. C. Wang, J. Ni, Q. Huang, A new encryption-then-compression
algorithm us-ing the rate-distortion optimization, Signal Process.,
Image Commun. 39 (2014) 141–150.
17. A. Vaish, M. Kumar, Prediction error based compression of
encrypted images, in: ICCCT -2015, ACM Digital Library, 2015,
pp.228–232.
18. M. Kumar, A. Vaish, An efficient compression of encrypted images
using WDR coding, in: SocPros-2015, in: Springer Series Advances
in Intelligent Systems and Computing, vol.436, 2016, pp.729–741.
19. K. Zhang and J.-B. Fang, ―Color image encryption algorithm based
on TD-ERCS system and wavelet neural network,‖ Mathematical
Problems in Engineering, vol. 2015, Article ID 501054, 10 pages,
2015.
20. S. S. Askar, A. A. Karawia, and A. Alshamrani, ―Image encryption
algorithm based on chaotic economic model,‖ Mathematical
Problems in Engineering, vol. 2015, Article ID 341729, 10 pages,
2015
21. Kanso and M. Ghebleh, ―An efficient and robust image encryption scheme for medical applications,‖ Communications in Nonlinear Science and Numerical Simulation, vol. 24, no. 1–3, pp. 98–116, 2015.
22. Z. Hua, Y. Zhou, C.-M. Pun, and C. L. P. Chen, ―2D Sine Logistic
modulation map for image encryption,‖ Information Sciences, vol.
297, pp. 80–94, 2015.
23. K. Jasteazebski and Z. Kotulski, ―On improved image encryption scheme based on chaotic map lattices,‖ Engineering Transactions, vol. 69, no. 84, 2009.
24. J. B. Lima, F. Madeiro, and F. J. R. Sales, ―Encryption of medical
images based on the cosine number transform,‖ Signal Processing:
Image Communication, vol. 35, pp. 1–8, 2015.
25. H.-M. Chao, C.-M. Hsu, and S.-G. Miaou, ―A data-hiding technique with authentication, integration, and confidentiality for electronic patient records,‖ IEEE Transactions on Information Technology in Biomedicine, vol. 6, no. 1, pp. 46–53, 2002.
26. F. Cao, H. K. Huang, and X. Q. Zhou, ―Medical image security in a HIPAA mandated PACS environment,‖ Computerized Medical Imaging and Graphics, vol. 27, no. 2-3, pp. 185–196, 2003.
27. Z.Hua, Y. Zhou, C.-M. Pun, and C. L. P. Chen, ―2D Sine Logistic
modulation map for image encryption,‖ Information Sciences,vol.
297, pp. 80–94, 2015.
28. https://en.wikipedia.org/wiki/Lanczos_algorithm
29. http://www.TheLZWcompressionalgorithm.html
30. Badshah, G., Liew, S. C., Zain, J. M., & Ali, M. (2016). Watermark
compression in medical image watermarking using Lempel-Ziv-
Welch (LZW) lossless compression technique. Journal of digital
imaging, 29(2), 216-225.
31. N.Mahendiran, Dr.G.P.Ramesh Kumar, a block wise encryption
then compression of images based on scrambling-substitution of
pixels and improved LZW method, Journal of Advanced Research
in Dynamical and Control Systems Special Issue – 2 / 2017,
pp.480-497
32. L. Shammi, L. Shafnam, ―Network Clustering And Bootstrapping In Wireless Sensor Networks‖, International Journal Of Innovations In Scientific And Engineering Research, vol. 1, issue 7, 2014, pp. 391–396.
33. Dr G. Agila, Dhamayanthi Arumugam, ―A Study On Effectiveness Of Promotional Strategies At Prozone Mall With Reference To Visual Merchandising‖, International Journal Of Innovations In Scientific And Engineering Research, vol. 5, issue 6, 2018, pp. 47–56.
N. Mahendiran is working as Assistant Professor in the Department of Computer Science at Sri Ramakrishna College of Arts & Science (Formerly SNR Sons College), Coimbatore, Tamilnadu, India. He has 10 years of experience and is currently pursuing a Ph.D in Digital Image Processing. He has published more than 5 papers in National and International journals.
Dr. C. Deepa is currently working as Associate Professor in Information Technology at Sri Ramakrishna College of Arts & Science (Formerly SNR Sons College), Coimbatore, Tamilnadu, India. She has 16 years of experience in teaching and research. She has published 10 papers in International journals and has authored a book. Her areas of interest are Web mining, Networks and Communications.