Abstract
Image data play an important role in various real-time online and offline applications. The biomedical field has adopted imaging systems to detect, diagnose, and prevent several types of diseases and abnormalities. Biomedical imaging data contain a large amount of information and therefore require substantial storage space. Moreover, telemedicine and IoT-based remote health monitoring systems, in which data are transmitted from one place to another, are now widely deployed. Transmission of such large volumes of data consumes considerable bandwidth. In addition, attackers can intercept the communication channel during transmission and obtain important and secret information. Hence, biomedical image compression and encryption are considered the solution to deal with these issues. Several techniques have been presented, but achieving the desired performance for a combined module is a challenging task. Hence, in this work, a novel combined approach for image compression and encryption is developed. First, an image compression scheme using the wavelet transform is presented, and later a cryptography scheme is presented using confusion and diffusion operations. The outcome of the proposed approach is compared with various existing techniques. The experimental analysis shows that the proposed approach achieves better performance in terms of autocorrelation, histogram, information entropy, PSNR, MSE, and SSIM.
1 Introduction
Nowadays, tremendous growth is observed in the medical field. Recent advancements in this field have led to more accurate diagnosis, improvements in medical infrastructure, and solutions to health-related issues. These advanced diagnosis systems use numerous techniques such as surgery, vaccination, medications, and many more. In order to perform a diagnosis, medical experts require patient data, which can be in the form of text, images, or video. Currently, biomedical images are widely adopted by clinicians for diagnosis because these images are considered a source of rich medical information. These medical images are obtained using various imaging technologies such as radiography, ultrasound, X-ray, Magnetic Resonance Imaging (MRI), Computed Tomography (CT), photoacoustic imaging, and many others [1].
The growing number of imaging systems generates huge data volumes, increasing the size of digital data, especially still images, because high-quality images are desirable to diagnose and examine disease. Thus, large storage space is required to store these images. For illustration, in the 1990s, a CT scan of the thorax would contain 25 slices with a slice thickness of 10 mm, requiring around 12 MB of space. In the current scenario, such a scan requires 600 MB to several GB of storage [2]. Although data storage techniques in digital computers have improved dramatically, storage requirements remain a concern. Currently, telemedicine and e-health applications are emerging in which biomedical data are transmitted over wireless networks. Due to bandwidth limitations in these communications, faster data transmission is required while ensuring successful delivery [3]. The storage and bandwidth issues can be addressed by incorporating an efficient data compression approach [4]. In addition, medical images contain private patient information, including diagnostic and examination records. While these data are transmitted over a wireless channel, they can be stolen or modified by eavesdroppers; hence, maintaining data security is a prime task for hospitals and medical service organizations [5].
Several techniques have been presented during the last decade for image compression. Generally, image compression techniques are classified into two categories: lossy and lossless [4,5]. In lossy compression, a negligible amount of data loss occurs during reconstruction, whereas lossless compression schemes reconstruct the entire image without losing any data. Based on lossy compression, several techniques have been presented, such as the discrete cosine transform (DCT), discrete wavelet transform (DWT), vector quantization, and many more. On the other hand, lossless compression includes decorrelation-based methods such as set partitioning in hierarchical trees (SPIHT), embedded zerotree wavelet, and embedded block coding with optimized truncation, as well as entropy coding-based schemes such as run-length coding, Huffman coding, and Lempel-Ziv-Welch coding [6,7].
Similarly, several schemes have been developed to improve encryption and decryption mechanisms for data security, and these techniques have been adopted in the biomedical field. Encryption techniques are broadly classified into two categories: public (asymmetric) and private (symmetric) key cryptography [8]. Asymmetric key cryptography uses two different keys: a public key to encrypt the data and a private key to decrypt the data. On the other hand, symmetric cryptography schemes use the same key to encrypt and decrypt the data [9]. Public key cryptography includes the Rivest–Shamir–Adleman (RSA) [10] and Diffie–Hellman [11] algorithms. Similarly, private key cryptography techniques include the data encryption standard [12], the advanced encryption standard [13], and the Blowfish algorithm [14].
Researchers have suggested combining the compression and encryption phases to improve diagnosis using telemedicine systems. Several techniques have been presented based on this concept, such as grayscale JPEG image encryption and compression [15], combined encryption and lossless compression [16], chaos and compressive sensing for compression-encryption [17], and many more [18]. However, these techniques suffer from various performance-related issues such as poor reconstruction quality, computational complexity, low peak signal-to-noise ratio (PSNR), and long encryption and decryption times. In this work, these issues are considered and a combined scheme of image compression and encryption is developed. The first phase performs image compression and the second phase performs data encryption.
The rest of the manuscript is organized as follows: Section 2 describes the existing techniques of image compression and encryption. Section 3 presents the proposed combined model for biomedical image compression and encryption, where the first phase contains the image compression model and the second phase describes the image encryption technique. Section 4 presents the experimental analysis, where the outcome of the proposed approach is compared with existing techniques. Finally, Section 5 presents the concluding remarks and future scope of the work.
2 Literature survey
This section presents a brief review of the existing techniques for biomedical image compression, encryption, and combined compression-encryption schemes.
2.1 Image compression techniques
This section describes various existing techniques for medical image compression. Bruylants et al. [25] reported that JPEG 2000 is a promising technique for DICOM image compression; however, its performance can be improved for volumetric medical images. To take advantage of JPEG 2000, the authors presented a novel generic codec framework that supports the JP3D volumetric extension along with different types of wavelet transforms and intra-band prediction modes. Zuo et al. [26] focused on both lossless and lossy image compression. It is well known that lossy compression schemes lose some data, which is not suitable for medical images, whereas lossless schemes preserve the data but have low compression rates. Hence, the authors took advantage of both schemes and presented a region of interest (ROI)-aided compression scheme, where the ROI part of the image is compressed using a lossless approach and the non-ROI region is compressed using a wavelet-based lossy compression scheme.
Lone [27] reported that conventional lossless compression techniques suffer from computational complexity and require large amounts of memory to encode and decode the data; at lower bitrates, they also suffer from computational complexity issues. To overcome these issues, the author developed a coding algorithm that is efficient in compression, computational complexity, and memory. This coding scheme is similar to the wavelet block tree coding algorithm, which uses a spatial orientation block tree coding approach. In this scheme, the image is divided into 2 × 2 blocks, which are processed through block tree coding to obtain the redundancy information in the sub-band. Song et al. [28] presented a novel scheme based on irregular segmentation and region-based prediction to improve the performance of lossless compression. The first phase of this approach introduces a new scheme for adaptive irregular segmentation, obtained by combining geometry-adaptive and quadtree partitioning. Later, least-squares predictors are designed for the different regions and sub-blocks. Along with spatial correlation, this scheme utilizes local structure similarity to improve the reconstruction. Geetha et al. [29] reported that vector quantization (VQ) is widely adopted for image compression. Linde–Buzo–Gray (LBG) is the most widely used type of VQ, which compresses the image by constructing a locally optimal codebook. In this work, the authors treated codebook construction as an optimization problem, which is solved using a bio-inspired optimization technique.
2.2 Image encryption techniques
Hua et al. [19] presented a new encryption approach for medical images. The approach mainly consists of three phases: first, random data bits are generated and padded to the image; next, high-speed scrambling along with pixel diffusion is performed, which shuffles the neighboring pixels and spreads the padded data over the entire image. The pixel-adaptive diffusion uses two types of operations: bitwise XOR, which is highly efficient on hardware platforms, and modulo arithmetic, which achieves faster speed in software applications. Nematzadeh et al. [20] adopted an optimization scheme and presented a combination of a modified genetic algorithm and coupled map lattices for medical image encryption. In the first phase, this scheme applies a coupled map lattice to obtain secure cipher-images, which are used as the initial population for the genetic algorithm. In the next phase, the modified genetic algorithm (GA) is applied to enhance the entropy and minimize the computation time. Belazi et al. [21] developed an encryption method based on the combination of chaos and deoxyribonucleic acid (DNA)-based encryption for medical images. This scheme performs two encryption rounds driven by key generation, permutation, substitution, and diffusion operations. In addition, the secure hash algorithm (SHA)-256 hash function is applied to generate the secret key. The approach is composed of six stages, including permutation, substitution, encoding, decoding, and diffusion. Kumar et al. [22] introduced a novel approach using a chaotic map with the help of fractional discrete cosine transform (FrDCT) coefficients. The complete approach is divided into two phases: first, the FrDCT is applied to the image to generate the chaotic map, and later the FrDCT coefficients are obtained. Amirtharajan et al. [23] developed a hybrid scheme where chaotic maps are generated for the DICOM image by using the integer wavelet transform and are later fused with a DNA sequence in the spatial domain. A 3D Lorenz attractor helps to generate the chaotic map, and logistic maps help to generate the keys for encryption. Similar to the scheme reported in ref. [21], this scheme consists of several steps such as substitution, permutation, encoding, and decoding. Ding et al. [24] developed a deep-learning-based image encryption and decryption network. In this scheme, a cycle-generative adversarial network is trained to map the original image to a target domain, which realizes the encryption mechanism. The encrypted output image is stored in text form and processed through a reconstruction network to obtain the decrypted output.
2.3 Combined compression and encryption
This section discusses combined image compression and encryption techniques for biomedical images. Raja [30] focused on the development of a combined framework that uses a public key cryptography scheme for security and an encoding scheme for the compression of medical images stored in the cloud. Multiscale transform techniques are discussed for compression: the wavelet transform offers appropriate localization of data in the time and frequency domains, the curvelet transform can handle discontinuity curves, the bandlet transform helps to capture geometric regularity, and the contourlet transform helps to obtain smooth contours and edge information at different orientations. For encryption, the authors adopted the RSA algorithm. Ghaffari [31] introduced a joint compression-encryption technique using 2D sparse representation and a chaotic system. In the first phase, the input image is extended in the transform domain, which is used to obtain the sparse representation. This sparse representation is then scrambled with the help of chaotic confusion; the scrambling step ensures better security and improves the sparse recovery. Further, singular value decomposition is applied for compression and an XOR operation is performed to obtain the final encrypted data. Gan et al. [32] reported that the information entropy of compressed-sensing-based schemes is lower than 7, which makes them vulnerable to entropy attacks. To overcome this issue, the authors developed a compressed sensing and game of life (GOL)-based scheme. In the first phase, GOL-based scrambling is applied to shuffle the coefficient matrix of the plain image and a permutation matrix is formulated based on the rules of GOL. This matrix helps to reduce the pixel correlation and improves the scrambling. In the next phase, the confused matrix is compressed with the help of compressive sensing and further diffused using a key matrix to generate the cipher image.
Moreover, a 5D memristive hyperchaotic system is also used to generate the chaotic sequence. Zhang et al. [33] developed a joint scheme for image compression and encryption based on compressive sensing and Fourier transform. This scheme uses a chaos system and two-dimensional fractional Fourier transform to perform the encryption.
3 The proposed model
In this section, the proposed joint solution for compression and encryption of biomedical images is presented. The first part describes the proposed image compression module and the second part presents the proposed solution for encryption.
3.1 Overview of the proposed model
This approach is mainly focused on minimizing the data storage requirement through image compression and improving security by incorporating hashing and encryption mechanisms. The overall architecture of this approach is depicted in Figure 1. First, image compression is performed using the forward lifting wavelet scheme, which generates approximation and detailed coefficients. These coefficients are processed through the Huffman encoding phase, which generates the binary encoded sequence. Then, the proposed encryption standard is applied to this encoded sequence, which includes the SHA-256 algorithm for hash generation, chaotic map generation, and random sequence generation. At this stage, the compressed, encoded, and encrypted data are obtained and can be transmitted over a wireless channel. After receiving the data at the receiver end, data decryption, Huffman decoding, and image decompression steps are applied to reconstruct the image.
Lifting wavelet transform: this is a type of wavelet transform also known as the second-generation wavelet transform. It is used for designing wavelets and performing the DWT. This scheme factorizes the DWT into a series of elementary operations, called lifting steps. This simplifies the computation and reduces the number of arithmetic operations by roughly a factor of two.
Huffman coding: this is a well-known data compression scheme used to reduce the size of data without losing any bit or detail. It builds a tree based on the frequency of occurrence of symbols and then generates a code for each character.
SHA-256 hash algorithm: the SHA-256 algorithm is part of the SHA-2 family of algorithms. It generates a 256-bit output for the given data by performing a series of compression functions.
Chaotic map: this is a model based on chaos theory that is used to perform cryptographic operations. During this process, repeated rounds of encryption help to obtain the desired diffusion and spread the initial region over the entire space. The notations used throughout this article are listed in Table 1.
| Notations used | Description |
|---|---|
|  | Predict phase |
|  | Update phase |
|  | Filter coefficient for prediction phase |
|  | Filter coefficient for update phase |
|  | Detailed coefficient |
|  | Approximation coefficient |
|  | Hash value |
|  | Input for hash |
|  | Size of input matrix |
|  | Hash sequence |
|  | Constants for chaotic map; the values of these parameters are 35, 3, 12, 7, and 0.085 |
|  | Chaotic sequences |
|  | Transformed sequences |
|  | Input image for encryption |
|  | Initial vector for permutation; permutation output |
3.2 Image compression
In this section, the proposed image compression scheme is described. The proposed scheme is a hybrid of linear predictive coding (LPC), the discrete wavelet transform, and Huffman coding; together, these encoding and decoding schemes help to achieve lossless compression. According to this scheme, the input image is first processed through LPC [34], which generates a coded image that is then given as input to the DWT. In the next stage, the wavelet-decomposed image is processed through Huffman coding, where zigzag scanning is applied. This step generates the compressed data. Later, the compressed data are processed through Huffman decoding, the inverse DWT, and inverse LPC to reconstruct the image.
3.2.1 Wavelet transform
In this subsection, the implementation of the wavelet transform for the image compression phase is described. The wavelet transform decomposes the image into four sub-bands: LL, LH, HL, and HH. The DWT-based scheme uses the Haar filter in the lifting scheme with type-I filters, whose coefficients define the predict and update steps described below.
This scheme splits the input signal into two parts using the split function, giving even and odd data samples \(x_e[n] = x[2n]\) and \(x_o[n] = x[2n+1]\). The correlation between the odd and even samples is then exploited in the predict phase: the difference between the original and predicted samples, \(d[n] = x_o[n] - x_e[n]\), is known as the wavelet (detail) coefficient, and this factored computation is called the lifting scheme.
In the next step, the update phase is applied, where the even samples are updated based on the detail coefficients. This generates the scaling (approximation) coefficients, \(a[n] = x_e[n] + \lfloor d[n]/2 \rfloor\), which are passed to the next level for further processing.
After finishing these steps, the odd elements are replaced by the differences and the even elements by the averages. This approach yields integer coefficients, which makes the transform reversible. Accordingly, the reverse lifting scheme is applied to reconstruct the original signal, i.e., the inverse wavelet transform. The reverse operation has three steps: update, predict, and merge. Figure 3 shows the architecture of the reverse lifting scheme.
These operations simply invert the forward steps: the update is undone to recover the even samples, the predict is undone to recover the odd samples, and the merge step interleaves them. In the forward process, the samples are reduced by a factor of two at each level and the final step produces a single output. The input signal is processed through low-pass and high-pass filters simultaneously, and the output data are downsampled at each stage. The complete process is repeated until a single output is generated by combining the four output bands.
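As a concrete illustration, the forward and inverse integer lifting steps described above can be sketched as follows. This is a minimal sketch assuming the Haar predict/update pair (differences for the odd samples, integer-rounded averages for the even samples); integer floor division keeps the transform exactly reversible.

```python
def lift_forward(signal):
    # Split into even/odd samples, predict each odd sample from its even
    # neighbour (difference), then update the even samples toward the average.
    even, odd = signal[0::2], signal[1::2]
    detail = [o - e for e, o in zip(even, odd)]          # predict: detail coefficients
    approx = [e + d // 2 for e, d in zip(even, detail)]  # update: approximation coefficients
    return approx, detail

def lift_inverse(approx, detail):
    # Undo the update, undo the predict, then merge even/odd back together.
    even = [a - d // 2 for a, d in zip(approx, detail)]
    odd = [d + e for e, d in zip(even, detail)]
    merged = []
    for e, o in zip(even, odd):
        merged += [e, o]
    return merged

signal = [10, 12, 14, 20, 20, 22, 30, 31]
approx, detail = lift_forward(signal)
assert lift_inverse(approx, detail) == signal  # lossless round trip
```

Because each step stores only differences and integer-rounded averages, the inverse pass recovers the original samples exactly, which is the property the lossless pipeline relies on.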
3.2.2 Huffman coding
In this section, Huffman coding for image compression is discussed. Huffman coding is a widely adopted technique for lossless data compression. This scheme assigns variable-length codes to the input symbols, where the length depends on the frequency of each character in the input data stream: characters with high frequency are assigned shorter codes and characters with low frequency are assigned longer codes. These variable-length codes are known as prefix codes. The codes are assigned in such a way that the code of one character is never a prefix of the code of any other character, which ensures no ambiguity while decoding the data. The Huffman coding scheme contains two main steps: constructing the tree and assigning prefix codes to the characters.
Huffman encoding: In this phase, the Huffman tree is constructed from the unique characters along with their frequencies of occurrence. For each unique character, a leaf node is built. All leaf nodes are then inserted into a min-heap, which serves as a priority queue, so the node with the least frequency is always at the front. In each iteration, the two nodes with the minimum frequencies are extracted from the heap and assigned as the left and right children of a new internal node, whose frequency is the sum of the two; this new node is pushed back into the heap. This process is repeated until the heap contains only one node, which becomes the root of the tree.
Huffman decoding: The Huffman tree, which contains the complete character information, is shared with the receiver. The codes are defined by the tree structure: moving to a left child corresponds to a "0" and moving to a right child corresponds to a "1". The receiver reads the encoded bit stream and traverses the tree accordingly; whenever a leaf is reached, its character is emitted as decoded output and the traversal restarts from the root.
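The encoding and decoding steps above can be sketched compactly as follows; the node layout and tie-breaking used here are illustrative choices rather than the paper's exact construction.

```python
import heapq
from collections import Counter

def build_codes(data):
    # Min-heap of (frequency, tiebreak, tree); a tree is a symbol or a (left, right) pair.
    heap = [(f, i, s) for i, (s, f) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)               # two least-frequent nodes...
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, i, (t1, t2)))  # ...become children of a new node
        i += 1
    tree = heap[0][2]
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")   # left edge contributes "0"
            walk(node[1], prefix + "1")   # right edge contributes "1"
        else:
            codes[node] = prefix or "0"   # degenerate single-symbol case
    walk(tree, "")
    return tree, codes

def encode(data, codes):
    return "".join(codes[s] for s in data)

def decode(bits, tree):
    out, node = [], tree
    for b in bits:
        node = node[0] if b == "0" else node[1]
        if not isinstance(node, tuple):   # reached a leaf: emit and restart at the root
            out.append(node)
            node = tree
    return out

data = list("aaaabbc")
tree, codes = build_codes(data)
assert decode(encode(data, codes), tree) == data
```

Note that the most frequent symbol receives the shortest code, which is exactly the property that yields compression on skewed symbol distributions.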
3.3 Image encryption
This section presents the proposed solution for image encryption using chaotic maps, permutation, and diffusion operations. First, the preliminaries are described: the hash operation that maps data of arbitrary size to a fixed-size output, the hyper-chaotic sequence, and the random sequence generation used for encryption and decryption.
3.3.1 Hash function
A hash function takes data of arbitrary size and maps it to a fixed-size output. In this approach, the SHA-256 hash function is adopted to generate a 256-bit hash value from the input data. The resulting hash sequence is then replicated using a repmat operation to generate a large matrix matching the size of the input, which is used in the subsequent key generation.
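A minimal sketch of this step is given below, assuming the repmat-style tiling simply repeats the 32-byte digest until the target length is reached (the exact layout used in the paper may differ).

```python
import hashlib

def hash_key_stream(data: bytes, length: int) -> list:
    # SHA-256 maps arbitrary-size input to a fixed 256-bit (32-byte) digest.
    digest = hashlib.sha256(data).digest()
    reps = -(-length // len(digest))      # ceiling division: number of tiles needed
    return list(digest * reps)[:length]   # repmat-style tiling, trimmed to size

stream = hash_key_stream(b"compressed bitstream", 100)
assert len(stream) == 100 and all(0 <= v < 256 for v in stream)
```

Because the digest is deterministic, both ends of the channel derive the same key material from the same compressed data, while any change to the input produces a completely different stream.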
3.3.2 Chaotic map generation
The proposed approach uses a hyper-chaotic system and Chebyshev maps to accomplish the encryption process. A four-dimensional hyper-chaotic system with four initial conditions and four control parameters is adopted. With the parameter values a = 35, b = 3, c = 12, d = 7, and k = 0.085 listed in Table 1, the hyper-chaotic system can be defined as
\[\dot{x} = a(y - x) + w,\qquad \dot{y} = dx - xz + cy,\qquad \dot{z} = xy - bz,\qquad \dot{w} = yz + kw,\]
where x, y, z, and w are the state variables whose trajectories provide the four chaotic sequences. In addition, the Chebyshev map, \(x_{n+1} = \cos(k\,\arccos(x_n))\), provides a one-dimensional chaotic sequence.
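The four sequences can be generated by numerically iterating the system; the sketch below assumes the hyper-chaotic Chen form with the parameter values a = 35, b = 3, c = 12, d = 7, k = 0.085 from Table 1 and uses a simple Euler step in place of a proper ODE solver.

```python
def hyperchaotic_sequences(x, y, z, w, n, h=0.001,
                           a=35.0, b=3.0, c=12.0, d=7.0, k=0.085):
    # Iterate the 4D hyper-chaotic system n times with Euler step size h,
    # collecting one sample of each state variable per step.
    xs, ys, zs, ws = [], [], [], []
    for _ in range(n):
        dx = a * (y - x) + w
        dy = d * x - x * z + c * y
        dz = x * y - b * z
        dw = y * z + k * w
        x, y, z, w = x + h * dx, y + h * dy, z + h * dz, w + h * dw
        xs.append(x); ys.append(y); zs.append(z); ws.append(w)
    return xs, ys, zs, ws

xs, ys, zs, ws = hyperchaotic_sequences(0.1, 0.2, 0.3, 0.4, 1000)
assert len(xs) == 1000
```

The sensitivity to initial conditions is what makes the key stream hard to reproduce: trajectories started from even slightly different initial values diverge.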
Algorithm 1
Image compression and securing process
Step 1: Input the image and the simulation parameters.
Step 2: Obtain the wavelet bands of the input image using the lifting scheme.
Step 3: Identify the approximation and detailed coefficients.
Step 4: Apply Huffman encoding on the obtained coefficients.
Step 5: Generate the hash of these coefficients.
Step 6: Generate the chaotic map from the hashed data.
Step 7: Apply the encryption process as described in Algorithm 2.
Step 8: Initialize the reconstruction phase.
Step 9: Rearrange the random sequence.
Step 10: Rearrange the chaotic maps of the current image.
Step 11: Apply the de-hashing process to obtain the original information.
Step 12: Apply the inverse lifting process.
Step 13: Rearrange the wavelet bands to reconstruct the image.
3.3.3 Random sequence generation
In the chaotic sequence generation, the initial values of the chaotic system are used to generate the four sequences by iterating the hyper-chaotic system. These real-valued sequences are then transformed into integer-valued sequences by scaling and modular reduction, so that they can be used to index and mask image pixels.
The main idea of the proposed approach is to use a permutation- and diffusion-based scheme for image encryption. The chaotic sequence helps to shuffle the pixels. The permutation does not affect the pixel values but establishes a complicated statistical relationship between the cipher and the key; thus, attackers cannot infer the data. Similarly, diffusion is a process in which the plaintext data affect the bits of the ciphertext, which helps to improve the sensitivity.
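A minimal sketch of this idea is shown below: the chaotic sequence is quantized to integers, and its sort order defines an exactly invertible pixel shuffle. The function names and the scaling constant are illustrative, not the paper's.

```python
def to_integer_sequence(chaotic, modulus=256):
    # Scale each real chaotic value and reduce modulo the gray-level range.
    return [int(abs(v) * 1e14) % modulus for v in chaotic]

def permute(pixels, chaotic):
    # The sort order of the chaotic sequence defines the shuffle.
    order = sorted(range(len(pixels)), key=lambda i: chaotic[i])
    return [pixels[i] for i in order], order

def inverse_permute(shuffled, order):
    # Scatter each shuffled value back to its original index.
    out = [0] * len(shuffled)
    for pos, i in enumerate(order):
        out[i] = shuffled[pos]
    return out

chaos = [0.31, 0.92, 0.05, 0.47, 0.66]
pixels = [10, 20, 30, 40, 50]
shuffled, order = permute(pixels, chaos)
assert inverse_permute(shuffled, order) == pixels
```

Since the permutation only reorders pixel positions, histograms are unchanged by this step; the value-changing work is left to the diffusion stage.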
3.3.4 Encryption process
This section presents the proposed encryption approach, which comprises three steps. First, the chaotic sequence generation module is applied; next, the permutation module is applied; and the third phase consists of the diffusion process. The complete approach is described as follows:
Algorithm 2
Image encryption
Step 1: Consider an input image of size m × n and reshape it into a one-dimensional vector.
Step 2: Generate the chaotic sequences from the hyper-chaotic system.
Step 3: Perform the permutation operation on the image vector using the sort order of the chaotic sequence (Eqs. (11) and (12)). In order to prevent attacks on the scrambling sequence, an additional scrambling factor is introduced.
Step 4: Apply the combined confusion and diffusion steps to encrypt the element-wise data of the permuted vector (Eq. (13)).
Step 5: Encrypt the i-th element of the image vector as given by Eqs. (14) and (15).
3.3.5 Decryption process
In this section, the decryption process is presented to reconstruct the original image from the encrypted data. The complete decryption process is as follows:
Step 1: Initially, generate the required parameters, i.e., the same chaotic sequences and key stream used during encryption.
Step 2: Transform the ciphertext image into a one-dimensional vector and obtain the intermediate ciphertext by reversing the diffusion operation.
Step 3: Obtain the decrypted last element of the ciphertext vector using the previous ciphertext element and the key stream. This process is repeated from the last pixel to the first pixel until all elements of the sequence are obtained; the inverse permutation and decompression steps then reconstruct the original image.
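The chained structure can be illustrated with a simplified diffusion round trip; the combining rule below (modular addition with the key stream followed by XOR with the previous ciphertext element) is an assumption standing in for Eqs. (13)-(15), chosen to show why decryption must run from the last pixel back to the first.

```python
def diffuse(pixels, key, iv=0):
    # Each ciphertext element depends on the current pixel, the key stream,
    # and the previous ciphertext element (iv seeds the chain).
    cipher, prev = [], iv
    for p, k in zip(pixels, key):
        c = (p + k) % 256 ^ prev
        cipher.append(c)
        prev = c
    return cipher

def undiffuse(cipher, key, iv=0):
    # Walk the chain backwards: the previous ciphertext element is already
    # known, so each pixel can be recovered from the last element to the first.
    plain = []
    for i in range(len(cipher) - 1, -1, -1):
        prev = cipher[i - 1] if i > 0 else iv
        plain.append(((cipher[i] ^ prev) - key[i]) % 256)
    return plain[::-1]

key = [17, 203, 5, 91]
assert undiffuse(diffuse([12, 200, 34, 99], key), key) == [12, 200, 34, 99]
```

Changing any single plaintext pixel alters every subsequent ciphertext element, which is the sensitivity property the diffusion stage is designed to provide.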
4 Results and discussion
In this section, the experimental analysis of the proposed combined approach of image compression and encryption is presented. The experiments are conducted using MATLAB and Python tools installed on a Windows platform. This approach is tested on biomedical images of different modalities, namely ultrasound, CT, and MRI. For each modality, five images are considered. Ultrasound images are obtained from https://www.ultrasoundcases.info/cases/abdomen-and-retroperitoneum/, while CT and MRI images are obtained from https://www.osirix-viewer.com/. In order to evaluate the performance of the proposed approach, several analyses are performed, such as histogram analysis, autocorrelation of adjacent pixels, information entropy, number of pixels change rate (NPCR), unified average changing intensity (UACI), PSNR, and mean squared error (MSE).
4.1 System requirements
The proposed approach is simulated using the MATLAB simulation tool running on a Windows platform. The test machine has an NVIDIA graphics card with 4 GB of memory, 8 GB of RAM, 1 TB of storage, and an Intel Core i7 10th-generation processor.
4.2 Qualitative analysis by using histogram assessment
In this section, the histogram analysis of the proposed approach for different images is presented. The histogram of an image gives a graphical representation of its tone distribution. Similar, flattened histograms across encrypted images indicate better cryptography, whereas the histograms of different original images differ from each other. The deviation between the histogram amplitudes of the original and reconstructed images indicates the loss in quality during reconstruction. Figure 4 depicts the histogram analysis of image encryption.
Column 1 of Figure 4 shows the sample original images, which include three multimodal images, namely ultrasound, MRI, and CT. Column 2 shows the histogram of each image, which exhibits a right-skewed distribution, and column 3 shows the encrypted image data. We also performed the histogram analysis on the encrypted images, as depicted in column 4 of Figure 4. The decrypted images and their corresponding histograms are presented in columns 5 and 6 of Figure 4, respectively. The histogram of the original image shows a right-skewed distribution, whereas the histogram of the corresponding encrypted image shows a normal distribution, and the final reconstructed output shows a distribution similar to the original histogram.
Here, the chaotic map generation helps to randomize the distribution, which ensures the robustness of the encryption. Similarly, the quality of reconstruction depends on the fusion and reconstruction of the approximation and detailed coefficients, along with the Huffman coding.
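The assessment itself reduces to comparing 256-bin gray-level histograms; a small sketch with toy pixel lists (not the paper's images) is given below.

```python
def histogram(pixels, bins=256):
    # Count how many pixels fall into each gray level.
    h = [0] * bins
    for p in pixels:
        h[p] += 1
    return h

original = [100] * 90 + [101] * 10   # strongly peaked distribution (plain image)
encrypted = list(range(100))         # spread across many gray levels (cipher image)

assert max(histogram(original)) == 90   # one dominant bin
assert max(histogram(encrypted)) == 1   # evenly spread bins
```

A flatter cipher histogram means pixel-value statistics leak less information about the plain image, which is the qualitative criterion applied in Figure 4.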
4.3 Quantitative analysis
Further, we perform the quantitative analysis by considering different sets of images from three modalities: ultrasound, MRI, and CT. Figure 5 shows the sample images of these modalities, which are used for the autocorrelation and information entropy analyses.
4.3.1 Autocorrelation analysis
In this section, the autocorrelation analysis of adjacent pixels for different images is presented. For this analysis, horizontal (H), vertical (V), and diagonal (D) pixel pairs are considered, using the horizontal, vertical, and diagonal pixel matrices, respectively. For each analysis, five images from each category are considered. Table 2 shows a comparative analysis of the autocorrelation of five different ultrasound images. The obtained performance is compared with existing techniques, namely GA [35], grey wolf optimization (GWO) [36], and self-adaptive GWO [37].
| Images | Direction | Standard | GA [35] | GWO [36] | Adaptive GWO [37] | Proposed |
|---|---|---|---|---|---|---|
| Image 1 | H | −0.0082 | 0.0229 | 0.0194 | 0.0008 | 0.0006 |
| | V | 0.0025 | −0.0198 | −0.0179 | 0.0243 | 0.0202 |
| | D | 0.0325 | −0.0135 | 0.0007 | 0.0202 | 0.0158 |
| Image 2 | H | −0.0093 | 0.0089 | 0.0129 | −0.0302 | 0.0013 |
| | V | −0.0141 | 0.0078 | 0.0118 | 0.0145 | 0.0121 |
| | D | −0.0033 | 0.0268 | −0.0057 | −0.0037 | −0.0012 |
| Image 3 | H | −0.0251 | −0.0195 | 0.0051 | 0.0006 | 0.0001 |
| | V | −0.0036 | 0.0117 | −0.0116 | −0.005 | −0.0056 |
| | D | −0.0148 | −0.0115 | −0.0076 | −0.0192 | 0.0013 |
| Image 4 | H | −0.0028 | 0.0221 | −0.0121 | 0.0206 | 0.0255 |
| | V | −0.0006 | 0.0276 | 0.0084 | −0.0169 | −0.0152 |
| | D | 0.0166 | −0.0152 | 0.014 | −0.0023 | −0.2013 |
| Image 5 | H | −0.0095 | −0.0124 | 0.0066 | 0.005 | 0.0015 |
| | V | −0.0134 | −0.0365 | −0.0151 | 0.0121 | 0.0011 |
| | D | −0.019 | 0.0294 | 0.0001 | −0.0365 | −0.0251 |
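The metric reported in Tables 2-4 is the correlation coefficient between each pixel and its adjacent neighbour; a small sketch is given below, where the horizontal case pairs each pixel with its right-hand neighbour (the vertical and diagonal cases only change the pairing).

```python
def correlation(xs, ys):
    # Pearson correlation coefficient between two equal-length sample lists.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

row = [10, 12, 14, 16, 18, 20]          # smooth image row
horiz = correlation(row[:-1], row[1:])  # pair each pixel with its right neighbour
assert horiz > 0.99                     # plain images: strong neighbour correlation
```

Values near zero for the cipher image, as in the tables above, indicate that the neighbour structure of the plain image has been destroyed by the permutation and diffusion stages.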
The autocorrelation analysis measures the correlation between adjacent pixels of an image. Although the proposed approach performs several steps, such as lifting wavelet transform, Huffman encoding, SHA-256 hashing, and diffusion, the robustness of the lifting scheme helps to maintain the pixel quality, and SHA-256 helps to secure the information and facilitates appropriate reconstruction. Moreover, the enhanced diffusion scheme helps to enhance the quality of the image by merging the detailed coefficients. Similarly, different CT images are considered and the autocorrelation analysis of adjacent pixels is applied. The comparative analysis for CT images is presented in Table 3.
| Images | Direction | Standard | GA [35] | GWO [36] | Adaptive GWO [37] | Proposed |
|---|---|---|---|---|---|---|
| Image 1 | H | 0.017 | −0.0193 | 0.0078 | 0.0029 | 0.0001 |
| | V | −0.0421 | 0.0243 | −0.0204 | 0.0064 | 0.0025 |
| | D | −0.0053 | −0.0448 | −0.0088 | 0.015 | 0.0026 |
| Image 2 | H | 0.0065 | 0.0011 | 0.015 | 0.0063 | 0.0055 |
| | V | 0.0043 | 0.0002 | −0.0187 | 0.033 | 0.0257 |
| | D | 0.011 | −0.0106 | −0.0021 | −0.0237 | 0.0015 |
| Image 3 | H | −0.0291 | −0.0014 | 0.0289 | 0.0077 | 0.0056 |
| | V | −0.0275 | 0.0011 | 0.0126 | −0.0044 | −0.0024 |
| | D | 0.025 | 0.024 | 0.0066 | −0.0163 | −0.0117 |
| Image 4 | H | 0.0106 | 0.004 | −0.023 | 0.0056 | 0.0024 |
| | V | −0.0197 | 0.0045 | −0.0151 | 0.0094 | 0.001 |
| | D | 0.0038 | 0.0155 | −0.0113 | −0.0076 | −0.0125 |
| Image 5 | H | 0.0105 | −0.0011 | 0.0185 | 0.0235 | −0.0011 |
| | V | −0.006 | −0.0131 | −0.007 | −0.0127 | 0.0085 |
| | D | −0.0029 | −0.0042 | −0.0364 | −0.0081 | −0.0012 |
Further, the autocorrelation analysis for MRI images is performed. The obtained outcome is presented in Table 4.
Images | Standard | GA [35] | GWO [36] | Adaptive GWO | Proposed | |
---|---|---|---|---|---|---|
Image 1 | H | 0.0136 | −0.0118 | −0.0017 | 0.0064 | −0.0051 |
V | −0.0043 | −0.0179 | 0.0104 | −0.0204 | −0.0127 | |
D | −0.0095 | −0.0076 | 0.0045 | −0.002 | 0.0019 | |
Image 2 | H | −0.0122 | −0.0078 | −0.0062 | 0.023 | 0.0022 |
V | 0.0056 | 0.0122 | 0.0307 | 0.0036 | −0.0541 | |
D | −0.0177 | 0.0225 | 0.0222 | 0.025 | 0.0124 | |
Image 3 | H | 0.0086 | 0.0061 | −0.0026 | −0.0157 | −0.0129 |
V | 0.0314 | −0.0111 | −0.0137 | 0.0373 | 0.0019 | |
D | −0.0077 | 0.0164 | 0.0072 | 0.0176 | 0.0011 | |
Image 4 | H | 0.0106 | 0.004 | −0.023 | 0.0056 | 0.0024 |
V | −0.0197 | 0.0045 | −0.0151 | 0.0094 | 0.001 | |
D | 0.0038 | 0.0155 | −0.0113 | −0.0076 | −0.0125 | |
Image 5 | H | 0.0105 | −0.0011 | 0.0185 | 0.0235 | −0.0011 |
V | −0.006 | −0.0131 | −0.007 | −0.0127 | 0.0085 | |
D | −0.0029 | −0.0042 | −0.0364 | −0.0081 | −0.0012 |
The autocorrelation analysis shows that the proposed approach achieves better performance than the existing techniques in each of the horizontal, vertical, and diagonal directions.
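As an illustration, the adjacent-pixel correlation reported in the tables above can be computed as in the following minimal NumPy sketch. This is not code from the paper; the function name and the gradient/noise test images are our own choices:

```python
import numpy as np

def adjacent_pixel_correlation(img, direction="H"):
    """Correlation coefficient between pairs of adjacent pixels.

    direction: "H" (horizontal), "V" (vertical), or "D" (diagonal).
    A well-encrypted image should yield values close to 0.
    """
    img = np.asarray(img, dtype=np.float64)
    if direction == "H":
        x, y = img[:, :-1], img[:, 1:]
    elif direction == "V":
        x, y = img[:-1, :], img[1:, :]
    else:  # "D"
        x, y = img[:-1, :-1], img[1:, 1:]
    # Pearson correlation coefficient of the two pixel sequences
    return np.corrcoef(x.ravel(), y.ravel())[0, 1]

# A smooth gradient image has strongly correlated neighbours, while
# uniform random noise (a proxy for cipher output) does not.
rng = np.random.default_rng(0)
plain = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
cipher = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
print(adjacent_pixel_correlation(plain, "H"))   # close to 1
print(adjacent_pixel_correlation(cipher, "H"))  # close to 0
```

The near-zero values for the proposed scheme in Tables 2–4 correspond to the second case: neighbouring cipher pixels carry essentially no mutual information.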
4.3.2 Information entropy analysis
Here, the information entropy analysis is presented for images of the different modalities. Generally, the information entropy is computed as

H(m) = −∑_{i=0}^{255} p(m_i) log₂ p(m_i),

where p(m_i) denotes the probability of occurrence of the symbol m_i. For an 8-bit image, an ideally encrypted image attains an entropy close to the maximum value of 8. The obtained outcomes are presented in the table below.
Image modalities | Methods | Image 1 | Image 2 | Image 3 | Image 4 | Image 5 |
---|---|---|---|---|---|---|
Ultrasound | Standard | 7.951 | 7.956 | 7.956 | 7.958 | 7.96 |
GA [35] | 7.968 | 7.964 | 7.967 | 7.965 | 7.964 | |
GWO [36] | 7.966 | 7.965 | 7.965 | 7.964 | 7.965 | |
Adaptive GWO [37] | 7.965 | 7.967 | 7.965 | 7.967 | 7.966 | |
Proposed | 7.986 | 7.986 | 7.97 | 7.985 | 7.982 | |
CT | Standard | 7.959 | 7.955 | 7.962 | 7.957 | 7.95 |
GA [35] | 7.966 | 7.967 | 7.966 | 7.965 | 7.964 | |
GWO [36] | 7.967 | 7.966 | 7.968 | 7.966 | 7.965 | |
Adaptive GWO [37] | 7.967 | 7.968 | 7.968 | 7.968 | 7.968 | |
Proposed | 7.986 | 7.976 | 7.988 | 7.991 | 7.982 | |
MRI | Standard | 7.951 | 7.956 | 7.951 | 7.952 | 7.95 |
GA [35] | 7.966 | 7.965 | 7.967 | 7.969 | 7.965 | |
GWO [36] | 7.967 | 7.966 | 7.965 | 7.964 | 7.965 | |
Adaptive GWO [37] | 7.968 | 7.967 | 7.965 | 7.967 | 7.967 | |
Proposed | 7.982 | 7.995 | 7.991 | 7.986 | 7.979 |
4.3.3 Differential analysis
Generally, image encryption schemes are sensitive with respect to changes in the plain image, i.e., a minor change in the plain image affects the encryption performance. Differential analysis is a process in which an attacker makes a small change in the plain image and regenerates the encrypted image to observe how the cipher image is affected. This analysis is performed using the number of pixels change rate (NPCR) and the unified average changing intensity (UACI). Let C₁ and C₂ denote the two cipher images whose plain images differ in only one pixel. Then

NPCR = (1/(W × H)) ∑_{i,j} D(i, j) × 100%,
UACI = (1/(W × H)) ∑_{i,j} |C₁(i, j) − C₂(i, j)|/255 × 100%,

where W and H denote the width and height of the image, and D(i, j) = 1 if C₁(i, j) ≠ C₂(i, j) and 0 otherwise.
Image modalities | Attacks | NPCR | UACI | ||||||||
---|---|---|---|---|---|---|---|---|---|---|---|
Ultrasound | Methods | Image 1 | Image 2 | Image 3 | Image 4 | Image 5 | Image 1 | Image 2 | Image 3 | Image 4 | Image 5 |
Standard | 99.634 | 99.683 | 99.536 | 99.39 | 99.609 | 12.207 | 12.451 | 12.427 | 12.793 | 12.891 | |
GA [35] | 99.683 | 99.683 | 99.561 | 99.487 | 99.585 | 12.085 | 12.769 | 13.501 | 12.598 | 13.745 | |
GWO [36] | 99.756 | 99.634 | 99.609 | 99.487 | 99.707 | 12.207 | 13.281 | 12.28 | 12.329 | 11.23 | |
Adaptive GWO [37] | 99.561 | 99.463 | 99.658 | 99.756 | 99.634 | 12.524 | 12.744 | 12.891 | 11.572 | 13.403 | |
Proposed | 99.765 | 99.821 | 99.863 | 99.891 | 99.875 | 14.251 | 13.41 | 14.251 | 13.335 | 14.502 | |
CT | Standard | 99.512 | 99.585 | 99.536 | 99.39 | 99.609 | 12.354 | 12.891 | 12.427 | 12.793 | 12.891 |
GA [35] | 99.634 | 99.78 | 99.561 | 99.487 | 99.585 | 12.207 | 12.28 | 13.501 | 12.598 | 13.745 | |
GWO [36] | 99.39 | 99.634 | 99.609 | 99.487 | 99.707 | 12.72 | 12.744 | 12.28 | 12.329 | 11.23 | |
Adaptive GWO [37] | 99.731 | 99.536 | 99.658 | 99.756 | 99.634 | 12.891 | 12.915 | 12.891 | 11.572 | 13.403 | |
Proposed | 99.863 | 99.962 | 99.823 | 99.864 | 99.892 | 12.912 | 13.251 | 12.956 | 13.208 | 14.551 | |
MRI | Standard | 99.634 | 99.805 | 99.536 | 99.39 | 99.609 | 14.038 | 13.647 | 12.427 | 12.793 | 12.891 |
GA [35] | 99.512 | 99.683 | 99.561 | 99.487 | 99.585 | 12.744 | 12.744 | 13.501 | 12.598 | 13.745 | |
GWO [36] | 99.609 | 99.658 | 99.609 | 99.487 | 99.707 | 12.231 | 11.914 | 12.28 | 12.329 | 11.23 | |
Adaptive GWO [37] | 99.805 | 99.78 | 99.658 | 99.756 | 99.634 | 13.062 | 12.793 | 12.891 | 11.572 | 13.403 | |
Proposed | 99.127 | 99.868 | 99.921 | 99.789 | 99.863 | 14.587 | 14.621 | 13.251 | 12.881 | 13.852 |
The proposed SHA- and diffusion-based schemes help to reconstruct the images efficiently and also ensure resistance against differential attacks.
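The NPCR and UACI metrics used in the differential analysis above can be sketched as follows; this NumPy illustration is our own and is not code from the paper:

```python
import numpy as np

def npcr_uaci(c1, c2):
    """NPCR and UACI (in percent) between two cipher images of equal size."""
    c1 = np.asarray(c1, dtype=np.int64)
    c2 = np.asarray(c2, dtype=np.int64)
    npcr = 100.0 * np.mean(c1 != c2)                  # share of changed pixels
    uaci = 100.0 * np.mean(np.abs(c1 - c2) / 255.0)   # mean intensity change
    return npcr, uaci

# Two independent random images approximate the ideal behaviour of a cipher
# that is highly sensitive to a one-pixel change in the plain image.
rng = np.random.default_rng(1)
n, u = npcr_uaci(rng.integers(0, 256, (256, 256)),
                 rng.integers(0, 256, (256, 256)))
print(round(n, 2), round(u, 2))   # NPCR near 99.6, UACI near 33.5
```

For truly independent 8-bit images the expected NPCR is 100 × 255/256 ≈ 99.61% and the expected UACI is about 33.46%, which is the benchmark against which the tabulated values are judged.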
4.3.4 Comparative analysis for image compression
In this section, the comparative analysis for the image compression scheme is presented. Similar to the experiment presented for Figure 4, histogram analysis is performed for the considered medical images. The histograms of the original, compressed, and reconstructed ultrasound, CT, and MRI images are depicted in Figure 6.
Column 1 of Figure 6 depicts the sample original images, which include three multimodal images, namely, ultrasound, MRI, and CT; these images are used as input for compression. Column 2 shows the histogram of each image, which illustrates the normal distribution of the histogram, while column 3 shows the compressed image data. The histogram of the compressed image is presented in column 4, which shows a random distribution, from which it can be concluded that the detailed coefficients represent the most significant part of the image. Similarly, the reconstructed image and its histogram are depicted in columns 5 and 6.
The performance of the proposed approach is measured in terms of PSNR, MSE, and SSIM. These parameters can be computed as mentioned in Table 7.
Parameter | Formula |
---|---|
Mean square error | MSE = (1/(M·N)) ∑_{i=1}^{M} ∑_{j=1}^{N} (I(i, j) − Î(i, j))² |
Peak signal to noise ratio | PSNR = 10 log₁₀(255²/MSE) |
SSIM | SSIM(x, y) = ((2μ_x μ_y + c₁)(2σ_xy + c₂)) / ((μ_x² + μ_y² + c₁)(σ_x² + σ_y² + c₂)) |

where I and Î denote the original and reconstructed images of size M × N, μ_x and μ_y denote the mean intensities, σ_x² and σ_y² the variances, σ_xy the covariance of the two images, and c₁ and c₂ are small constants that stabilize the division.
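These three metrics can be sketched in NumPy as follows. This is our own illustration, not code from the paper; `ssim_global` is a simplified single-window variant of SSIM, whereas standard library implementations average SSIM over small local windows:

```python
import numpy as np

def mse(a, b):
    """Mean square error between two images."""
    a, b = np.asarray(a, dtype=np.float64), np.asarray(b, dtype=np.float64)
    return float(np.mean((a - b) ** 2))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB (infinite for identical images)."""
    e = mse(a, b)
    return float("inf") if e == 0.0 else 10.0 * np.log10(peak ** 2 / e)

def ssim_global(a, b, peak=255.0):
    """Single-window (global) SSIM; library implementations such as
    scikit-image instead average SSIM over small local windows."""
    a, b = np.asarray(a, dtype=np.float64), np.asarray(b, dtype=np.float64)
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    cov = np.mean((a - mu_a) * (b - mu_b))
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a ** 2 + mu_b ** 2 + c1) * (a.var() + b.var() + c2))

# A reconstruction that is off by exactly one intensity level everywhere
# has MSE = 1, so PSNR = 10*log10(255^2) ≈ 48.13 dB.
orig = np.tile(np.arange(256, dtype=np.float64), (64, 1))
recon = orig + 1.0
print(round(psnr(orig, recon), 2))   # 48.13
```

Higher PSNR and SSIM (and lower MSE) indicate better reconstruction, which is the sense in which Table 8 is read.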
The outcome of the proposed approach is compared with the existing schemes. Table 8 shows a comparative analysis for image compression.
Images | Technique | PSNR | MSE | SSIM |
---|---|---|---|---|
Ultrasound | SPIHT | 31.83780 | 42.5892 | 0.768457 |
DWT | 37.2111 | 12.3586 | 0.84581 | |
DCT | 37.502427 | 11.5568 | 0.8587 | |
Proposed approach | 47.13192 | 1.2586 | 0.95667 | |
MRI | SPIHT | 36.5017 | 14.5513 | 0.752314 |
DWT | 34.8624 | 21.2243 | 0.786128 | |
DCT | 32.5415 | 36.2178 | 0.89014 | |
Proposed approach | 44.0333 | 2.5689 | 0.983516 | |
CT | SPIHT | 34.73046 | 21.8793 | 0.755633 |
DWT | 33.0050 | 32.5516 | 0.798745 | |
DCT | 35.07131 | 20.2278 | 0.845519 | |
Proposed approach | 43.0480 | 3.2231 | 0.985941 |
The comparative analysis shows that the proposed approach achieves better performance when compared with the existing techniques in terms of PSNR, MSE, and SSIM.
5 Conclusion
In this work, the focus is on biomedical imaging, where it was identified that telemedicine diagnosis systems are currently widely adopted. In these systems, the data are transmitted to a remote location, which consumes considerable bandwidth, and the medical images also require huge storage space. Moreover, maintaining security during transmission is a prime task. Hence, this work presents a combined approach of data compression, to reduce the storage requirement, and encryption, to maintain security. The compression scheme is based on a hybrid approach of predictive coding, Huffman coding, and DWT, whereas the encryption scheme is based on a confusion and diffusion framework. The work presents an extensive experimental analysis, and the outcomes of the proposed approach are compared with existing schemes, showing that the proposed approach achieves better performance. The approach achieves PSNR values of 47.13, 44.03, and 43.04 dB for ultrasound, MRI, and CT images, respectively. However, this approach is tested only on 2D biomedical images; hence, 3D image processing and multispectral images remain challenging tasks that can be incorporated in future research.
- Funding information: No funds or grants were received by any of the authors.
- Author contributions: Latha H.R. and A. Ramaprasath contributed to the design and methodology of this study, the assessment of the outcomes, and the writing of the manuscript.
- Conflict of interest: There is no conflict of interest among the authors.
- Code availability: Not applicable.
- Ethical approval: This research does not involve clinical or animal samples and therefore does not require ethical approval.
- Data availability statement: All data generated or analyzed during this study are included in the manuscript.
References
[1] Palma-Chavez J, Pfefer TJ, Agrawal A, Jokerst JV, Vogt WC. Review of consensus test methods in medical imaging and current practices in photoacoustic image quality assessment. J Biomed Opt. 2021;26(9):090901. doi:10.1117/1.JBO.26.9.090901.
[2] Liu F, Hernandez-Cabronero M, Sanchez V, Marcellin MW, Bilgin A. The current role of image compression standards in medical imaging. Information. 2017;8(4):131. doi:10.3390/info8040131.
[3] Amri H, Khalfallah A, Gargouri M, Nebhani N, Lapayre JC, Bouhlel MS. Medical image compression approach based on image resizing, digital watermarking and lossless compression. J Signal Process Syst. 2017;87(2):203–14. doi:10.1007/s11265-016-1150-5.
[4] Hussain AJ, Al-Fayadh A, Radi N. Image compression techniques: A survey in lossless and lossy algorithms. Neurocomputing. 2018;300:44–69. doi:10.1016/j.neucom.2018.02.094.
[5] Kumari M, Gupta S, Sardana P. A survey of image encryption algorithms. 3D Res. 2017;8(4):37. doi:10.1007/s13319-017-0148-5.
[6] Rahman M, Hamada M. Lossless image compression techniques: A state-of-the-art survey. Symmetry. 2019;11(10):1274. doi:10.3390/sym11101274.
[7] Uthayakumar J, Vengattaraman T, Dhavachelvan P. A survey on data compression techniques: From the perspective of data quality, coding schemes, data type and applications. J King Saud Univ Comput Inf Sci. 2018;33:119–40.
[8] Henriques MS, Vernekar NK. Using symmetric and asymmetric cryptography to secure communication between devices in IoT. 2017 International Conference on IoT and Application (ICIOT). IEEE; 2017, May. p. 1–4. doi:10.1109/ICIOTA.2017.8073643.
[9] Jara-Vera V, Sánchez-Ávila C. Cryptobiometrics for the generation of cancellable symmetric and asymmetric ciphers with perfect secrecy. Mathematics. 2020;8(9):1536. doi:10.3390/math8091536.
[10] Jain M, Kumar A. RGB channel based decision tree grey-alpha medical image steganography with RSA cryptosystem. Int J Mach Learn Cybern. 2017;8(5):1695–705. doi:10.1007/s13042-016-0542-y.
[11] Mane YD, Khot UP. Detection and deactivation of application layer-based DDoS attack from private Tor network. Inventive communication and computational technologies. Singapore: Springer; 2021. p. 725–35. doi:10.1007/978-981-15-7345-3_62.
[12] Schneier B. Data encryption standard (DES). Applied cryptography, second edition: Protocols, algorithms, and source code in C; 2015. p. 265–301. doi:10.1002/9781119183471.
[13] Shakir HR. An image encryption method based on selective AES coding of wavelet transform and chaotic pixel shuffling. Multimed Tools Appl. 2019;78(18):26073–87. doi:10.1007/s11042-019-07766-z.
[14] Jawad LM, Sulong G. Chaotic map-embedded Blowfish algorithm for security enhancement of colour image encryption. Nonlinear Dyn. 2015;81(4):2079–93. doi:10.1007/s11071-015-2127-9.
[15] Chuman T, Sirichotedumrong W, Kiya H. Encryption-then-compression systems using grayscale-based image encryption for JPEG images. IEEE Trans Inf Forensics Security. 2018;14(6):1515–25. doi:10.1109/TIFS.2018.2881677.
[16] Kurihara K, Imaizumi S, Shiota S, Kiya H. An encryption-then-compression system for lossless image compression standards. IEICE Trans Inf Syst. 2017;100(1):52–6. doi:10.1587/transinf.2016MUL0002.
[17] Zhu S, Zhu C, Wang W. A novel image compression-encryption scheme based on chaos and compression sensing. IEEE Access. 2018;6:67095–107. doi:10.1109/ACCESS.2018.2874336.
[18] SerElkhetm S, Heshmat S. A survey study on joint image compression-encryption methods. 2020 International Conference on Innovative Trends in Communication and Computer Engineering (ITCE). IEEE; 2020, February. p. 222–6. doi:10.1109/ITCE48509.2020.9047777.
[19] Hua Z, Yi S, Zhou Y. Medical image encryption using high-speed scrambling and pixel adaptive diffusion. Signal Process. 2018;144:134–44. doi:10.1016/j.sigpro.2017.10.004.
[20] Nematzadeh H, Enayatifar R, Motameni H, Guimarães FG, Coelho VN. Medical image encryption using a hybrid model of modified genetic algorithm and coupled map lattices. Opt Lasers Eng. 2018;110:24–32. doi:10.1016/j.optlaseng.2018.05.009.
[21] Belazi A, Talha M, Kharbech S, Xiang W. Novel medical image encryption scheme based on chaos and DNA encoding. IEEE Access. 2019;7:36667–81. doi:10.1109/ACCESS.2019.2906292.
[22] Kumar S, Panna B, Jha RK. Medical image encryption using fractional discrete cosine transform with chaotic function. Med Biol Eng Comput. 2019;57(11):2517–33. doi:10.1007/s11517-019-02037-3.
[23] Amirtharajan R. A robust medical image encryption in dual domain: Chaos-DNA-IWT combined approach. Med Biol Eng Comput. 2020;58(7):1445–58. doi:10.1007/s11517-020-02178-w.
[24] Ding Y, Wu G, Chen D, Zhang N, Gong L, Cao M, et al. DeepEDN: a deep learning-based image encryption and decryption network for internet of medical things. IEEE Internet Things J. 2020;8:1504–18. doi:10.1109/JIOT.2020.3012452.
[25] Bruylants T, Munteanu A, Schelkens P. Wavelet based volumetric medical image compression. Signal Processing: Image Commun. 2015;31:112–33. doi:10.1016/j.image.2014.12.007.
[26] Zuo Z, Lan X, Deng L, Yao S, Wang X. An improved medical image compression technique with lossless region of interest. Optik. 2015;126(21):2825–31. doi:10.1016/j.ijleo.2015.07.005.
[27] Lone MR. A high speed and memory efficient algorithm for perceptually-lossless volumetric medical image compression. J King Saud Univ Comput Inf Sci. 2020;34:2964–74. doi:10.1016/j.jksuci.2020.04.014.
[28] Song X, Huang Q, Chang S, He J, Wang H. Lossless medical image compression using geometry-adaptive partitioning and least square-based prediction. Med Biol Eng Comput. 2018;56(6):957–66. doi:10.1007/s11517-017-1741-8.
[29] Geetha K, Anitha V, Elhoseny M, Kathiresan S, Shamsolmoali P, Selim MM. An evolutionary lion optimization algorithm-based image compression technique for biomedical applications. Expert Syst. 2021;38(1):e12508. doi:10.1111/exsy.12508.
[30] Raja SP. Joint medical image compression–encryption in the cloud using multiscale transform-based image compression encoding techniques. Sādhanā. 2019;44(2):28. doi:10.1007/s12046-018-1013-9.
[31] Ghaffari A. Image compression-encryption method based on two-dimensional sparse recovery and chaotic system. Sci Rep. 2021;11(1):1–19. doi:10.1038/s41598-020-79747-4.
[32] Gan Z, Chai X, Zhang J, Zhang Y, Chen Y. An effective image compression–encryption scheme based on compressive sensing (CS) and game of life (GOL). Neural Comput Appl. 2020;32(17):14113–41. doi:10.1007/s00521-020-04808-8.
[33] Zhang M, Tong XJ, Liu J, Wang Z, Liu J, Liu B, et al. Image compression and encryption scheme based on compressive sensing and Fourier transform. IEEE Access. 2020;8:40838–49. doi:10.1109/ACCESS.2020.2976798.
[34] Shirsat TG, Bairagi VK. Lossless medical image compression by integer wavelet and predictive coding. Int Sch Res Not. 2013;2013. doi:10.1155/2013/832527.
[35] Abdullah AH, Enayatifar R, Leeb M. A hybrid genetic algorithm and chaotic function model for image encryption. Int J Electron Commun (AEU). 2012;66:806–16. doi:10.1016/j.aeue.2012.01.015.
[36] Mirjalili S, Mirjalili SM, Lewis A. Grey wolf optimizer. Adv Eng Softw. 2014;69:46–61. doi:10.1016/j.advengsoft.2013.12.007.
[37] Koppu S, Viswanatham VM. Medical image security enhancement using two dimensional chaotic mapping optimized by self-adaptive grey wolf algorithm. Evolut Intell. 2018;11(1):53–71. doi:10.1007/s12065-018-0159-z.
© 2023 the author(s), published by De Gruyter
This work is licensed under the Creative Commons Attribution 4.0 International License.