Wavelets & Wavelet Algorithms
Data Compression, Edge Detection, & Blur Detection with 2D Fast Haar Wavelet Transform
Vladimir Kulyukin
www.vkedco.blogspot.com
Outline
● Data Compression
● Edge Detection
● Blur Detection
Data Compression
Data Compression
● In many situations some frequencies are not needed: they may originate from noise or carry more detail than required
● Such frequencies can be eliminated from the transformed data; applying the inverse FHWT then yields a different version of the data that can, in turn, be compressed more easily
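This zero-and-invert pipeline can be sketched in Python (a minimal sketch; the function names are mine, and the sample 3, 1, 0, 4, 8, 6, 9, 9 is the one used in Example 01 below):

```python
def fhwt(s):
    # In-place FHWT: each sweep stores the average (a+b)/2 in the pair's
    # first slot and the half-difference (a-b)/2 in the second slot.
    s, stride = list(s), 2
    while stride <= len(s):
        for i in range(0, len(s), stride):
            j = i + stride // 2
            s[i], s[j] = (s[i] + s[j]) / 2, (s[i] - s[j]) / 2
        stride *= 2
    return s

def ifhwt(s):
    # Inverse FHWT: undo the sweeps in reverse order via a+c and a-c.
    s, stride = list(s), len(s)
    while stride >= 2:
        for i in range(0, len(s), stride):
            j = i + stride // 2
            s[i], s[j] = s[i] + s[j], s[i] - s[j]
        stride //= 2
    return s

t = fhwt([3, 1, 0, 4, 8, 6, 9, 9])   # -> 5, 1, 0, -2, -3, 1, -1, 0 (as floats)
t[1::2] = [0] * 4                    # zero out the highest frequencies
restored = ifhwt(t)                  # -> 2, 2, 2, 2, 7, 7, 9, 9
```

The highest-frequency coefficients end up at the odd positions of the in-place result, which is why a single slice assignment removes them.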
Data Compression: Example 01
Apply the in-place FHWT to the sample 3, 1, 0, 4, 8, 6, 9, 9.

1st sweep: ((3+1)/2, (3-1)/2, (0+4)/2, (0-4)/2, (8+6)/2, (8-6)/2, (9+9)/2, (9-9)/2) = (2, 1, 2, -2, 7, 1, 9, 0);

2nd sweep: ((2+2)/2, 1, (2-2)/2, -2, (7+9)/2, 1, (7-9)/2, 0) = (2, 1, 0, -2, 8, 1, -1, 0);

3rd sweep: ((2+8)/2, 1, 0, -2, (2-8)/2, 1, -1, 0) = (5, 1, 0, -2, -3, 1, -1, 0).

In-Place FHWT result:
(a₀⁰, c₀², c₀¹, c₁², c₀⁰, c₂², c₁¹, c₃²) = (5, 1, 0, -2, -3, 1, -1, 0).
Data Compression Example 01: Removing Highest Frequencies
In-Place FHWT result:
(a₀⁰, c₀², c₀¹, c₁², c₀⁰, c₂², c₁¹, c₃²) = (5, 1, 0, -2, -3, 1, -1, 0).

Suppose we want to remove the highest frequencies. Recall that the highest frequencies are the coefficients computed first:
c₀², c₁², c₂², c₃² = 1, -2, 1, 0.
Data Compression Example 01: Removing Highest Frequencies
In-Place FHWT result: (5, 1, 0, -2, -3, 1, -1, 0).
Removing the highest frequencies by setting them to 0 results in the following transformed sample:
(5, 0, 0, 0, -3, 0, -1, 0).
Data Compression Example 01: Restoring Sample
Removing the highest frequencies from the In-Place FHWT result by setting them to 0 gives the transformed sample (5, 0, 0, 0, -3, 0, -1, 0).
Applying the in-place inverse FHWT to the above sample yields 2, 2, 2, 2, 7, 7, 9, 9.
Data Compression Example 01: Original & Restored Samples
Original sample: 3, 1, 0, 4, 8, 6, 9, 9. Restored sample: 2, 2, 2, 2, 7, 7, 9, 9.
The restored sample 2, 2, 2, 2, 7, 7, 9, 9 can be compressed to 2, 2, 7, 9.
Data Compression Example 02: Restoring Sample
In-Place FHWT result: (5, 1, 0, -2, -3, 1, -1, 0).
We can retain only the two coefficients with the largest magnitudes: -3 and -2. The result is
(5, 0, 0, -2, -3, 0, 0, 0).
Data Compression Example 02: Restoring Sample
Retaining only the two coefficients with the largest magnitudes, -3 and -2, gives (5, 0, 0, -2, -3, 0, 0, 0).
Applying the in-place inverse FHWT to this result yields 2, 2, 0, 4, 8, 8, 8, 8.
Data Compression Example 02: Original & Restored Samples
Original sample: 3, 1, 0, 4, 8, 6, 9, 9. Restored sample: 2, 2, 0, 4, 8, 8, 8, 8.
The restored sample 2, 2, 0, 4, 8, 8, 8, 8 can be compressed to 2, 0, 4, 8.
Data Compression Example 03: Retaining Highest Magnitude Wavelets
The 2D HWT is applied as a row-based 1D HWT followed by a column-based 1D HWT.

Step 1: Transform. One in-place 2D sweep of the original matrix

9 7 6 2
5 3 4 4
8 2 4 0
6 0 2 2

gives

6 1 4 1
2 0 0 1
4 3 2 1
1 0 0 1

Step 2: Rearrange, so that the averages occupy the top-left block and the horizontal, vertical, and diagonal wavelets occupy the other three blocks:

6 4 1 1
4 2 3 1
2 0 0 1
1 0 0 1

Step 3: Recurse on the top-left averages block.

Step 4: Transform the averages block: the row sweep gives
((6+4)/2, (6-4)/2; (4+2)/2, (4-2)/2) = (5, 1; 3, 1),
and the column sweep gives
((5+3)/2, (1+1)/2; (5-3)/2, (1-1)/2) = (4, 1; 1, 0).

Step 5: Return the result matrix:

4 1 1 1
1 0 3 1
2 0 0 1
1 0 0 1

We can retain the average and the two highest-magnitude wavelets: 4, 3, and 2.
Data Compression Example 03: Retaining Highest Magnitude Wavelets
Transformed matrix:

4 1 1 1
1 0 3 1
2 0 0 1
1 0 0 1

We can compress by retaining the average and the two highest-magnitude wavelets: 4, 3, and 2.

Restored matrix:

4 x  1 1 1 1     2 -2 0 0      0  0 0 0     6 2 4 4
     1 1 1 1  +  2 -2 0 0  +   0  0 0 0  =  6 2 4 4
     1 1 1 1     0  0 0 0      3  3 0 0     7 7 4 4
     1 1 1 1     0  0 0 0     -3 -3 0 0     1 1 4 4

Original matrix:

9 7 6 2
5 3 4 4
8 2 4 0
6 0 2 2
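A minimal Python sketch of the recursive 2D FHWT used in Examples 03 and 04 (row sweep, column sweep, rearrange into blocks, and recurse on the averages; the function name is mine):

```python
def haar2d(m):
    # One averaging/differencing sweep on rows, then on columns, with the
    # results rearranged so the averages form the top-left block; then
    # recurse on that block, as in the slides' Steps 1-5.
    n = len(m)
    if n < 2:
        return [row[:] for row in m]
    # row sweep: [averages | half-differences]
    rows = [[(r[2*i] + r[2*i+1]) / 2 for i in range(n//2)] +
            [(r[2*i] - r[2*i+1]) / 2 for i in range(n//2)] for r in m]
    # column sweep: averages on top, half-differences below
    cols = [[(rows[2*i][j] + rows[2*i+1][j]) / 2 for j in range(n)]
            for i in range(n//2)] + \
           [[(rows[2*i][j] - rows[2*i+1][j]) / 2 for j in range(n)]
            for i in range(n//2)]
    # recurse on the top-left block of averages
    a = haar2d([row[:n//2] for row in cols[:n//2]])
    for i in range(n//2):
        cols[i][:n//2] = a[i]
    return cols

t = haar2d([[9, 7, 6, 2],
            [5, 3, 4, 4],
            [8, 2, 4, 0],
            [6, 0, 2, 2]])
# t == [[4, 1, 1, 1], [1, 0, 3, 1], [2, 0, 0, 1], [1, 0, 0, 1]]
```

Running this on the Example 03 matrix reproduces the two-level transformed matrix shown above.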
Data Compression Example 04: Retaining Lowest Wavelets
Transformed matrix:

4 1 1 1
1 0 3 1
2 0 0 1
1 0 0 1

We can compress by retaining the average and the three lowest wavelets: 4, 1, 1, and 0.

Restored matrix:

4 x  1 1 1 1     1 1 -1 -1      1  1  1  1     6 6 4 4
     1 1 1 1  +  1 1 -1 -1  +   1  1  1  1  =  6 6 4 4
     1 1 1 1     1 1 -1 -1     -1 -1 -1 -1     4 4 2 2
     1 1 1 1     1 1 -1 -1     -1 -1 -1 -1     4 4 2 2

This result can be further compressed to

6 4
4 2

Original matrix:

9 7 6 2
5 3 4 4
8 2 4 0
6 0 2 2
Edge Detection
1D Theory
● Images can be viewed as sequences of 1D rows or 1D columns
● In grayscale images, each sequence has values in [0, 255], where 0 is black and 255 is white
● Edges can be detected by 1D Haar wavelets of a given frequency (e.g., every 2 consecutive pixels, every 4 consecutive pixels, etc.)
1D FHWT-Based Method
● Go through the image row by row & column by column
● Convert each row (or column) into an array of pixel values in [0, 255]
● Apply the 1D FHWT to that array
● Mark the individual pixels where row-based or column-based edges are detected with a given marker value
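The row scan above can be sketched with a single Haar sweep; the threshold and marker value here are illustrative choices, not values from the slides:

```python
def mark_row_edges(row, threshold=20, mark=255):
    # One 1D Haar sweep over a pixel row: a half-difference
    # |p[2i] - p[2i+1]| / 2 at or above the threshold marks pixels
    # 2i and 2i+1 as lying on an edge.
    out = [0] * len(row)
    for i in range(0, len(row) - 1, 2):
        if abs(row[i] - row[i + 1]) / 2 >= threshold:
            out[i] = out[i + 1] = mark
    return out

mark_row_edges([10, 10, 10, 200, 200, 200, 200, 200])
# -> [0, 0, 255, 255, 0, 0, 0, 0]
```

The same function applied to columns detects vertical edges; coarser frequencies (every 4, 8, ... pixels) come from repeating the sweep on the averages.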
2D FHWT-Based Method
● Select a window of size 2ⁿ x 2ⁿ, n > 0
● Iterate that window over the image
● Apply the 2D FHWT to the pixels covered by the window
● Mark regions covered by the window with sufficiently large horizontal, vertical, or diagonal wavelets as horizontal, vertical, or diagonal edges
Blur Detection
Theoretical Foundations
● Mallat & Hwang [1] argue that signals carry information via irregularities
● These researchers show that the local maxima of the wavelet transform detect the locations of irregularities
● For example, 2D HWT maxima indicate possible locations of edges in images
Indirect & Direct Blur Detection Methods
● Tong et al. [2] classify image blur detection methods into direct and indirect
● Indirect methods characterize image blur as a linear function IB = B * IO + N, where IO is the original image, B is an unknown blur function, N is a noise function, and IB is the resulting image after blur and noise are introduced
● Direct methods are based on the detection of distinct features computed directly in the images, e.g., corners, edges, color histograms, etc.
Edge Classification
(Edge-type classification figure from Tong et al. [2].)
Edge-Based Blur Detection
● Tong et al. [2] propose a direct method based on 2D Haar Wavelet Transform
● The main assumption is that the introduction of blur has different effects on the four main types of edges
● In blurred images, Dirac and A-Step edges are absent whereas G-Step and Roof edges lose their sharpness
● Images are classified as blurred on the basis of presence/absence of Dirac & A-Step edges
Another 2D HWT-Based Blur Estimation Method
● Another 2D HWT-based blur estimation method is presented in [3]
● It may not be necessary to detect any explicit features such as corners or edges
● Rather, it may be possible to detect regions with pronounced changes without explicitly computing the causes of those changes
● After those regions are detected, they can be combined into larger segments
Finding Regions with Pronounced Changes
● 2D HWT can be used to find image regions with pronounced horizontal, vertical, or diagonal changes
● A captured frame is divided into N x N windows (aka tiles) where N is an integral power of 2
● Border pixels at the right and bottom margins are discarded when the frame dimensions are not divisible by N
Tile Processing
● Each tile is processed by four iterations of the 2D HWT
● The number of iterations is a parameter and can be increased or decreased
● Each tile is represented by three numbers: horizontal change (HC), vertical change (VC), and diagonal change (DC)
● These values can be thresholded to retain only the tiles with large changes
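Tile processing can be sketched as follows; this is a single-sweep simplification (the slides use four HWT iterations), and which 2x2 pattern is labeled "horizontal" versus "vertical" is my assumption, not the slides':

```python
def tile_changes(tile):
    # One 2D Haar sweep over an N x N tile (N a power of 2). Returns the
    # mean absolute wavelet magnitudes (HC, VC, DC) for the tile.
    n = len(tile)
    hc = vc = dc = 0.0
    for i in range(0, n, 2):
        for j in range(0, n, 2):
            a, b = tile[i][j], tile[i][j + 1]
            c, d = tile[i + 1][j], tile[i + 1][j + 1]
            hc += abs((a - b + c - d) / 4)  # left vs right columns
            vc += abs((a + b - c - d) / 4)  # top vs bottom rows
            dc += abs((a - b - c + d) / 4)  # diagonal pattern
    k = (n // 2) ** 2
    return hc / k, vc / k, dc / k

tile_changes([[0, 10], [0, 10]])  # -> (5.0, 0.0, 0.0)
```

Thresholding the returned triple decides whether the tile counts as having a pronounced change.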
Tile Clustering
● After the tiles with pronounced changes are found, the depth-first search (DFS) algorithm is used to combine them into larger tile clusters
● DFS starts with an unmarked tile with a pronounced change and connects to it its immediate horizontal, vertical, and diagonal neighbors if they also have pronounced changes
● If such tiles are found, they are marked with the same cluster number and the search continues recursively
● After all tiles reachable from the current tile are found, the algorithm looks for another unmarked tile
● The algorithm terminates when no more unmarked tiles are found
Tile Cluster Filtering
● After the tile clusters are found, two cluster-related rules are used to classify a whole image as sharp or blurred
● The 1st rule uses the percentage of the total image area covered by the clusters
● The 2nd rule uses the number of tiles in each cluster to discard small clusters
● The 1st rule captures the intuition that sharp images have many tiles with pronounced changes
● The 2nd rule captures the intuition that small clusters should be discarded as irrelevant
Pseudocode
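The deck's pseudocode slide is not reproduced here; below is a minimal Python sketch of the DFS tile-clustering step described above, under the assumption that the thresholded tiles arrive as a boolean grid:

```python
def cluster_tiles(flags):
    # DFS tile clustering: flags[r][c] is True when tile (r, c) has a
    # pronounced change; connected tiles (horizontal, vertical, and
    # diagonal neighbors) receive the same cluster number.
    rows, cols = len(flags), len(flags[0])
    labels = [[0] * cols for _ in range(rows)]
    clusters = 0
    for r in range(rows):
        for c in range(cols):
            if flags[r][c] and labels[r][c] == 0:
                clusters += 1                      # new unmarked tile found
                stack = [(r, c)]
                while stack:                       # iterative DFS
                    i, j = stack.pop()
                    if (0 <= i < rows and 0 <= j < cols
                            and flags[i][j] and labels[i][j] == 0):
                        labels[i][j] = clusters
                        stack += [(i + di, j + dj)
                                  for di in (-1, 0, 1) for dj in (-1, 0, 1)
                                  if (di, dj) != (0, 0)]
    return labels, clusters
```

The algorithm terminates when no unmarked flagged tiles remain; the cluster labels and count then feed the two filtering rules above.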
Book References
● Y. Nievergelt. "Wavelets Made Easy." Birkhäuser, 1999.
● C. S. Burrus, R. A. Gopinath, H. Guo. "Introduction to Wavelets and Wavelet Transforms: A Primer." Prentice Hall, 1998.
● G. P. Tolstov. "Fourier Series." Dover Publications, Inc., 1962.
Paper References
[1] Mallat, S. and Hwang, W. L. "Singularity detection and processing with wavelets." IEEE Transactions on Information Theory, vol. 38, no. 2, March 1992, pp. 617-643.
[2] Tong, H., Li, M., Zhang, H., and Zhang, C. "Blur detection for digital images using wavelet transform." In Proceedings of the IEEE International Conference on Multimedia and Expo, vol. 1, pp. 27-30, June 2004. doi: 10.1109/ICME.2004.1394114.
[3] Kulyukin, V. and Andhavarapu, S. "Image Blur Detection with 2D Haar Wavelet Transform and Its Effect on Skewed Barcode Scanning." To appear in Proceedings of the 19th International Conference on Image Processing, Computer Vision, & Pattern Recognition (IPCV 2015), Las Vegas, NV, USA.