**Journal of Applied Mathematics and Physics**

Vol.07 No.03(2019), Article ID:91378,8 pages

10.4236/jamp.2019.73044

Gravel Image Auto-Segmentation Based on an Improved Normalized Cuts Algorithm

Chao Wang^{*}, Xiangliang Lin, Changsheng Chen

School of Information and Mathematics, Yangtze University, Jingzhou, China

Copyright © 2019 by author(s) and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).

http://creativecommons.org/licenses/by/4.0/

Received: January 8, 2019; Accepted: March 22, 2019; Published: March 25, 2019

ABSTRACT

The study of the grain-size distribution of gravels is an important and challenging issue in stratigraphy and morphology, especially in the field of automated measurement, which greatly reduces manual processing and time consumption and in which a precise segmentation method plays a very important role. In this study, a digital image method using an improved normalized cuts algorithm is proposed for the auto-segmentation of gravel images. The method adds grain-size estimation and uses a color-based feature vector. It achieves great improvements in many respects, especially in the accuracy of edge segmentation and in automation. Compared with manual measurement methods and other image-processing methods, the method studied in this paper is an efficient way to precisely segment gravel images.

**Keywords:**

Normalized Cuts, Gravel Image, Auto-Segmentation

1. Introduction

Sedimentology requires standardization of the size and shape of gravel. Whether the size of gravel can be effectively identified is closely related to the efficiency of grain-size distribution research. Establishing an outdoor laboratory consumes manpower and material resources [1]-[7]. Grain-size measurement methods based on photographic images have been proposed by many researchers [2] [3] [4] [6] [8] [9] [10] and have greatly improved the efficiency of grain-size measurement. However, many of these methods still lack measurement accuracy.

In recent years, many effective methods have been proposed. To circumvent the problem of laborious fieldwork, a refined automated grain sizing method was proposed for the automatic extraction of the grain-size distribution from digital photographs of a river-bed [11] [12]. Subsequently, a new automated image-processing algorithm was shown to identify sand grains in digital images acquired with a standard digital camera, without any extra hardware attached to it [13].

The normalized cuts method achieves high accuracy in image segmentation [14]. The idea of the method is to treat a picture as a weighted graph and to divide it into regions with the same characteristics (texture, color, lightness, etc.) [15]. The main focus of this study is to apply normalized cuts to extract grain-size data and to improve the algorithm. An automated normalized cuts method is proposed to extract grain-size data from gravel-bed images.

2. Methodology

2.1. The Normalized Cuts Framework

The normalized cut criterion is an unsupervised image segmentation technique proposed by Shi and Malik. It needs no initialization and has three main characteristics [15]: 1) it transforms the problem of image segmentation into a graph-partition problem; 2) it is a global criterion; 3) it maximizes the dissimilarity between different groups and the similarity within the same group.

A weighted undirected graph, $G=\left(V,E\right)$ , can be divided into two unconnected point sets A and B by deleting some edges, so that $A\cup B=V$ , $A\cap B=\varnothing $ . The degree of dissimilarity between the two parts can be defined as the sum of the weights of the deleted edges, i.e. of all edges that originally connected the two parts. In graph theory, this quantity is called the $Cut$ [15] :

$Cut\left(A,B\right)={\displaystyle \underset{i\in A,j\in B}{\sum}w\left(i,j\right)}$ (1)

where $w\left(i,j\right)$ is the weight of the edges of connection point $i$ and point $j$ , and it represents the degree of similarity between the two points.
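For concreteness, $Cut$ can be computed directly on a small toy graph; the weight matrix below is hypothetical, not derived from an image:

```python
import numpy as np

def cut_value(W, A, B):
    """Cut(A, B) of Equation (1): total weight of the edges between sets A and B."""
    return W[np.ix_(A, B)].sum()

# Hypothetical symmetric weight matrix for a 4-node graph: nodes {0, 1}
# and {2, 3} are strongly connected internally and weakly connected across.
W = np.array([[0.0, 0.9, 0.1, 0.0],
              [0.9, 0.0, 0.2, 0.1],
              [0.1, 0.2, 0.0, 0.8],
              [0.0, 0.1, 0.8, 0.0]])

print(cut_value(W, [0, 1], [2, 3]))  # cuts only the three weak cross edges
```

Separating the two natural clusters removes only the weak edges, so the cut value is small (0.4 here).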

A straightforward criterion for the optimal bipartition of a graph is to minimize the value of $Cut$ . However, the minimum-cut criterion favors cutting off small, isolated sets of nodes. To overcome this disadvantage, Shi and Malik proposed a new measure of dissimilarity between different groups, i.e. the normalized cut:

$Ncut(A,B)=\frac{cut(A,B)}{assoc(A,V)}+\frac{cut(A,B)}{assoc(B,V)}$ (2)

where $assoc\left(A,V\right)={\sum}_{i\in A,k\in V}{w}_{ik}$ is the total connection from the nodes in A to all the nodes in the graph, and likewise for $assoc\left(B,V\right)$ . Let $x$ be an indicator vector with ${x}_{i}=1$ if node $i$ is in A and ${x}_{i}=-1$ otherwise, and let ${d}_{i}={\sum}_{j}w\left(i,j\right)$ be the total connection from node $i$ to all other nodes. $Ncut$ can then be rewritten as:

$Ncut\left(A,B\right)=\frac{{\displaystyle {\sum}_{{x}_{i}>0,{x}_{j}<0}-{w}_{ij}{x}_{i}{x}_{j}}}{{\displaystyle {\sum}_{{x}_{i}>0}{d}_{i}}}+\frac{{\displaystyle {\sum}_{{x}_{i}<0,{x}_{j}>0}-{w}_{ij}{x}_{i}{x}_{j}}}{{\displaystyle {\sum}_{{x}_{i}<0}{d}_{i}}}$ (3)

Let $D=diag\left({d}_{1},{d}_{2},\cdots ,{d}_{N}\right)$ and let $W$ be the symmetric $N\times N$ matrix with $W\left(i,j\right)={w}_{ij}$ . The global optimization problem can then be simplified to:

${\mathrm{min}}_{x}Ncut(x)={\mathrm{min}}_{y}\frac{{y}^{T}(D-W)y}{{y}^{T}Dy}$ (4)

with ${y}_{i}\in \left\{1,-\frac{{\sum}_{{x}_{i}>0}{d}_{i}}{{\sum}_{{x}_{i}<0}{d}_{i}}\right\}$ and the constraint ${y}^{T}{\left[{d}_{1},\cdots ,{d}_{N}\right]}^{T}=0$ .

Relaxing $y$ to take on real values, the above expression can be minimized by solving the generalized eigenvalue system:

$\left(D-W\right)y=\lambda Dy$ (5)

It was proved that the eigenvector corresponding to the second smallest eigenvalue of this system is the real-valued solution to the normalized cuts problem; the solutions based on higher eigenvectors become unreliable. One key problem in applying the presented method is feature selection and the computation of the weights for setting up the graph. As mentioned earlier, color/intensity and texture features can be used for evaluating the image regions; intensities, color, spatial proximity and DOG (Difference of Gaussians) filters have been suggested for texture characterization. In this work we derive texture features from the orientation histograms of each scale level, and the weights of the graph take into account those obtained at higher scale levels.
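A minimal sketch of this spectral bipartition, assuming SciPy is available and using a hypothetical 4-node weight matrix (a real gravel image would yield one node per pixel, built in the same way):

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical weight matrix: two internally well-connected pairs of nodes.
W = np.array([[0.0, 0.9, 0.1, 0.0],
              [0.9, 0.0, 0.2, 0.1],
              [0.1, 0.2, 0.0, 0.8],
              [0.0, 0.1, 0.8, 0.0]])
d = W.sum(axis=1)   # d_i = sum_j w(i, j)
D = np.diag(d)

# Generalized eigenvalue system (D - W) y = lambda * D * y of Equation (5);
# eigh returns the eigenvalues in ascending order.
eigvals, eigvecs = eigh(D - W, D)

# The eigenvector of the second smallest eigenvalue is the real-valued
# solution; thresholding it at zero bipartitions the graph.
y = eigvecs[:, 1]
A = np.flatnonzero(y > 0)
B = np.flatnonzero(y <= 0)
print(sorted(A.tolist()), sorted(B.tolist()))
```

Thresholding the second eigenvector at zero recovers the two strongly connected groups {0, 1} and {2, 3}; for pixel graphs the same procedure is applied recursively to each segment.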

2.2. The Weights Defining

We define the edge weight ${w}_{ij}$ between nodes $i$ and $j$ as the product of a feature-similarity term and a spatial-location term:

${w}_{ij}={e}^{-\frac{{\Vert {F}_{i}-{F}_{j}\Vert}_{2}^{2}}{{\delta}_{I}^{2}}}\times \left\{\begin{array}{ll}{e}^{-\frac{{\Vert {X}_{i}-{X}_{j}\Vert}_{2}^{2}}{{\delta}_{X}^{2}}}\hfill & \text{if}\ {\Vert {X}_{i}-{X}_{j}\Vert}_{2}<r\hfill \\ 0\hfill & \text{otherwise}\hfill \end{array}\right.$ (6)

where ${X}_{i}$ is the spatial location of node $i$ , and here ${F}_{i}$ is the feature vector based on color at that node defined as:

${F}_{i}=[{v}_{i},{v}_{i}\cdot {s}_{i}\cdot \mathrm{sin}({h}_{i}),{v}_{i}\cdot {s}_{i}\cdot \mathrm{cos}({h}_{i})]$ (7)

where ${h}_{i}$ , ${s}_{i}$ , ${v}_{i}$ are the hue (H), saturation (S) and value (V) components of the pixel in the HSV color model.
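A sketch of Equations (6) and (7); the values of ${\delta}_{I}$ , ${\delta}_{X}$ and $r$ below are hypothetical placeholders, since the paper does not fix them here:

```python
import numpy as np

def color_feature(h, s, v):
    """Color feature vector F_i of Equation (7); h is the hue angle in radians."""
    return np.array([v, v * s * np.sin(h), v * s * np.cos(h)])

def edge_weight(Fi, Fj, Xi, Xj, delta_I=0.1, delta_X=4.0, r=5.0):
    """Edge weight w_ij of Equation (6): feature similarity times spatial proximity.

    delta_I, delta_X and r are hypothetical placeholder values.
    """
    if np.linalg.norm(Xi - Xj) >= r:
        return 0.0  # nodes farther apart than r are not connected
    feature_term = np.exp(-np.sum((Fi - Fj) ** 2) / delta_I ** 2)
    spatial_term = np.exp(-np.sum((Xi - Xj) ** 2) / delta_X ** 2)
    return feature_term * spatial_term

# Two adjacent pixels with identical color: the weight is governed by distance alone.
F = color_feature(h=0.0, s=0.5, v=0.8)
w = edge_weight(F, F, np.array([0.0, 0.0]), np.array([1.0, 0.0]))
print(w)
```

With identical features the similarity term is 1, so the weight reduces to the spatial term $e^{-1/{\delta}_{X}^{2}}$ .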

2.3. The Normalized Cuts for Gravel Segmentation

According to the characteristics of a gravel image, we can estimate the number of gravels in it. Taking a scan line from the gravel image, let $f$ be the gray value along the line and $x$ be the position; then $\frac{\partial f}{\partial x}$ is the rate of change of $f$ and $\frac{{\partial}^{2}f}{\partial {x}^{2}}$ indicates the acceleration of the change of the pixel value, as shown in Equation (8) and Equation (9). A specific example is shown in Figure 1.

$\frac{\partial f}{\partial x}=f\left(i+1\right)-f\left(i\right)$ (8)

$\frac{{\partial}^{2}f}{\partial {x}^{2}}=\frac{\partial \left(f\left(i+1\right)-f\left(i\right)\right)}{\partial x}=f\left(i+1\right)-2f\left(i\right)+f\left(i-1\right)$ (9)
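Equations (8) and (9) are ordinary first and second finite differences along a scan line. A small numeric sketch with made-up gray values shows how a gravel edge stands out:

```python
import numpy as np

# Gray values along one scan line of the image (hypothetical data with one
# bright region between two dark ones, i.e. two simulated gravel edges).
f = np.array([10, 12, 11, 60, 62, 61, 15, 14], dtype=float)

# First difference (Equation (8)): rate of change of the gray value.
df = f[1:] - f[:-1]

# Second difference (Equation (9)): acceleration of the change;
# large magnitudes flag the positions of gravel edges.
d2f = f[2:] - 2 * f[1:-1] + f[:-2]

print(df)
print(d2f)
```

The large-magnitude entries of the differences mark the jumps at the simulated edges, matching the behavior illustrated in Figure 1.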

Figure 1. The distribution pattern of pixel values ((a) gray image, (b) pixel values of the gray image, (c) the rate of change of the pixel values, (d) the acceleration of the change of the pixel values).

Figure 2. Original gravel images.

For the image of compactly distributed gravel, the gray value changes fastest at the edges of the gravel, as shown in Figure 2(d). The remarkable edge features of the gravel facilitate accurate statistics of the intercepted gravel lengths.

The algorithm may intercept the gravel at any position. The intercept may fall on an edge part, where its length is close to zero, or pass through the center, where its length can represent the length of the gravel. The intercepted length obtained by statistics is therefore related to the longest measured length of the gravel, but the two cannot be equated. Let $Ls$ be the intercepted length obtained by statistics, $n$ be the number of samples and $Le$ be the estimated value of the grain-size; $Le$ can then be obtained from $Ls$ as

$Le=\alpha \frac{{\displaystyle \underset{i=1}{\overset{n}{\sum}}L{s}_{i}}}{n}$ (10)

where $\alpha $ is the correlation coefficient between $Le$ and the mean of $Ls$ .
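A sketch of Equation (10); the intercepted lengths and the value of $\alpha$ below are hypothetical placeholders, not the paper's calibration:

```python
import numpy as np

def estimate_grain_size(Ls, alpha):
    """Equation (10): grain-size estimate Le = alpha * mean(Ls).

    alpha is the correlation coefficient between Le and the mean of Ls;
    the value used below is a hypothetical placeholder.
    """
    return alpha * np.mean(Ls)

# Hypothetical intercepted lengths (in pixels) collected from several scan lines.
Ls = [12, 30, 25, 4, 18, 22]
print(estimate_grain_size(Ls, alpha=1.5))
```

Averaging over many intercepts compensates for the lines that only clip the edge of a gravel, which is why $\alpha$ scales the mean upward.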

Traditional normalized cuts requires manually adjusting the related parameters. To obtain a better initial segmentation, our method calculates the related parameters automatically from the grain-size estimated in the previous step. The specific implementation of the automatic Ncut is described in the following steps:

1) Consider the image as an undirected graph $G=\left(V,E\right)$ and construct a Pixel Similarity Matrix (PSM). As stated before, each element of the PSM is the edge weight $w\left(i,j\right)$ and is calculated by Equation (6).

2) Solve Equation (5) for the eigenvectors with the smallest eigenvalues.

3) Use the eigenvector with the second smallest eigenvalue to bipartition the image by finding the splitting point such that its Ncut value is minimized.

4) Recursively re-partition the segments (go to step 2).

5) Exit when the number of segments reaches the specified threshold $Ne$ .

In this process, we can determine $Ne$ from the estimated grain-size:

$Ne=\left\lfloor \frac{Lw\cdot Lh}{\beta L{e}^{2}}\right\rfloor $ (11)

where $Lw$ is the width of the gravel image, $Lh$ is the height of the gravel image, $\beta $ is the shape coefficient of the gravel and $Le$ is the estimated value of grain-size.
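The threshold $Ne$ can thus be computed directly from the image dimensions and the estimated grain-size; the value of the shape coefficient $\beta$ below is a hypothetical placeholder:

```python
import math

def stopping_threshold(Lw, Lh, Le, beta=0.7):
    """Stopping threshold Ne = floor(Lw * Lh / (beta * Le**2)).

    Lw, Lh: image width and height in pixels; Le: estimated grain-size;
    beta: gravel shape coefficient (the 0.7 default is a hypothetical value).
    """
    return math.floor(Lw * Lh / (beta * Le ** 2))

# For the paper's 512 x 384 images and an assumed grain-size estimate of 40 px:
print(stopping_threshold(512, 384, 40))
```

Intuitively, $Ne$ approximates how many grains of area $\beta L{e}^{2}$ fit into the image, so the recursion stops once roughly one segment per gravel has been produced.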

3. Experimental Results and Analysis

In order to verify the effectiveness of the improved Ncut algorithm proposed in this paper, MATLAB is used for simulation; the size of the images in the simulation is 512 × 384. Images of gravel in different environments were captured and used in the experiments to demonstrate the effectiveness of our algorithm. For comparison with other algorithms, one class of gravel image is selected for processing.

3.1. The Results of Processing Different Kinds of Gravel Images

To show the effectiveness of our method, six types of gravel images are used in the experiments, as shown in Figure 2; the corresponding results are shown in Figure 3. Even without tuning the corresponding parameters, the segmentation effect is remarkable. For gravel images with clear edge texture and simple structure, the method obtains very good results: the segmentation follows the edges of the gravel closely and there is little over-segmentation (Figures 3(a)-(d)). Even for complex gravel images the results are acceptable, although the segmentation is not comprehensive and some over-segmentation occurs (Figure 3(e) and Figure 3(f)).

3.2. The Results of Different Methods

In order to verify the effectiveness of the proposed method, we conducted experiments on a large number of different types of gravel images. The method is compared with Marker-Controlled Watershed (MCW) [16], grain size measurement based on edge images (GSME) [13] and SLIC [17]. We chose one image for the experiment and comparison (Figure 4); the experimental results are shown in Figure 5.

Figure 3. The segmentation results of gravel images by using our method.

Figure 4. Original gravel image.

Figure 5. The segmentation results by using different methods, (a) MCW, (b) GSME, (c) SLIC, (d) our method.

It is clear that the edges are not accurate in the result of MCW, as shown in Figure 5(a). GSME can only obtain edges that are not closed, which makes measurement difficult. The result of SLIC looks good; however, there are some cases of over-segmentation and under-segmentation in its results. It is not difficult to see that our method performs well in this respect: the cases of over-segmentation and under-segmentation are greatly reduced.

4. Conclusion

Image segmentation plays an important role in quantitative research on grain-size distributions, which is of great significance to geology, hydraulics and ecology. Some current methods bring great improvements in time saving and measurement, which moves one step toward the efficiency and automation of various research fields. The study in this paper makes great improvements in many respects, especially in the accuracy of edge segmentation and in automation. However, when the Ncut algorithm is used to solve the eigenvalue system, the computational complexity is high. Many approaches can mitigate this problem; for example, combining Mean-Shift with Ncut shortens the processing time. The tradeoff between accuracy and time consumption remains a problem we need to be concerned about.

Acknowledgements

We are grateful to the anonymous reviewers for their constructive reviews of the manuscript, and to the editors for carefully revising it. This research is financially supported by the Hubei Provincial Department of Education (No. Q20181310) and the Open Fund of the Key Laboratory of Exploration Technologies for Oil and Gas Resources (Yangtze University), Ministry of Education (No. K2018-21). The supports are gratefully acknowledged.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

Cite this paper

Wang, C., Lin, X.L. and Chen, C.S. (2019) Gravel Image Auto-Segmentation Based on an Improved Normalized Cuts Algorithm. Journal of Applied Mathematics and Physics, 7, 603-610. https://doi.org/10.4236/jamp.2019.73044

References

- 1. Latulippe, C., Lapointe, M.F. and Talbot, T. (2001) Visual Characterization Technique for Gravel-Cobble River Bed Surface Sediments: Validation and Environmental Applications. Contribution to the Programme of CIRSA (Centre Interuniversitaire de Recherche sur le Saumon Atlantique). Earth Surface Processes and Landforms, 26, 307-318. https://doi.org/10.1002/1096-9837(200103)26:3<307::AID-ESP160>3.0.CO;2-R
- 2. Sime, L.C. and Ferguson, R.I. (2003) Information on Grain Sizes in Gravel-Bed Rivers by Automated Image Analysis. Journal of Sedimentary Research, 73, 630-636. https://doi.org/10.1306/112102730630
- 3. Graham, D.G., Reid, I. and Rice, S.P. (2005) Automated Sizing of Coarse Grained Sediments: Image Processing Procedures. Mathematical Geology, 37, 1-28. https://doi.org/10.1007/s11004-005-8745-x
- 4. Graham, D.G., Rice, S.P. and Reid, I. (2005) A Transferable Method for the Automated Grain Sizing of River Gravels. Water Resources Research, 41, W07020. https://doi.org/10.1029/2004WR003868
- 5. Warrick, J.A., Rubin, D.M., Ruggiero, P., Harney, J.N., Draut, A.E. and Buscombe, D. (2009) Cobble Cam: Grain-Size Measurements of Sand to Boulder from Digital Photographs and Autocorrelation Analyses. Earth Surface Processes and Landforms, 34, 1811-1821. https://doi.org/10.1002/esp.1877
- 6. Graham, D.G., Rollet, A.J., Piegay, H. and Rice, S.P. (2010) Maximizing the Accuracy of Image-Based Surface Sediment Sampling Techniques. Water Resources Research, 46, W02508. https://doi.org/10.1029/2008WR006940
- 7. Singer, S.B. (2008) A New Sampler for Extracting Bed Material Sediment from Sand and Gravel Beds in Navigable Rivers. Earth Surface Processes and Landforms, 33, 2277-2284. https://doi.org/10.1002/esp.1661
- 8. McEwan, I.K., Sheen, T., Cunningham, G.J. and Allen, A.R. (2000) Estimating the Size Composition of Sediment Surfaces through Image Analysis. Proceeding of the Institution of Civil Engineers, Water Maritime and Energy, 142, 189-195. https://doi.org/10.1680/wame.2000.142.4.189
- 9. Butler, J.B., Lane, S.N. and Chandler, J.H. (2001) Automated Extraction of Grain-Size Data from Gravel Surfaces Using Digital Image Processing. Journal of Hydraulic Research, 39, 519-529. https://doi.org/10.1080/00221686.2001.9628276
- 10. Lane, S.N. (2001) The Measurement of Gravel-Bed River Morphology. In: Mosley, M.P., Ed., Gravel-Bed Rivers V, Caxton Press, Christchurch, 291-338.
- 11. Chang, F.J. and Chung, C.H. (2012) Estimation of Riverbed Grain-Size Distribution Using Image-Processing Techniques. Journal of Hydrology, 440, 102-112. https://doi.org/10.1016/j.jhydrol.2012.03.032
- 12. Chung, C.H. and Chang, F.J. (2013) A Refined Automated Grain Sizing Method for Estimating River-Bed Grain Size Distribution of Digital Images. Journal of Hydrology, 486, 224-233. https://doi.org/10.1016/j.jhydrol.2013.01.026
- 13. Baptista, P., Cunha, T.R., Gama, C. and Bernardes, C. (2012) A New and Practical Method to Obtain Grain Size Measurements in Sandy Shores Based on Digital Image Acquisition and Processing. Sedimentary Geology, 282, 294-306. https://doi.org/10.1016/j.sedgeo.2012.10.005
- 14. Sun, F. and He, J.P. (2009) A Normalized Cuts Based Image Segmentation Method. 2nd International Conference on Information and Computing Science, 2, 333-336.
- 15. Shi, J. and Malik, J. (2000) Normalized Cuts and Image Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22, 888-905. https://doi.org/10.1109/34.868688
- 16. Parvati, K., Rao, P. and Mariya Das, M. (2008) Image Segmentation Using Gray-Scale Morphology and Marker-Controlled Watershed Transformation. Discrete Dynamics in Nature and Society, 2008, Article ID: 384346.
- 17. Achanta, R., Shaji, A., Smith, K., Lucchi, A., Fua, P. and Susstrunk, S. (2012) SLIC Superpixels Compared to State-of-the-Art Superpixel Methods. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34, 2274-2281.