COMPUTER ANIMATION AND VIRTUAL WORLDS
Comp. Anim. Virtual Worlds 2008; 19: 515–526
Published online 6 August 2008 in Wiley InterScience (www.interscience.wiley.com) DOI: 10.1002/cav.237

3D virtual simulator for breast plastic surgery

By Youngjun Kim*, Kunwoo Lee and Wontae Kim

We propose novel 3D virtual simulation software for breast plastic surgery. Our software comprises two processes: 3D torso modeling and virtual simulation of the surgery result. First, image-based modeling is performed to obtain a female subject's 3D torso data. Our image-based modeling method utilizes a template model, which is deformed according to the patient's photographs. For the deformation, we apply Procrustes analysis and radial basis functions (RBF). To enhance realism, the subject's photographs are mapped onto the mesh. Second, from the modeled subject data, we simulate the subject's virtual appearance after plastic surgery by morphing the shape of the breasts. We solve the simulation problem with an example-based approach: the subject's virtual shape is obtained from the relations between paired sets of feature points taken from previous patients' photographs before and after surgery. Copyright © 2008 John Wiley & Sons, Ltd.

Received: 25 June 2008; Accepted: 25 June 2008

KEY WORDS: 3D human modeling; image-based modeling; breast surgery; simulation

Introduction

Recently, plastic surgeries have attracted substantial interest. Among them, breast augmentation is one of the most commonly performed cosmetic surgical procedures. According to the American Society of Plastic Surgeons, breast augmentation is the top surgical procedure, and the number of surgeries continues to increase in the United States.1 Since these breast plastic surgeries are actively performed, patients need a new type of simulation software to view the results prior to surgery. Preoperative simulation software can be used as a communication tool for (1) minimizing doctor–patient misunderstandings, (2) finding the optimal way of operation, (3) relieving the patient's fear of the operation, and (4) selecting the most desired figure. Surgeons can also use the simulation tool to persuade patients to undergo the operation. Meanwhile, in the computer graphics field, many investigations on 3D human modeling and its applications have been undertaken. Various computer graphics technologies involving 3D human models

*Correspondence to: Y. Kim, Human-Centered CAD Lab, Seoul National University, Shilim 9-Dong, Gwanak-Gu, Seoul, Korea. E-mail: [email protected]

have been introduced and employed in many other industries. Once these technologies are leveraged, we can successfully implement a 3D simulator for breast plastic surgeries. From this viewpoint, as well as the demand from a breast implant company and some surgeons, we have attempted to develop a realistic and intuitive 3D simulator for breast augmentation surgery. We introduce our 3D human modeling and virtual simulation methods in the remainder of this paper.

Related Work

• 2D breast surgery simulator
Many 2D simulation programs have already been developed.2–4 Image manipulation software provides tools that can be used to warp, stretch, shrink, and smooth features to make cosmetic changes to digital images of the human face and body. Although such software is used by many cosmetic clinics, the results lack realism and are mostly unnatural; these are no more than image-editing programs that collect functions suitable for simulation.

• 3D breast modeler and simulator
Seo et al.5 proposed a breast modeler based on the analysis of breast scans. Given a set of 28 scanned breasts, their breast modeler can be used to generate



the breast shape using principal component analysis (PCA). The subject's shape is obtained by inputting attribute parameters. This is an efficient way of modeling breasts; however, it needs abundant 3D scan data of nude female breasts to obtain a feasible result. Moreover, they did not consider photo-mapping onto the reconstructed model.

Balaniuk et al.6 implemented a 3D breast simulator using the radial elements method (REM), a type of finite element method. They simulated the deformation and dynamic motion of a breast and obtained good results, but they used a 3D scanner to acquire the subject's body data. Further, they simulated only one side of the breast, excluding the other sections of the torso.

INAModel7 is a breast visualization tool available on the Internet. It interactively shows a 3D representation of the female body model, which can be manipulated to simulate the results of a breast augmentation procedure with various shapes and sizes. The biggest drawback of INAModel is that it provides simulation using a prepared model instead of the actual subject: the subject feels that the virtual model is different from her, which detracts from realism.

• 3D human body modeling
Many research groups have presented various modeling methods for developing a 3D human body. A considerable amount of work has been undertaken on blending exemplar data to generate new data. In many of these studies, PCA is used to efficiently analyze the statistical data. Allen et al.8 presented a method for human body shape modeling for reconstruction and parameterization from range scans. Thalmann et al.9 and Seo and Thalmann10 also developed a data-driven modeling method for realistic virtual human models. Another data-driven approach is SCAPE, developed by Anguelov et al.11 They built a human shape model that spans variations in both subject shape and pose. Sloan et al.12 presented a methodology for efficient run-time interpolation between multiple forms; their system yields a continuous range of shapes from example shapes, and they applied shape interpolation to articulated arms and human face models.

• Image-based modeling
Another approach to 3D human body modeling is image-based modeling (Lee et al.,13 Park et al.,14 Ansari and Abdel-Mottaleb,15 Zhang et al.,16 Seo et al.17). The advantage of this method is that it does not need any expensive hardware. Generally, it reconstructs the 3D model directly from orthogonal 2D images by leveraging 3D template data. So far, an image-based modeling method specific to the breasts has not been developed.

• Linear combination model
Blanz and Vetter18 introduced a morphable face model for reconstructing a new face and manipulating it according to changes in certain facial attributes. They modeled a new face by forming linear combinations of 200 scanned face models. Annotated facial attributes are used to define the shape and texture vectors, which are added to or subtracted from the face. Hwang et al.19,20 also utilized the linear combination model, applying it to reconstruct 2D face data from partially damaged images or from a small number of feature points. Based on their idea, we use statistical prediction based on exemplar cases.

Overview

Our software has two main parts: a 3D modeler of a female torso and a virtual simulator of the person's shape after breast surgery. Figure 1 shows an overview of our software.

Three-dimensional scanning is the optimal method to obtain the patient's model data. However, a 3D scanner is prohibitively expensive for use in a general clinic. Therefore, we adopted image-based 3D modeling. Such a modeling method requires several orthogonal photographs of the subject. Then, according to the input feature points on the photographs, a template 3D model is morphed using our deformation methods. Detailed explanations of our modeling methods are given in the next section.

From the reconstructed patient's model, virtual simulation of the post-surgery shape is performed. We implemented this simulation using an example-based method. Feature points of the breast are moved to

Figure 1. Overview of the entire process. (The diagram shows photographs of the subject with input feature points and a 3D body template with defined feature points feeding global deformation, local deformation, and photo-mapping to yield the 3D reconstructed body; a patient database of before- and after-surgery data then drives the example-based simulation that produces the virtually simulated body.)


new positions by learning from the exemplar database. Then, all the mesh vertices are mapped by interpolating the moved feature points. To obtain a statistically derived result, we utilized photographs of 30 patients before and after plastic surgery. The following section presents our example-based simulation methods, and the remaining sections discuss the results of our work.

Image-Based 3D Torso Body Modeling

Template Model

Generally, in image-based modeling, a 3D template model is commonly used in order to compensate for the lack of information from 2D images. The template model requires a typical and representative shape. We used a commercial modeling tool, Maya 7.0,21 to make the 3D template model. Thereafter, we deformed the template model according to the relations between the feature points of the template model and those from the images. In this paper, we denote the template feature points as PT and the feature points calculated from the images as PC. We assigned the positions of PT on the 3D template by manual selection. Several template models were prepared, and the most appropriate one is selected among them.

Feature Point

Feature points are necessary and important prerequisites for our image-based modeling method. We referred to Lee et al.'s22 definition, excluding some points and defining additional points for our purpose. The number of feature points is 10 for each of the left and right breasts. For the abdominal part, we defined 18 feature points: six points each for the left, right, and central waist curves. Additional points such as the navel, armpit, and shoulder are also defined. Table 1 and Figure 2 show the definitions of the feature points.

In our image-based 3D torso modeling, we use one front-view and two side-view photographs of the subject. We need both left- and right-view images in order to reconstruct asymmetrical breasts. A back-view photograph is optionally added, if needed, for photo-mapping. We define the feature points from these orthogonal images. A white or blue background is recommended because we detect the body curves via skin detection.

After taking the photographs of the subject, we assign feature points on them. All of these points, or each point individually, can be scaled and translated easily using our GUI. Some

FeatPt     Definition
P0         Upper breast point (UBP), same y-coord. as LAP on C_bf & C_bs
P1         Mid-point of P0 & P2 on C_bs
P2         Bust point (BP) on C_bf & C_bs
P3         Mid-point of P2 & P4 on C_bs
P4         Bottom breast point (BBP) on C_bf & C_bs
P5         Mid-point of P0 & P6 on C_bs
P6         Outermost point on C_bf
P7         Mid-point of P6 & P4 on C_bf
P8         Mid-point of P4 & P9 on C_bf
P9         Inner breast point (IBP) on C_bf
FNP        Front neck point
SP         Shoulder point (SP (LSP, RSP)), same y-coord. as FNP
CSP        Center of shoulder point (CSP (LCSP, RCSP)) on C_bf, mid-point of FNP & SP in x-dir.
OBP        Outer breast point (OBP (LOBP, ROBP)) on C_bf, mid-point of P0 & P4 in y-dir.
WP1        Same y-coord. as P4, on C_w or C_c
WP2–WP5    Equally dividing points between WP1 and WP6 in y-dir. on C_w or C_c
WP6        Same y-coord. as NP, on C_w or C_c
FCP        Front center point, mid-point of LBP9 & RBP9
NP         Navel point, the same point as WP6 on C_c

Table 1. Definition of feature points. I_f, front-view image; I_s, side-view image; C_bf, breast curve in I_f; C_bs, breast curve in I_s; C_w, waist curve in I_f; C_c, belly curve in I_s.


Figure 2. Definition of feature points (R, right; L, left; C, center).

points are constrained by the definitions listed in Table 1, and many of them are automatically positioned. For example, the right shoulder point (RSP) and left shoulder point (LSP) have the same y-coordinates as the front neck point (FNP); their coordinates in the front-view image are set at the outermost positions of the body at the FNP's y-level. Among the feature points, the front center point (FCP) cannot be seen from the side view; therefore, the z-position of this feature point is set on the side image using the user's estimate.

The feature points P0, P2, and P4 in the front-view image are assumed to lie on the same plane in the 3D coordinate system. These points are also defined in the side view. Internally, the side-view images are normalized according to the height of the breast, from P0 to P4, in the front-view image. All the feature points' coordinates are calculated from the normalized orthogonal images: their x-coordinates are obtained from the front image, their z-coordinates are computed from the side image, and the y-coordinates are common to the front and side images.
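The coordinate assembly described above can be sketched as follows. The function name and arguments are illustrative assumptions, not the authors' implementation; only the rule itself (x and y from the front view, z from the side view rescaled by the P0–P4 breast height) comes from the text.

```python
import numpy as np

def feature_point_3d(front_xy, side_zy, front_breast_height, side_breast_height):
    """Combine one feature point's 2D image coordinates into a 3D point.

    front_xy: (x, y) position in the front-view image.
    side_zy:  (z, y) position in the side-view image.
    The side view is rescaled so its breast height (P0 to P4) matches
    the front view, as the normalization step describes.
    """
    scale = front_breast_height / side_breast_height  # normalize the side view
    x, y = front_xy                  # x- and y-coordinates come from the front image
    z = side_zy[0] * scale           # z-coordinate comes from the rescaled side image
    return np.array([x, y, z], dtype=float)
```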

WP1–WP6 are detected automatically using skin detection. We use the YCbCr color space for skin detection, collating only the pixels whose Cb and Cr values lie within the skin color range. We handle only the Cb and Cr values because we are interested in the components invariant to illumination intensity, which facilitates skin detection. This detection process leaves many small holes in the interior of the body.

Figure 3. Skin detection and feature points on the front- and left-view images.

These holes act as obstacles in finding the body's curves; therefore, we fill the holes that are smaller than a threshold size proportional to the image size. Figure 3 shows the skin detection result and the input feature points.
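As a rough sketch of this Cb/Cr skin-detection step: the paper does not give its exact thresholds, so the ranges below are commonly used values and are an assumption, as is the BT.601 conversion.

```python
import numpy as np

def skin_mask(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Classify pixels as skin using only the Cb/Cr chrominance channels.

    rgb: H x W x 3 uint8 image. Y (luma) is computed implicitly but never
    tested, which is what makes the rule insensitive to illumination intensity.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # ITU-R BT.601 RGB -> Cb/Cr conversion (the Y channel is deliberately unused)
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
```

A blue background, as recommended in the text, falls far outside the Cb/Cr skin range, which is why it separates cleanly from the body.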

Global Deformation

In the global deformation step, we compute the affine transformation matrix to match the template model to


the feature points calculated from the images as closely as possible. The affine transformation includes translation, rotation, and scaling. To minimize the sum of squared errors between PC and PT, we use Procrustes analysis.23 This method is computationally simple and stable. Global deformation is an auxiliary step that yields good initial conditions for local deformation.

\min E(S, R, T) = \sum_{i=1}^{n} (p_{C,i} - p_{T,i})^2    (1)

where n is the number of feature points for the breast part, p_{C,i} is the ith feature point calculated from the orthogonal images, p_{T,i} is the ith feature point of the template model after global deformation, P_T = (p_{Tx}, p_{Ty}, p_{Tz})^T = S \cdot R (p_{T0x}, p_{T0y}, p_{T0z})^T + T, P_{T0} denotes the template model's feature points before global deformation, S is the scaling factor, R the rotation matrix, and T the translation vector.

The procedure of the global deformation is as follows:

(a) Normalize P_C and P_{T0}.
(b) Let A = P_C^T \cdot P_{T0}.
(c) Compute the singular value decomposition of A: [L, D, M] = \mathrm{SVD}(A), so that A = L D M^T.
(d) Compute the rotation matrix R = M L^T.
(e) Compute the scaling factor:

    S = \sum \mathrm{diagonal}(D) \times \|P_C\|_2 / \|P_T\|_2

(f) Compute the translation vector:

    T = \mathrm{mean}(P_C) - S \, \mathrm{mean}(P_{T0}) \, R

(g) Transform all of the 3D template model's vertices using S, R, and T.

An example global deformation result is shown inFigure 4.

Figure 4. Before and after global deformation (dot, PT; cross, PC).
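Steps (a)–(g) correspond to the classical similarity-transform Procrustes solution, which can be sketched as follows (row-vector convention; an illustrative implementation under that assumption, not the authors' code):

```python
import numpy as np

def procrustes_align(PC, PT0):
    """Fit scale S, rotation R, translation T so that PC ≈ S * PT0 @ R + T.

    PC, PT0: n x 3 arrays of corresponding feature points (image-derived
    targets and template points). Follows the SVD-based steps (a)-(g).
    """
    muC, muT = PC.mean(axis=0), PT0.mean(axis=0)
    C, T0 = PC - muC, PT0 - muT               # (a) center both point sets
    A = C.T @ T0                              # (b) cross-covariance matrix
    L, D, Mt = np.linalg.svd(A)               # (c) A = L diag(D) M^T
    R = Mt.T @ L.T                            # (d) rotation, R = M L^T
    S = D.sum() / (T0 ** 2).sum()             # (e) least-squares scale factor
    T = muC - S * muT @ R                     # (f) translation vector
    return S, R, T                            # (g) apply to all template vertices
```

Applying the returned S, R, T to every template vertex realizes step (g).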

Local Deformation

After global rigid deformation roughly aligns the template model to PC, local deformation refines it and generates a more realistic model. For this process, an interpolation function is estimated, and the remaining points are mapped using this function. Among interpolation methods, radial basis functions (RBF) are a powerful technique for interpolation in a multidimensional space.24–26 The basis function R(d) depends only on the distance from the feature points, which is why it is called radial. The RBF interpolant is constructed as a linear combination of the basis functions, and the known feature-point correspondences are then used to determine the coefficients of the basis functions.

f(p) = \sum_{i=1}^{n} w(i) R(\|p - p(i)\|)    (2)

where n is the number of feature points, f(p) is the transformed vertex after local deformation, w(i) are the coefficients of the basis functions, p(i) are the model's feature points, and p is each vertex of the generic model (the input point).

The most common RBF, R(d), are as follows:

R(d) = d (linear)
R(d) = d^3 (cubic)
R(d) = d^2 \log(d) (thin-plate spline)
R(d) = (d^2 + c^2)^{\mu/2} (multiquadric)
R(d) = e^{-d^2/c^2} (Gaussian)

We investigated these RBF and selected the most appropriate one:

R(d) = e^{-d^2/50^2}    (3)

To find the coefficients of the basis functions w(i), we use Equation (3) and set f(P_T) equal to P_C.

p_{Cx}^{j} = f_x(p_{Tx}^{j}) = \sum_{i=1}^{n} w_x(i) R(\|p_{Tx}^{j} - p(i)\|)    (4)

where p_{Cx}^{j} is the x-coordinate of the jth calculated feature point and p_{Tx}^{j} is the x-coordinate of the jth template model's feature point. Because we have n pairs of p_{Cx}^{j} and p_{Tx}^{j}, we can solve Equation (4) and obtain the coefficients of the basis functions:

[w_x(1) \; w_x(2) \; \ldots \; w_x(n)]^T = H^{-1} [p_{Cx}^{1} \; p_{Cx}^{2} \; \ldots \; p_{Cx}^{n}]^T    (5)

where H is the matrix of RBF values.

............................................................................................Copyright © 2008 John Wiley & Sons, Ltd. 519 Comp. Anim. Virtual Worlds 2008; 19: 515–526

DOI: 10.1002/cav

Page 6: 3D virtual simulator for breast plastic surgery

Y. KIM, K. LEE AND W. KIM...........................................................................................

Figure 5. Before and after local deformation from several different views (dot, PT; cross, PC).

Using the same steps mentioned above, we also deform along the y- and z-directions. We apply the RBF interpolation to all the vertices of the template mesh data and finally obtain the data morphed according to the subject. Figure 5 shows an example local deformation result.
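The local deformation of Equations (2)–(5) can be sketched as below, using the Gaussian basis with c = 50 from Equation (3). One deliberate deviation, stated as an assumption: the sketch interpolates the displacement P_C − P_T rather than absolute positions, which agrees exactly at the feature points but leaves vertices far from all features unmoved.

```python
import numpy as np

def rbf_warp(PT, PC, vertices, c=50.0):
    """Deform mesh vertices so feature points PT (n x 3) land on PC (n x 3).

    Solves H w = (PC - PT) for the RBF coefficients (Equation (5)), with
    R(d) = exp(-d^2 / c^2); the x-, y-, and z-displacements are
    interpolated independently, as in Equation (4).
    """
    def basis(A, B):
        # pairwise squared distances -> Gaussian RBF values
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / c ** 2)

    H = basis(PT, PT)                    # n x n matrix of RBF values
    W = np.linalg.solve(H, PC - PT)      # coefficients of the basis functions
    return vertices + basis(vertices, PT) @ W
```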

Photo-Mapping

For realism, the photographs of the subject are mapped onto the deformed mesh data. Because we have already assigned feature points on the images, the texture coordinates of each vertex can be easily calculated. The main issue in texture mapping is how to blend the three or four orthogonal images. We use the normal direction of the facets to determine the blending ratio. Here, let us assume that we use the front-, left-, and right-view images for texture mapping. The front- and side-view images are orthogonal; therefore, the blending ratio is calculated using the normal vector's z-component. The left-view image should be used only for the left part (x > 0), and vice versa for the right part. Equation (6) details our texture-image blending method.

(i) x > 0 (i.e., left part): if n_x > 0, \alpha_f = n_z, \alpha_l = 1 - n_z; otherwise, \alpha_f = 1, \alpha_l = 0.
(ii) x < 0 (i.e., right part): if n_x < 0, \alpha_f = n_z, \alpha_r = 1 - n_z; otherwise, \alpha_f = 1, \alpha_r = 0.    (6)

where n_x and n_z are the x- and z-components of the normal vector, respectively, and \alpha_f, \alpha_l, \alpha_r are the blending ratios of the front-, left-, and right-view images used in texture mapping, respectively.

Example-Based Simulation

The ultimate goal of this study is to obtain a virtually simulated result of the breast augmentation surgery. To accomplish this, we implemented an example-based simulation algorithm. In this section, we describe a virtual breast surgery simulation method that learns from actual surgical cases.

Exemplar Data Preparation

Our simulation algorithm utilizes the photographs of patients who have already undergone breast augmentation surgery. With the support of plastic surgeons, we accumulated the patient data: photographs of three orthogonal views of each patient were taken before surgery and about 2 weeks after surgery, and 30 patients participated. We prepared a database of the sets of feature-point pairs from these data. Among the feature points defined in the previous sections, only the feature points of the breast part are needed in the morphing process for the simulation. We have computed the 3D coordinates of


the feature points, which are manually positioned on the photographs, as described earlier. The feature-point sets in the database are normalized by the height of each breast, from P0 to P4, respectively.

Basic Assumption and Procedures

To leverage the database of surgical cases, we assume that the subject's virtually simulated appearance will be similar to that of patients with similar breast figures. We can estimate the post-surgery breast shape on the assumption that it follows the results of the patients in the database: if a subject had exactly the same figure as one of the exemplars, she would have the same result as that exemplar's surgical result. Each person's individual taste is neglected.

Since we reconstructed our 3D model using the feature points, we can easily morph the model by changing the coordinates of the feature points. This is why we prepared the exemplar database with photographs rather than 3D scan data: three-dimensional scans of patients are much more difficult to gather than photographs. After we move the feature points to their virtually simulated positions, all of the mesh vertices in the breast segments are moved accordingly. After morphing using our example-based simulation, additional modification is possible with our detailed morphing tools. Throughout the simulation process, morphing is performed by applying the RBF interpolation method to the template model with the newly changed feature points.

Linear Combination of Exemplar Data

The main idea of the simulation is that any subject can be expressed as a linear combination of the exemplar data. Blanz and Vetter18 presented a morphable model for face modeling, and many others have implemented the linear combination model for various applications.8,19,27 Motivated by these works, we express a subject's feature-point set P as a linear combination of the pre-surgery exemplar data P_i in the database.

P = \sum_{i=1}^{m} \alpha_i P_i, \qquad \sum_{i=1}^{m} \alpha_i = 1    (7)

where m is the number of exemplar data in the database, P is the combination model's feature-point set before surgery, P_i is the ith patient's feature-point set before surgery, and \alpha_i is the ith weighting factor of the linear combination.

The optimal weighting factors \alpha^* are those that minimize the sum of squared errors between P and P_C:

\alpha^* = \arg\min_{\alpha} E(\alpha)    (8)

The error function is given as

E(\alpha) = \sum_{j=1}^{n} \left( p_{C,j} - \sum_{i=1}^{m} \alpha_i p_{i,j} \right)^2    (9)

where n is the number of feature points in each set, m is the number of exemplar data in the database, p_{C,j} is the jth feature point calculated from the subject's orthogonal images, and p_{i,j} is the jth feature point of the ith patient's data before surgery.
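The minimization of Equations (8) and (9) under the \sum \alpha_i = 1 constraint of Equation (7) is a constrained least-squares problem. The paper does not state its solver, so the following Lagrange-multiplier (KKT) formulation is only one plausible sketch:

```python
import numpy as np

def fit_weights(PC, patients_before):
    """Least-squares weights alpha for Equations (7)-(9).

    PC: subject's feature points (n x 3); patients_before: list of m arrays
    of the same shape. Minimizes E(alpha) of Equation (9) subject to
    sum(alpha) = 1 via a KKT system (an assumed solver choice).
    """
    m = len(patients_before)
    X = np.stack([p.ravel() for p in patients_before])   # m x 3n design matrix
    y = PC.ravel()
    G = X @ X.T
    # KKT system for: min ||X^T a - y||^2  s.t.  1^T a = 1
    K = np.block([[2.0 * G, np.ones((m, 1))],
                  [np.ones((1, m)), np.zeros((1, 1))]])
    rhs = np.concatenate([2.0 * X @ y, [1.0]])
    sol = np.linalg.solve(K, rhs)
    return sol[:m]                                       # drop the multiplier
```

As a sanity check, a subject identical to one exemplar should receive weight 1 on that exemplar and 0 elsewhere.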

Example-Based Simulation

In this section, we describe how we predict the virtually simulated data using the linear combination model. We denote the ith patient's feature-point set before surgery as P_i and that after surgery as P'_i. Now, let the subject's combination model's feature-point sets be expressed as follows:

P = \{p_j, \; j = 1, \ldots, n\}, \qquad P' = \{p'_j, \; j = 1, \ldots, n\}    (10)

where n is the number of feature points in each set, P is the combination model's feature-point set before surgery, and P' is the combination model's feature-point set after surgery.

Once we find \alpha_i^*, we apply them to the displacement vectors between P_i and P'_i. The weighted sum of the relative displacements gives the virtually simulated positions of the subject's feature points. For each feature point p_j (j = 1, \ldots, n),

p'_j = p_j + \sum_{i=1}^{m} \alpha_i^* (p'_{i,j} - p_{i,j})    (11)

where p_{i,j} is the jth feature point of the ith patient's data before surgery, p'_{i,j} is the jth feature point of the ith patient's


Figure 6. Subject’s reconstructed model before surgery (left)and its virtually simulated model after surgery (right). Both

are displayed in wireframe mode and texture mode.

data after surgery, p_j is the jth feature point of the combination model before surgery, and p'_j is the jth feature point of the combination model after surgery.

Equation (11) implies that each patient's relative displacement during surgery influences the simulation process according to her weighting factor. In other words, a patient whose shape is similar to the subject's has a higher weighting factor, and her change during the surgical procedure is more influential than the others'. In our algorithm, a feature point's relative displacement directly captures the change during surgery. Finally, after we move each feature point to p'_j, we apply RBF interpolation to the mesh vertices with regard to the new feature-point positions P'. Figure 6 illustrates an example simulation result.

Detailed Morphing by User Input

In case the user wants to modify the results, our software provides detailed morphing tools. Figure 7 shows an example.

• Local smoothing

Figure 7. Before and after (a) local smoothing and (b) local extrusion/intrusion.

To use the local smoothing tool, the region of interest in the mesh is first selected by mouse operation. Ohtake et al.'s28 smoothing algorithm is applied in this tool; it provides a fairing effect to the selected region.

• Local extrusion/intrusion
The local extrusion/intrusion tool is used when the user wants to raise or depress the selected region. It is performed through an interpolation function, such as a linear, squared, or square-root function, with the user's input parameters and displacement amount. The selected region's mesh vertices are moved according to the given interpolation function.

• Morphing using feature points
If required, this tool morphs the mesh by modifying each feature point's position. When the user selects a feature point and moves it to a new position, the mesh is morphed according to the new feature-point set. Here too, the mesh data are obtained through RBF interpolation.

Results

Figure 8 shows the results of our image-based modeling compared with the front-view photographs. As shown in this figure, both the 3D geometry and the texture information of each subject are effectively reconstructed


Figure 8. Examples of eight female subjects. The first row shows the frontal photographs of the actual subjects; the second row, snapshots of the reconstructed 3D data; and the third row, the virtually simulated data after surgery.

for visualization purposes. We analyzed the results by comparing them with the subjects' actual 3D scan data. Table 2 lists these results. RapidForm29 was used to analyze the distance errors. The errors are rather large because of the limitations of the image-based approach. Considering only the orthogonal views of the body makes it difficult to reconstruct the area between the breasts, since the images do not provide any depth information for this particular area. This part of the body can only be properly reconstructed if the template is very similar to the subject's data. Moreover, the different postures of the subjects increased the errors. However, our results are sufficiently realistic to meet our original goal, the virtual simulation of breast plastic surgery.

We evaluated the linear combination model by comparing it with the actual model. A patient's data before surgery were reconstructed using our image-based modeling, and the result was compared with that obtained from the linear combination model. Data for the linear combination model were obtained from the exemplar database without considering the patient's

Figure 9. Comparison of the reconstructed model with the linear combination model: (a) distance error, (b) reconstructed model, and (c) linear combination model.


No.    Min     Max    Mean    SD
 1    −14.9   12.9   −0.10   5.77
 2     −8.95  13.1    0.96   4.87
 3    −16.3   12.2    0.02   4.75
 4    −21.7   18.1    0.42   7.22
 5    −14.1   13.8    0.28   6.40
 6    −15.5   19.5    0.09   4.84
 7    −18.3   19.7    0.67   5.91
 8    −13.1   19.8    0.10   5.60

Table 2. Distance errors between the scanned data and the reconstructed data of the subjects (unit: mm; SD: standard deviation).

data. A model taken from the exemplar data and the corresponding model obtained from the linear combination should be identical, because the subject's simulated P′i value is predicted directly from the combination model P′. As shown in Figure 9, the two models are very similar.

We also evaluated the simulation results by cross-validation using all the examples in the database: each example was excluded in turn, and the simulation algorithm was tested on it. The results are shown in Figure 10. The results can be improved as more exemplar cases become available.
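The leave-one-out procedure above can be sketched as follows; `simulate` and `error` are hypothetical stand-ins for the paper's example-based prediction and distance-error measurement:

```python
def leave_one_out(examples, simulate, error):
    """For each exemplar pair (before, after): exclude it from the
    database, predict `after` from `before` using the remaining pairs,
    and measure the prediction error against the true `after`."""
    errors = []
    for i, (before, after) in enumerate(examples):
        database = examples[:i] + examples[i + 1:]  # exclude example i
        predicted = simulate(before, database)
        errors.append(error(predicted, after))
    return errors
```

Because the held-out patient never contributes to her own prediction, the per-example errors estimate how the simulator would perform on a new subject.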

Discussion

In this paper, we propose an image-based female torso modeling method and an example-based simulation method. First, template models were prepared and the feature points were defined. We implemented several geometric deformation algorithms and a texture mapping method. Our image-based 3D modeling method enables users to obtain a subject's 3D model without employing expensive hardware such as 3D scanners. Then, the subject's appearance after breast surgery is simulated through an example-based algorithm. It is based on a database of pair sets of feature points calculated from photographs obtained before and after surgery. We have built this database from the photographs of actual patients with the help of surgeons. By using the relationship between the feature points in the photographs before and after surgery, our algorithm predicts the surgery results. We designed a linear combination model to reasonably apply the relationship derived from the database. If needed, a user can interactively refine the result by using the morphing tools. It takes about 1–2 minutes to obtain the simulated result after taking the photographs of the subject.

Our software enables surgeons to counsel patients with the simulated results before cosmetic breast surgery. As preoperative simulation software, it has many benefits: minimizing patient–doctor misunderstandings, finding the optimal way of operation, relieving the patient's fear about the operation, selecting the optimal figure, and so on.

With regard to future work, an automatic way of positioning the feature points needs to be developed; currently, only the feature points related to the profiles of the body are automatically detected. Further, it is necessary to enrich our database of procedure cases. We need to gather more exemplar data according to implant volumes as well as implant types, since breast implants differ in shape, profile, and size. If we had a sufficient amount of exemplar data in every category of implants, we would be able to simulate with regard to the type, profile, and size of the implant. Thus far, this has not been possible due to the small number of surgical cases in the database. As more surgeons use our software for counseling, we can accumulate more exemplar data, making our software more robust.

Figure 10. Cross-validation of the simulation algorithm. Each example was excluded and the simulation was then tested. Fifteen out of the total 30 patients' data are shown here. First row: reconstructed model from the before-surgery photographs; second row: virtually simulated model; third row: reconstructed model from the after-surgery photographs.


ACKNOWLEDGEMENTS

We thank Dr DongRak Lee and Dr SungJun Back for their operation data and advice. This work is financially supported by the Ministry of Education and Human Resources Development (MOE), the Ministry of Commerce, Industry and Energy (MOCIE), and the Ministry of Labor (MOLAB) through the fostering project of the Lab of Excellency. This research was also supported in part by SNU-IAMD.

References

1. American Society of Plastic Surgeons. 2007 Report of the 2006 Statistics: National Clearinghouse of Plastic Surgery Statistics. http://www.plasticsurgery.org/media/statistics

2. Befaft. Kaeria Sarl. http://www.befaft.com/free-virtual-plastic-surgery-morphing

3. Dr. PSs. BITcomputer. http://www.bit.co.kr/english/product/414.htm

4. Mirror Digital Imaging. The Plastic Surgery Group. http://www.theplasticsurgerygroup.net/surgery/digital-imaging.cfm

5. Seo H, Cordier F, Hong K. A breast modeler based on analysis of breast scans. Computer Animation and Virtual Worlds 2007; 18(2): 141–151.

6. Balaniuk R, Costa I, Melo J. Cosmetic breast surgery simulation. In VIII Symposium on Virtual Reality, 2006; 387–396.

7. INAModel. http://www.lookingyourbest.com/inamodel

8. Allen B, Curless B, Popovic Z. The space of human body shapes: reconstruction and parameterization from range scans. In ACM SIGGRAPH 2003 Papers, 2003; 587–594.

9. Thalmann NM, Seo H, Cordier F. Automatic modeling of virtual humans and body clothing. Journal of Computer Science and Technology 2004; 19(5): 575–584.

10. Seo H, Thalmann NM. An example-based approach to human body manipulation. Graphical Models 2004; 66(1): 1–23.

11. Anguelov D, Srinivasan P, Koller D, Thrun S, Rodgers J, Davis J. SCAPE: shape completion and animation of people. ACM Transactions on Graphics 2005; 24(3): 408–416.

12. Sloan PJ, Rose CF, Cohen MF. Shape by example. In Proceedings of the 2001 Symposium on Interactive 3D Graphics (I3D '01), 2001; 135–143.

13. Lee W, Gu J, Thalmann NM. Generating animatable 3D virtual humans from photographs. Computer Graphics Forum 2000; 19(3): 1–10.

14. Park I, Zhang H, Vezhnevets V. Image-based 3D face modeling system. EURASIP Journal on Applied Signal Processing 2005; 13: 2072–2090.

15. Ansari AN, Abdel-Mottaleb M. 3-D face modeling using two views and a generic face model with application to 3-D face recognition. In Proceedings of the IEEE Conference on Advanced Video and Signal Based Surveillance, IEEE Computer Society, 2003.

16. Zhang M, Ma L, Zeng X, Wang Y. Image-based 3D face modeling. In Proceedings of the International Conference on CGIV'04, 2004; 165–168.

17. Seo H, Yeo Y, Wohn K. 3D body reconstruction from photos based on range scan. In Edutainment 2006; 849–860.

18. Blanz V, Vetter T. A morphable model for the synthesis of 3D faces. In Proceedings of SIGGRAPH '99, ACM Press/Addison-Wesley Publishing Co., 1999; 187–194.

19. Hwang B, Lee S. Reconstruction of partially damaged face images based on a morphable face model. IEEE Transactions on Pattern Analysis and Machine Intelligence 2003; 25(3): 365–372.

20. Hwang B, Lee S, Blanz V, Vetter T. Face reconstruction from a small number of feature points. In International Conference on Pattern Recognition (ICPR'00), Vol. 2, 2000; 838–841.

21. Maya. Alias. http://www.autodesk.com

22. Lee H, Hong K, Kim E. Measurement protocol of women's nude breasts using a 3D scanning technique. Applied Ergonomics 2004; 35(4): 353–359.

23. Gower JC, Dijksterhuis GB. Procrustes Problems. Oxford University Press: Oxford, UK, 2004.

24. Carr JC, Fright WR, Beatson RK. Surface interpolation with radial basis functions for medical imaging. IEEE Transactions on Medical Imaging 1997; 16(1): 96–107.

25. Noh J, Fidaleo D, Neumann U. Animated deformations with radial basis functions. In ACM Virtual Reality and Software Technology (VRST), 2000.

26. Carr JC, Beatson RK, Cherrie JB, et al. Reconstruction and representation of 3D objects with radial basis functions. In International Conference on Computer Graphics and Interactive Techniques, 2001; 67–76.

27. Lee H. Development of a low cost foot-scanner for a custom shoe tailoring system. Master's Thesis, Seoul National University, Korea, 2004.

28. Ohtake Y, Belyaev A, Bogaevski I. Mesh regularization and adaptive smoothing. Computer-Aided Design 2001; 33(11): 789–800.

29. Rapidform. INUS Technology. http://www.rapidform.com

Authors’ biographies:

Youngjun Kim is a PhD student in the School of Mechanical and Aerospace Engineering at Seoul National University in Korea. He received his BS (2001) and MS (2003) from Seoul National University. He has also been working at K&I Technology. His research interests include geometric modeling, computer vision, 3D scanning, volume data processing, and medical engineering.


Kunwoo Lee is a professor in the School of Mechanical and Aerospace Engineering at Seoul National University in Korea. He received his BS (1978) from Seoul National University, and his MS (1981) and PhD (1984) in mechanical engineering from MIT. He was previously an assistant professor in the Department of Mechanical & Industrial Engineering, University of Illinois at Urbana-Champaign, a visiting associate professor in the Department of Mechanical Engineering at MIT, and a visiting professor in the Department of Mechanical Engineering at Stanford University. Currently, he is a Co-Editor-in-Chief of the journal Computer-Aided Design and the president of the Advanced Institute of Convergence Technology.

Wontae Kim received his BS (2004) and MS (2006) from the Department of Computer Science at Sogang University in Korea. His research interests include geometric modeling, volume rendering, GPGPU, and real-time rendering. Currently, he is developing applications for volume graphics and 3D scanning systems at K&I Technology.
