


The 19th Korea-Japan Joint Workshop on Frontiers of Computer Vision

Makeup Color Reproduction Based on Spectrum Data

In-Su Jang, Jae Woo Kim, and Jin-Seo Kim
Electronics and Telecommunications Research Institute (ETRI),

Daejeon, Korea
[email protected], jae [email protected] and [email protected]

Abstract- A variety of internet services closely related with our daily lives are currently being provided due to advances in information technology and the computer industry. Among those services, virtual simulation applications enable customers to experience various products before they purchase them. An example of such services is a virtual makeup simulation that allows customers to try cosmetics, hair styles, and accessories virtually on their facial images. These services help people make a decision when they purchase products at stores. However, most current virtual makeup solutions and products are not realistic enough to precisely simulate makeup procedures and to predict the appearance of users after the makeup procedure is complete. One of the most significant problems is that simulation results are represented with simple RGB color channels without considering the characteristics of color equipment such as cameras and monitors. A solution to those problems is color reproduction technology, which determines the color values displayed on monitors by developing color reproduction models based on measurements from special-purpose measurement devices. In this paper we propose an approach to simulate makeup procedures and estimate skin colors after specific cosmetics are applied, based on spectrum data of skins and cosmetics. Our approach produced convincing simulation results with true color representation by using color equipment characterization and true color reproduction technology. Our approach based on spectral data makes it possible to represent the true color of a user's skin on which various cosmetics have been applied.

Keywords- makeup; color; spectrum; characterization; 3D simulation

I. INTRODUCTION

Rapid advances in computer graphics, virtual reality, and visualization technologies have produced a variety of virtual environment (VE) based simulation systems that make it possible to conduct tasks which are difficult or impossible to conduct in the real world. One of the most important performance criteria for such systems is that they must provide a realistic environment so that users can be engaged in the tasks they conduct in the virtual environment as they would be in real-world situations. Most VE-based simulations, however, cannot provide satisfactory environments due to a lack of realism in representing digital content on computer systems. Color representation is especially important and has been a crucial issue in virtual makeup simulation systems because humans are very sensitive to skin colors and very good at spotting artifacts in human skin.


Digital color devices display different colors for the same RGB signals depending on their hardware characteristics. Various standards have therefore been established for realistic color reproduction on color devices such as projectors, printers, and monitors. Digital color devices handle color information using hardware-dependent ICC profiles to represent true color by minimizing differences in hardware characteristics [1]. However, hardware manufacturers do not usually provide their products' hardware characteristics and precise ICC profile data, focusing instead on improving the image and color quality of their products. The best approach to representing precise true color is to reproduce colors using a color device characterization scheme.

Digital color device characterization is conducted by modeling the color transformation between two or more color devices. Color data in captured images are transformed into color data for the display device based on the transformation model to reproduce the true color of the original image. An additional process is necessary for makeup color simulation, which determines color data considering the interactions between human skin and the different cosmetics that have been applied on the skin [2]. It therefore additionally requires color information that is unique to the skin and cosmetic materials to accurately simulate makeup procedures.
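As one illustration of such a characterization scheme, the sketch below fits a simple least-squares 3x3 matrix mapping camera RGB to CIEXYZ from measured patch pairs. The patch values, variable names, and the choice of a linear model are assumptions for illustration only, not the characterization model used in this work.

```python
import numpy as np

# Hypothetical training data: linearized camera RGB values and CIEXYZ values
# measured from the same color patches (e.g. a color chart). Shapes: (N, 3).
rgb_patches = np.array([[0.20, 0.12, 0.08],
                        [0.45, 0.40, 0.35],
                        [0.10, 0.30, 0.55],
                        [0.70, 0.65, 0.60]])
xyz_patches = np.array([[0.18, 0.15, 0.09],
                        [0.42, 0.43, 0.38],
                        [0.20, 0.28, 0.58],
                        [0.66, 0.68, 0.64]])

# Fit a 3x3 transformation M so that xyz ~= rgb @ M in the least-squares sense.
M, _, _, _ = np.linalg.lstsq(rgb_patches, xyz_patches, rcond=None)

def camera_rgb_to_xyz(rgb):
    """Apply the fitted characterization matrix to camera RGB values."""
    return np.asarray(rgb) @ M

print(camera_rgb_to_xyz([0.30, 0.25, 0.20]))
```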

CIELAB and CIEXYZ standard color spaces are most widely employed for color image processing and true color reproduction. They have limitations, however, in representing true color information for mixed materials because they were established by quantifying colors perceived by human observers under standard illumination conditions. Spectral reflectance can be used to overcome these limitations and represent makeup colors [3], [4]. Spectral reflectance describes a material's color characteristics and determines the spectral distribution of the reflected light that represents the color of the material. In this paper we propose a novel approach to reproduce makeup colors by using the spectral characteristics of human skin and cosmetics.

II. SPECTRUM-BASED COLOR REPRODUCTION

Humans perceive colors through three different types of photoreceptors, the L, M, and S cones, distributed on the retina of the human eye, and recognize color information by analyzing those tri-stimulus data. We can quantify recognized colors by modeling this process as shown in equation (1). In equation (1), color data are transformed into CIEXYZ space by multiplying the intensity of the light source by the spectral reflectance of the material and the color matching function [5], [6].

$$O_k = \int_{400}^{700} I(\lambda)\, R(\lambda)\, C_k(\lambda)\, d\lambda \qquad (1)$$

Here I denotes the intensity of the light source, R denotes the spectral reflectance of the material, and C_k denotes the color matching function. Color data in CIEXYZ space quantifies colors using color matching functions that model human visual perception and can be transformed into various color spaces such as sRGB, Adobe RGB, and CIELAB. We can therefore reproduce the true color after a makeup procedure if we can estimate the spectral reflectance of the skin surface after the makeup is complete.
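The following sketch shows how equation (1) can be evaluated numerically on a sampled wavelength grid. The flat illuminant and the Gaussian stand-ins for the color matching functions are placeholders, not measured data.

```python
import numpy as np

# Wavelength grid over the visible range used in equation (1), in nanometers.
wavelengths = np.arange(400, 701, 10)

# Placeholder spectra (assumptions, not measured data):
illuminant = np.ones_like(wavelengths, dtype=float)                   # I(lambda), flat light source
reflectance = 0.2 + 0.5 / (1 + np.exp(-(wavelengths - 560) / 30.0))   # R(lambda), skin-like curve

# Rough Gaussian stand-ins for the CIE color matching functions C_k(lambda).
def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

cmf = np.stack([gaussian(wavelengths, 600, 40),   # x-bar
                gaussian(wavelengths, 550, 40),   # y-bar
                gaussian(wavelengths, 450, 30)])  # z-bar

# Equation (1): O_k = integral of I(lambda) R(lambda) C_k(lambda) d lambda,
# approximated with the trapezoidal rule.
xyz = np.trapz(illuminant * reflectance * cmf, wavelengths, axis=1)
print(xyz)  # unnormalized X, Y, Z tristimulus values
```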

III. SPECTRUM MEASUREMENT AND ANALYSIS

Spectral color estimation for makeup simulation requires three different categories of data measurements: (1) spectral data of human skin, (2) spectral data of cosmetic samples, and (3) spectral data of human skin after makeup is complete [7]. We therefore conducted those measurements using a spectroradiometer. Fig. 1 shows spectral reflectance data obtained from the skin surfaces of 10 Asian subjects. We can note that the reflectance distributions of the data show similar patterns due to the similarity of the subjects' skin colors. The black line in the figure represents the spectral reflectance of a foundation that will be applied in the experiment. We can see that the reflectance distribution of the foundation does not change much in the 400-500 nm and 600-700 nm ranges, but increases drastically in the 500-600 nm range. This observation shows that the reflectance of the foundation has a high energy distribution in the high wavelength range and a relatively low energy distribution in the low wavelength region.

Fig. 1. Measured spectral reflectances of sample skins and a cosmetic.

Fig. 2. Measured spectral reflectances of sample skins after makeup and a cosmetic.

Figure 2 shows spectral color data of human skin after a specific makeup procedure is complete; the foundation is again represented by the black line. We can observe that the spectral reflectance has decreased in the low wavelength region, increased in the high wavelength region, and been equalized to some degree over the whole wavelength range. Fig. 3 represents the change of color information before and after the makeup procedure in CIELAB color space. The red dotted line here denotes the color information for the cosmetics. We can notice that the change in color information is irregular, so CIELAB space is not appropriate for color estimation in makeup simulation. Spectral data, however, can represent the change in color before and after makeup procedures better than CIELAB data. We can therefore conclude that spectrum data can be used to represent color information for accurate color reproduction in makeup simulation applications.
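For reference, a CIELAB comparison of the kind shown in Fig. 3 can be produced from CIEXYZ data with the standard conversion sketched below; the D65 white point and the example tristimulus values are assumptions for illustration.

```python
import numpy as np

def xyz_to_lab(xyz, white=(95.047, 100.0, 108.883)):
    """Convert CIEXYZ to CIELAB (white point assumed here to be D65)."""
    xyz = np.asarray(xyz, dtype=float) / np.asarray(white)
    delta = 6.0 / 29.0
    f = np.where(xyz > delta ** 3,
                 np.cbrt(xyz),
                 xyz / (3 * delta ** 2) + 4.0 / 29.0)
    L = 116.0 * f[1] - 16.0
    a = 500.0 * (f[0] - f[1])
    b = 200.0 * (f[1] - f[2])
    return np.array([L, a, b])

# Example: compare hypothetical skin colors before and after makeup in CIELAB.
lab_before = xyz_to_lab([38.0, 34.0, 26.0])
lab_after = xyz_to_lab([42.0, 38.0, 31.0])
print(lab_after - lab_before)  # shifts of this kind are what Fig. 3 plots
```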

IV. MAKEUP COLOR ESTIMATION

We need to estimate the spectral reflectance of facial skin on which makeup has been applied to reproduce true color for makeup simulation. Fig. 4 shows the ratio computed by dividing the spectral data after makeup by the spectral data before makeup. When there is no change before and after makeup, this value is 1. A ratio larger than 1 denotes an increase in the spectral data, and a ratio smaller than 1 denotes a decrease in the spectral data between before and after the makeup procedure. We can observe that there is little change around the 620 nm region. The spectral reflectance data for the facial skins used in the experiment show similar patterns with slightly different heights. The difference is due to variation in the brightness of the facial skins. We calculate an average transition rate for each cosmetic applied in the experiment to model the variation among the different spectral reflectances of facial skins. The transition of spectral reflectance is estimated by the following equation:


$$R_o = \max(C,\; R_i \cdot \alpha) \qquad (2)$$


Fig. 4. Color transition rate for each sample skin.

Here, R_i denotes the spectrum of the skin before makeup, R_o denotes the estimated spectrum after makeup, α denotes the average transition rate, and C denotes the spectrum of the cosmetic. The maximum of C and R_i·α is selected in determining R_o because the spectrum value obtained after makeup cannot exceed the spectrum value of the cosmetic.
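A minimal sketch of how the transition rate and equation (2) could be applied per wavelength is given below. The spectra are illustrative values rather than measured data, and a single subject stands in for the averaged training set.

```python
import numpy as np

# Hypothetical spectra on a shared wavelength grid (illustrative values).
skin_before = np.array([0.30, 0.35, 0.42, 0.50, 0.55])   # R_i(lambda), bare skin (training)
skin_after  = np.array([0.34, 0.40, 0.50, 0.56, 0.58])   # measured after makeup (training)
cosmetic    = np.array([0.36, 0.42, 0.55, 0.60, 0.62])   # C(lambda), cosmetic sample

# Average transition rate alpha: ratio of spectra after/before makeup,
# averaged over the training subjects (a single subject here for brevity).
alpha = skin_after / skin_before

# Equation (2), applied element-wise to estimate a new subject's post-makeup spectrum:
# R_o = max(C, R_i * alpha)
new_skin = np.array([0.28, 0.33, 0.40, 0.47, 0.52])
estimated_after = np.maximum(cosmetic, new_skin * alpha)
print(estimated_after)
```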

V. EXPERIMENTS

We validated our approach by comparing the estimated spectral data synthesized by our approach with the spectral data obtained by measuring facial skin after applying makeup. Two foundations, two lipsticks, and two blushers were applied on the subjects' faces in our experiment to develop a spectrum transition model for makeup procedures. We measured spectral data at both the lip and cheek areas using a spectroradiometer to construct a training dataset. We applied makeup such as foundations and lipsticks on the skin and measured spectral data again at the same locations on the faces. Blushers were applied on top of foundations, as they usually are in reality. The spectrum transition rate for each cosmetic product is calculated based on the analysis of the spectral difference before and after the makeup procedures using the constructed dataset. We finally validate the obtained spectrum transition rate using the root mean square error as shown in equation (3).

$$E_{\mathrm{RMS}} = \sqrt{\frac{1}{N}\sum_{\lambda}\bigl(\hat{R}_o(\lambda) - R_m(\lambda)\bigr)^2} \qquad (3)$$
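The validation step of equation (3) amounts to the computation sketched below; the function name and example reflectance values are illustrative.

```python
import numpy as np

def spectral_rmse(estimated, measured):
    """Root mean square error between two spectra sampled on the same wavelength grid."""
    estimated = np.asarray(estimated, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return np.sqrt(np.mean((estimated - measured) ** 2))

# Example with hypothetical reflectance values:
print(spectral_rmse([0.34, 0.41, 0.52, 0.57], [0.35, 0.40, 0.50, 0.56]))
```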

Fig. 3. Color transition before and after makeup in (a) a*-b* space, (b) L*-a* space, and (c) L*-b* space.


Figure 5 shows the RMS errors for the sample skins in the training process. The graph shows the RMS errors for the two foundations, two blushers, and two lipsticks across the ten subjects. As we can see in the figure, we obtained RMS errors of 0.018 and 0.017 for the two foundations, 0.031 and 0.013 for the blushers, and 0.015 and 0.011 for the lipsticks. The total average RMS error is 0.017 for the training data set. We then conducted the experiment on the test data, which consists of nine skin samples and two foundations. Fig. 6 shows the result of the same experiment on the test data set, applying only the two foundations, and we obtained an average RMS error of 0.035. From the experimental results we can conclude that our approach can estimate accurate spectral reflectance for cosmetics applied on human facial skin.

Fig. 5. RMS errors for sample skins in the training process.

Fig. 6. RMS errors for new sample skins.


VI. CONCLUSION

In this paper, we proposed a novel approach to reproduce facial skin colors after makeup using spectral reflectance data. We could estimate the color information of human facial skin after makeup, independent of illumination conditions, by using spectral data of human skins and cosmetics that represent the corresponding materials' unique optical properties. Based on this research we will develop a virtual makeup simulation system that can be used by customers to select the products that best match their skin at cosmetics stores.

ACKNOWLEDGMENT

This work was supported by the Ministry of Culture, Sports and Tourism and KOCCA in the CT R&D Program 2011.



REFERENCES

[1] CIE (Commission Internationale de l'Eclairage), http://www.cie.co.at/.

[2] I. Jang, J. You, and J. Kim, "Makeup Color Representation Based on Spectral Characteristics," AIC (Association Internationale de la Couleur) Interim Meeting, Taipei, pp. 622-625, 2012.

[3] A. Mansouri, T. Sliwa, J. Y. Hardeberg, and Y. Voisin, "Representation and Estimation of Spectral Reflectances Using Projection on PCA and Wavelet Bases," Color Research and Application, vol. 33, no. 6, Dec. 2008.

[4] A. Mansouri, F. Marzani, and P. Gouton, "Neural networks in cascade schemes for spectral reflectance reconstruction," IEEE International Conference on Image Processing (ICIP 2005), Genova, Italy, pp. 718-721, 2005.

[5] S. Tominaga and Y. Moriuchi, "Principal Component Analysis-Based Reflectance Analysis/Synthesis of Cosmetic Foundation," Journal of Imaging Science and Technology, vol. 53, no. 6, pp. 060403-1 to 060403-8, Nov. 2009.

[6] S. Tominaga and Y. Moriuchi, "PCA-based Reflectance Analysis/Synthesis of Cosmetic Foundation," 16th Color Imaging Conference, pp. 195-198, 2008.

[7] N. Tsumura, N. Ojima, K. Sato, M. Shiraishi, H. Shimizu, H. Nabeshima, S. Akazaki, K. Hori, and Y. Miyake, "Image-based skin color and texture analysis/synthesis by extracting hemoglobin and melanin information in the skin," ACM Trans. Graph., vol. 22, pp. 770-779, 2003.