WO2006051841A1 - Texture Reproduction System - Google Patents
Texture Reproduction System
- Publication number
- WO2006051841A1 (PCT/JP2005/020588, JP2005020588W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- luminance image
- sample object
- simulated
- simulated object
- sample
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
Definitions
- the present invention relates to a texture reproduction system that reproduces the texture of a sample object on a simulated object having a reflection characteristic different from that of the sample object.
- 3D-CAD systems have been introduced at manufacturing sites, and in combination with the above CG technology it is possible to simulate finished quality on a digital mock-up. This is very important for optimizing the manufacturing process.
- the display medium is a self-luminous object, whereas the natural object is a reflective object, so the appearance of the displayed object differs from that of the real object.
- the display dynamic range is an important item for expressing the texture and the like.
- the current display medium can only secure a dynamic range of about 200:1.
- the reflection characteristic processing and compression methods such as those shown in Patent Documents 1 and 2 must be adopted.
- Patent Document 1: Japanese Patent No. 3254195
- Patent Document 2: Japanese Patent No. 3311841
Disclosure of the Invention
- since the conventional display reproduction technology works only through a display medium, it is difficult to match the texture with the real thing.
- the display medium is a self-luminous object, whereas the natural object is a reflective object, so the displayed appearance differs from that of the real object.
- the dynamic range (contrast) of a display is an important item for expressing textures, but current self-luminous display media can only secure a dynamic range of about 200:1.
- the reflection characteristic processing and compression methods such as those shown in Patent Documents 1 and 2 must be adopted.
- the object of the present invention is to provide a texture reproduction system that, by using projection display means, can realistically reproduce the texture of a sample object on a simulated object having a reflection characteristic different from that of the sample object.
- the present invention is a texture reproduction system for reproducing the texture of a sample object on a simulated object having a reflection characteristic different from that of the sample object, comprising:
- first computing means for computing, based on the reflection characteristic of the sample object, a luminance image of the sample object at a predetermined observation position;
- second computing means for computing a luminance image of the simulated object at the predetermined observation position based on the reflection characteristic of the simulated object;
- third computing means for computing the difference between the luminance image of the sample object and the luminance image of the simulated object, and fourth computing means for computing, based on that difference, a projection luminance image that eliminates it;
- and projection display means for projecting the projection luminance image onto the sample object and/or the simulated object.
- the projection luminance image may be obtained, for example, by reverse ray tracing of the difference.
- the reflection characteristic of the sample object can be expressed as a function of the position on the sample object, the incident angle, the outgoing angle, the incident azimuth angle, the outgoing azimuth angle, and the wavelength of the light.
- the fourth computing means may include means for searching for the maximum luminance value of the projection luminance image, and may correct the projection luminance image based on the projection luminance image, the maximum luminance value, and the limit radiance of the projection display means.
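The third and fourth computing means above can be sketched in code. The following is a minimal illustration, not the patent's implementation: it assumes the two luminance images have already been rendered as arrays by the first and second computing means, and the clipping of negative differences (a projector can only add light) is likewise an assumption made for the sketch.

```python
import numpy as np

def projection_luminance_image(sample_lum, simulated_lum, limit_radiance):
    """Sketch of the third and fourth computing means.

    sample_lum, simulated_lum: luminance images (cd/m^2) of the sample
    and simulated objects at the observation position, as produced by
    the first and second computing means (rendering is omitted here).
    """
    # Third computing means: difference between the two luminance images.
    diff = sample_lum - simulated_lum
    # A projector can only add light, so negative differences are clipped.
    projection = np.clip(diff, 0.0, None)
    # Fourth computing means: search the maximum luminance value and, if
    # it exceeds the projector's limit radiance, scale the image so the
    # limit radiance becomes the upper-limit luminance.
    peak = projection.max()
    if peak > limit_radiance:
        projection *= limit_radiance / peak
    return projection
```

The uniform scaling at the end preserves luminance ratios, which matters because texture perception depends on relative rather than absolute brightness.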
- by using projection display means that has a reflective display function and a display dynamic range of 1000:1 or more, the texture of the sample object can be realistically reproduced on a simulated object having reflection characteristics different from those of the sample object. Therefore, for example, when a supplier requests a manufacturing factory to manufacture a product, there is the advantage that a sample of the product does not need to be sent to the factory.
- FIG. 1 is a perspective view conceptually showing an embodiment of a texture reproduction system according to the present invention.
- FIG. 2 is a flowchart showing a procedure for reproducing a texture.
- FIG. 3 is a perspective view showing a state in which the textures of a sample object and a simulated object are compared.
- FIG. 4 is a schematic diagram showing a method for obtaining a luminance image at the viewpoint position of a sample object.
- FIG. 5 is a schematic diagram showing a method for obtaining a luminance image at a viewpoint position of a simulated object.
- FIG. 6 is a schematic diagram showing a method for obtaining a projection luminance image in the projection apparatus corresponding to the difference image at the viewpoint position.
- FIG. 7 is a reflection characteristic diagram of an object.
- FIG. 8 is a cross section of the reflection characteristic diagram.
- FIG. 9 is a reflection characteristic diagram at one point of the sample object.
- FIG. 10 is a reflection characteristic diagram at one point of the simulated object.
- FIG. 11 is a reflection characteristic diagram in which the reflection characteristics shown in FIG. 9 are indicated by dotted lines and the reflection characteristics shown in FIG. 10 are indicated by solid lines.
- FIG. 12 is a reflection characteristic diagram in which the luminance image of the sample object 7 at the viewpoint position is indicated by a dotted line and the final simulated object luminance image is indicated by a solid line.
- FIG. 13 is a reflection characteristic diagram in which the luminance image of the sample object at the viewpoint position before the change is indicated by a thin solid line, and the luminance image of the sample object at the viewpoint position when the final projection luminance image is projected from the projection device is indicated by a thick solid line.
- FIG. 14 is a flowchart showing a technique for improving the appearance of a simulated object by controlling the projection brightness of the sample object 7.
- FIG. 1 is a schematic diagram showing an embodiment of a texture reproduction system according to the present invention.
- the database 5 of each processing computer 3 connected via the network 1 stores the reflection characteristics of the sample object and the simulated object.
- the sample object is a finished product that has been subjected to a surface treatment such as chrome plating, and the simulated object is an unfinished product that has not been subjected to this surface treatment.
- a semi-cylindrical sample object 7 and a simulated object 9 having the same shape as the sample object 7 are shown.
- the reflection characteristic of the sample object 7 is measured in advance using the position of a light source (not shown) having a predetermined radiance and the position of the viewpoint of an observer observing the reflected light as parameters.
- the reflection characteristics of the simulated object 9 are measured in the same manner. Since this reflection-characteristic measuring method is well known, its description is omitted here.
- based on the database 5, the processing computer 3 retrieves the reflection characteristics of the simulated object 9 corresponding to the specified light-source position, observer viewpoint, and object shape (step 107). Based on these reflection characteristics, the luminance image of the simulated object 9 at the observer's viewpoint is computed (step 109), and the difference between the luminance image of the sample object 7 and the luminance image of the simulated object 9 is computed (step 111).
- the processing computer 3 performs reverse ray tracing of the difference luminance image obtained by the difference calculation (step 113).
- FIGS. 4 to 6 are schematic diagrams for explaining the reverse ray tracing.
- the sample object 7 and the simulated object 9 are shown as spheres.
- as shown in FIG. 4, when light of a predetermined radiance generated by a modulator (such as a liquid crystal panel or a mirror device) in the projection apparatus 11 illuminates the sample object 7 through the lens 11a, the reflected light forms an image on the retina of an observer at a predetermined viewpoint position via the crystalline lens of the eye.
- the luminance image recognized by the observer at this time is also shown. This luminance image corresponds to the luminance image of the sample object calculated in step 105.
- similarly, the luminance image of the simulated object 9 at the viewpoint position is shown in the right part of FIG. 5. This luminance image corresponds to the luminance image of the simulated object calculated in step 109.
- the luminance image shown on the right side of FIG. 6 is the difference luminance image obtained by taking the difference between the luminance image of the sample object 7 at the viewpoint position shown in FIG. 4 and the luminance image of the simulated object 9 at the viewpoint position shown in FIG. 5. This difference luminance image corresponds to the difference luminance image calculated in step 111.
- the difference luminance image represents the luminance deficit of the simulated object 9 relative to the luminance of the sample object 7. Therefore, if a luminance image that compensates for this deficit is projected from the projection device 11 onto the simulated object 9 based on the difference luminance image, the appearance of the simulated object 9 will match that of the sample object 7.
- the projection luminance image at the projection device 11 can be obtained by tracing the difference luminance image back from the viewpoint position to the projection device 11, based on the relationship between the viewpoint position, the position of the object, and the position of the projection device 11. Since the reverse ray tracing method is well known, its specific description is omitted here.
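As a sketch under stated assumptions (not the patent's implementation): once reverse ray tracing has established which viewpoint pixel corresponds to each projector pixel, applying that correspondence is a simple lookup. The `pixel_map` array below is an assumed precomputed result of the tracing; the tracing itself is omitted.

```python
import numpy as np

def to_projector_space(diff_view, pixel_map):
    """Map a difference luminance image from viewpoint space into
    projector space using a precomputed correspondence.

    pixel_map[i, j] holds the (row, col) index of the viewpoint pixel
    whose ray, traced back through the scene geometry, lands on
    projector pixel (i, j).
    """
    rows = pixel_map[..., 0]
    cols = pixel_map[..., 1]
    # Gather each projector pixel's value from its source viewpoint pixel.
    return diff_view[rows, cols]
```

In practice the map would also account for the projector's current radiance scale, as step 113 converts the viewpoint-space difference image into one expressed relative to that radiance.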
- in step 113, the difference luminance image at the viewpoint position obtained in step 111 is converted by reverse ray tracing into a difference luminance image for the predetermined radiance currently projected by the projection device 11.
- in step 115, a projection luminance image is generated that compensates for the difference luminance image at the viewpoint position, that is, the insufficient luminance of the simulated object 9 viewed from the viewpoint position, based on the difference luminance image at the projection device 11. In the next step 117, the projection luminance image is output from the projection device 11.
- as a result, the insufficient luminance of the simulated object 9 viewed from the viewpoint position is compensated, and the appearance of the simulated object 9 (particularly its gloss, in this example) matches that of the sample object 7.
- the appearance of the simulated object 9 can be evaluated by arranging the sample object 7 and the simulated object 9 adjacent to each other, as shown in FIG. 3. In this case, illumination light having a predetermined luminance is projected from the projection device 11 onto the sample object 7, and the projection luminance image obtained in step 115 is output from the projection device 11 onto the simulated object 9. As a result of such an evaluation, it was confirmed that the appearance of the simulated object 9 was very close to that of the sample object 7.
- any of the processing computers 3 connected via the network 1 can execute the above procedure. Therefore, with this texture reproduction system, the simulated object 9 can be viewed as an object having the texture of the sample object 7 even at a remote site. As a result, for example, when a supplier requests a manufacturing plant to manufacture a product, it is not necessary to send a product sample to the plant.
- furthermore, by arranging the manufactured product in the dotted frame shown in FIG. 1 and comparing it with the simulated object 9 on which the texture of the sample object 7 is reproduced, the finish quality of the manufactured product can be evaluated.
- since the projection device 11 has a limit radiance, when the maximum brightness of the projection luminance image generated in step 115 exceeds the limit radiance, the lack of brightness of the simulated object 9 viewed from the viewpoint position cannot be completely compensated.
- in that case, the following countermeasure, which controls the projection brightness of the sample object 7 to improve the appearance of the simulated object 9, can be applied.
- FIG. 7 illustrates the reflection characteristics at the viewpoint position of the sample object 7 shown in FIG. 4 and the simulated object 9 shown in FIG.
- the reflected luminance Lout is expressed in terms of the BRDF as follows.
- BRDF is the bidirectional reflectance distribution function: a function of the positions x and y on the sample object 7 and the simulated object 9, the light incident angle θin, the light outgoing angle θout, the azimuth angle φin of the incident light, the azimuth angle φout of the outgoing light, and the light wavelength λ.
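In standard radiometry this relation reads Lout = BRDF(x, y, θin, θout, φin, φout, λ) × Ein, with Ein the incident irradiance. A minimal sketch follows; the seven-argument BRDF signature matches the text, but the diffuse-plus-specular toy lobe is purely an assumption for illustration, not the measured reflection characteristics of objects 7 or 9.

```python
import math

def reflected_luminance(brdf, x, y, theta_in, theta_out,
                        phi_in, phi_out, wavelength, irradiance):
    """L_out = BRDF(x, y, th_in, th_out, ph_in, ph_out, lambda) * E_in."""
    return brdf(x, y, theta_in, theta_out, phi_in, phi_out, wavelength) * irradiance

def toy_brdf(x, y, theta_in, theta_out, phi_in, phi_out, wavelength):
    """Stand-in BRDF: a Lambertian term plus a narrow specular lobe
    that peaks when the outgoing angle equals the incident angle."""
    diffuse = 0.2 / math.pi
    specular = 0.5 * max(0.0, math.cos(theta_out - theta_in)) ** 20
    return diffuse + specular
```

A glossy sample object and a matte simulated object would differ mainly in this specular term, which is exactly what the difference luminance image of FIG. 11 captures.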
- the AA′ cross section of the reflection characteristic of the object 7 or 9 is expressed as shown in FIG. 8.
- the following explanation uses this AA′ cross section for ease of understanding.
- FIG. 9 and FIG. 10 illustrate the reflection characteristics (reflection characteristics viewed from the viewpoint position) at one point of the sample object 7 shown in FIG. 4 and the simulated object 9 shown in FIG. 5, respectively.
- FIG. 11 shows the reflection characteristic shown in FIG. 9 by a dotted line and the reflection characteristic shown in FIG. 10 by a solid line.
- the difference between the reflection characteristics shown in FIG. 11 is the difference between the luminance images of the sample object 7 and the simulated object 9 at that point, as seen from the viewpoint position.
- this difference luminance image is calculated for every point (each defined about the object surface normal vector), so the projection luminance image generated in step 115 of FIG. 2 is a projection luminance image covering all such points.
- in step 201 shown in FIG. 14, the maximum luminance value (cd/m²) in the projection luminance image generated in step 115 of FIG. 2 is searched for. Then, in the next step 203, the following calculation is performed based on the projection luminance image, the limit radiance of the projection device 11, and the searched maximum luminance value, with the limit radiance used as the upper-limit luminance.
- the corrected projection luminance image for the simulated object 9 is then determined as:
- corrected projection luminance image = projection luminance image × (limit radiance / maximum luminance value)
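The step 203 correction is a uniform scaling, so it caps the brightest pixel at the projector's limit radiance while preserving all luminance ratios. A minimal sketch (function and parameter names are assumptions):

```python
def corrected_projection_image(projection_lum, limit_radiance, max_lum):
    """Step 203: corrected image = projection image
    x (limit radiance / maximum luminance value).

    projection_lum: flat list of luminance values (cd/m^2);
    max_lum: the maximum value found in step 201.
    The scaling preserves relative brightness, on which texture
    perception depends, even though absolute brightness drops.
    """
    scale = limit_radiance / max_lum
    return [p * scale for p in projection_lum]
```

For example, with a limit radiance of 1000 cd/m² and a maximum of 1500 cd/m², every pixel is multiplied by 2/3.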
- in step 205, the luminance image at the viewpoint position of the simulated object 9 when the corrected projection luminance image is projected onto the simulated object 9 is obtained as the final simulated-object luminance image.
- the final simulated object luminance image is obtained based on the following calculation.
- then, a final projection luminance image for the sample object 7 that matches the final simulated-object luminance image is determined based on the following calculation.
- in FIG. 13, the thin solid line shows the luminance image of the sample object at the viewpoint position before the change, and the thick solid line shows the luminance image of the sample object at the viewpoint position (the changed luminance image) when the final projection luminance image is projected from the projection device.
- the thick-line luminance image matches the final simulated-object luminance image shown by the solid line in FIG. 12.
- steps 201 to 207 can be inserted between steps 115 and 117 of FIG. 2 as required and executed by the processing computer 3. If the procedure of steps 201 to 207 is executed, a simulated-object luminance image that approximates the luminance image of the sample object can be reproduced while making the best use of the capability of the projection device 11.
- because the method in steps 201 to 207 described above adjusts the projection luminance images for the simulated object 9 and the sample object 7 in consideration of the limit radiance of the projection device 11, the projected brightness on the simulated object 9 and/or the sample object 7 may be reduced as a result.
- even so, the texture and contrast of both objects 7 and 9 are preserved, and since humans have the visual characteristic of perceiving texture through relative rather than absolute brightness, good reproducibility is obtained.
- when conditions such as the light-source position or the viewpoint change, the database 5 can provide reflection characteristics corresponding to those changes.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Generation (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-331171 | 2004-11-15 | ||
JP2004331171A JP2006139726A (ja) | 2004-11-15 | 2004-11-15 | 質感再現システム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006051841A1 true WO2006051841A1 (ja) | 2006-05-18 |
Family
ID=36336518
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/020588 WO2006051841A1 (ja) | 2004-11-15 | 2005-11-10 | 質感再現システム |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2006139726A (ja) |
WO (1) | WO2006051841A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009003166A (ja) * | 2007-06-21 | 2009-01-08 | Hitachi Ltd | 質感映像表示装置 |
WO2019130726A1 (ja) * | 2017-12-27 | 2019-07-04 | 富士フイルム株式会社 | 質感再現装置、質感再現方法、プログラムおよび記録媒体 |
WO2019130716A1 (ja) * | 2017-12-27 | 2019-07-04 | 富士フイルム株式会社 | 質感再現装置、質感再現方法、プログラムおよび記録媒体 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100883311B1 (ko) * | 2007-05-25 | 2009-02-11 | 한국과학기술연구원 | 입력 장치의 개발을 위한 시스템 |
US8982125B1 (en) * | 2014-05-15 | 2015-03-17 | Chaos Software Ltd. | Shading CG representations of materials |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003216970A (ja) * | 2002-01-23 | 2003-07-31 | Canon Inc | 三次元画像処理装置、三次元画像処理システム、三次元画像処理方法および三次元画像処理プログラム |
JP2003281565A (ja) * | 2002-03-20 | 2003-10-03 | Japan Science & Technology Corp | 表示デバイスの特性に依存しない光沢感再現方法 |
JP2003331318A (ja) * | 2002-05-14 | 2003-11-21 | Fujitsu Ltd | 物体データ生成装置 |
-
2004
- 2004-11-15 JP JP2004331171A patent/JP2006139726A/ja not_active Withdrawn
-
2005
- 2005-11-10 WO PCT/JP2005/020588 patent/WO2006051841A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003216970A (ja) * | 2002-01-23 | 2003-07-31 | Canon Inc | 三次元画像処理装置、三次元画像処理システム、三次元画像処理方法および三次元画像処理プログラム |
JP2003281565A (ja) * | 2002-03-20 | 2003-10-03 | Japan Science & Technology Corp | 表示デバイスの特性に依存しない光沢感再現方法 |
JP2003331318A (ja) * | 2002-05-14 | 2003-11-21 | Fujitsu Ltd | 物体データ生成装置 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009003166A (ja) * | 2007-06-21 | 2009-01-08 | Hitachi Ltd | 質感映像表示装置 |
WO2019130726A1 (ja) * | 2017-12-27 | 2019-07-04 | 富士フイルム株式会社 | 質感再現装置、質感再現方法、プログラムおよび記録媒体 |
WO2019130716A1 (ja) * | 2017-12-27 | 2019-07-04 | 富士フイルム株式会社 | 質感再現装置、質感再現方法、プログラムおよび記録媒体 |
JPWO2019130726A1 (ja) * | 2017-12-27 | 2020-10-22 | 富士フイルム株式会社 | 質感再現装置、質感再現方法、プログラムおよび記録媒体 |
JPWO2019130716A1 (ja) * | 2017-12-27 | 2020-10-22 | 富士フイルム株式会社 | 質感再現装置、質感再現方法、プログラムおよび記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
JP2006139726A (ja) | 2006-06-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11470303B1 (en) | Two dimensional to three dimensional moving image converter | |
Kim et al. | Immersive spatial audio reproduction for vr/ar using room acoustic modelling from 360 images | |
Aldrian et al. | Inverse rendering of faces with a 3D morphable model | |
US12014467B2 (en) | Generating augmented reality prerenderings using template images | |
Alexiou et al. | On the performance of metrics to predict quality in point cloud representations | |
WO2015166684A1 (ja) | 画像処理装置と画像処理方法 | |
Meilland et al. | 3d high dynamic range dense visual slam and its application to real-time object re-lighting | |
WO2006051841A1 (ja) | 質感再現システム | |
CN114925748B (zh) | 模型训练及模态信息的预测方法、相关装置、设备、介质 | |
Chalmers et al. | Reconstructing reflection maps using a stacked-CNN for mixed reality rendering | |
Kneiphof et al. | Real‐time Image‐based Lighting of Microfacet BRDFs with Varying Iridescence | |
US11734888B2 (en) | Real-time 3D facial animation from binocular video | |
Zhi et al. | Towards fast and convenient end-to-end HRTF personalization | |
Kim et al. | Generating 3D texture models of vessel pipes using 2D texture transferred by object recognition | |
Haleem et al. | Holography and its applications for industry 4.0: An overview | |
Marques et al. | Spatially and color consistent environment lighting estimation using deep neural networks for mixed reality | |
CN102301399A (zh) | 图像生成系统、图像生成方法、计算机程序及其记录介质 | |
McGuigan et al. | Automating RTI: Automatic light direction detection and correcting non-uniform lighting for more accurate surface normals | |
JP7279352B2 (ja) | 質感調整支援システム、及び質感調整支援方法 | |
Nader et al. | Visual contrast sensitivity and discrimination for 3D meshes and their applications | |
Rohe | An Optical Test Simulator Based on the Open-Source Blender Software. | |
Xu et al. | Object-based illumination transferring and rendering for applications of mixed reality | |
Tsuchida et al. | Development of BRDF and BTF measurement and computer-aided design systems based on multispectral imaging | |
Gigilashvili et al. | Appearance manipulation in spatial augmented reality using image differences | |
Huraibat et al. | Accurate physics-based digital reproduction of effect coatings |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 05806002 Country of ref document: EP Kind code of ref document: A1 |