WO2017039026A1 - Method of generating virtual reality on head mounted display and head mounted display performing the same - Google Patents

Method of generating virtual reality on head mounted display and head mounted display performing the same Download PDF

Info

Publication number
WO2017039026A1
WO2017039026A1 (PCT application PCT/KR2015/009162)
Authority
WO
WIPO (PCT)
Prior art keywords
image
sub
ray tracing
eye
reference images
Prior art date
Application number
PCT/KR2015/009162
Other languages
French (fr)
Inventor
Woo Nam Chung
Original Assignee
Siliconarts Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siliconarts Inc. filed Critical Siliconarts Inc.
Priority to PCT/KR2015/009162 priority Critical patent/WO2017039026A1/en
Priority to KR1020187008833A priority patent/KR102101217B1/en
Publication of WO2017039026A1 publication Critical patent/WO2017039026A1/en

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems

Definitions

  • The present invention may be implemented as machine-readable code on a machine-readable medium.
  • The machine-readable medium includes any type of recording device for storing machine-readable data. Examples of the machine-readable recording medium include read-only memory (ROM), random access memory (RAM), compact disc read-only memory (CD-ROM), magnetic tape, floppy disks, and optical data storage.
  • The medium may also be a carrier wave (e.g., Internet transmission).
  • The machine-readable recording medium may be distributed among networked machine systems that store and execute machine-readable code in a decentralized manner.
  • The present invention was supported by the Seoul Industry Promotion Institute in 2013 under the "Technology Development Supporting Project for Patent Technology Commercialization" and is a result of the project "Software Kit Development for Developing Ray Tracing Rendering Contents," carried out from September 1, 2013 to August 31, 2014 by SILICONARTS, INC.
  • FIG. 1 is a conceptual diagram illustrating a head mounted display system according to an example embodiment of the present invention.
  • A head mounted display (HMD) system 10 includes a virtual object 50 and an HMD 100.
  • The virtual object 50 may correspond to a virtual focusing region generated by the HMD 100. That is, the virtual object 50 may be a 3D object that is not real but is dimensionally drawn by the HMD 100. In one embodiment, the virtual object 50 may be generated entirely by the HMD 100. In another embodiment, the virtual object 50 may be generated by combining a real object positioned in front of the HMD 100 with a partially drawn object generated by the HMD 100.
  • the HMD 100 includes a left display unit 110, a right display unit 120, a network interface unit 130 and a control unit 140.
  • The HMD 100 may be worn on the head or as part of a helmet and has a small display optic in front of each eye (a binocular HMD).
  • The left and right display units 110 and 120 may correspond to the display optics in front of the corresponding eyes.
  • The left display unit 110 may generate a left-side image for the left eye and the right display unit 120 may generate a right-side image for the right eye, so that the left and right display units 110 and 120 together virtually generate a dimensional image.
  • The network interface unit 130 may transmit the left and right images to an external device. Also, the network interface unit 130 may receive a dimensional image or left and right images from an external device. In one embodiment, the network interface unit 130 may be implemented with Bluetooth, WiFi, and so on.
  • The control unit 140 may generate one side image for one eye as a reference image associated with a first eye through a ray tracing.
  • The control unit 140 may generate a temporary image associated with a second eye by transforming the reference image and then may complete the other side image for the other eye through a ray tracing on error pixels (i.e., below-threshold pixels) in the temporary image.
  • The control unit 140 may combine the one side image and the other side image to generate a virtual image.
  • The control unit 140 will be described with reference to FIGS. 2 and 3.
  • FIG. 2 is a block diagram illustrating one control unit in a head mounted display in FIG. 1.
  • The HMD 100 includes a viewpoint obtaining unit 210, a reference image generation unit 220, a temporary image generation unit 230, a ray tracing unit 240 and an HMD display unit 250.
  • The viewpoint obtaining unit 210 obtains a viewpoint including first and second eyelines generated by both eyes and a parallax.
  • The first eyeline may correspond to a left eyeline and the second eyeline may correspond to a right eyeline.
  • The parallax may correspond to an apparent displacement of the virtual object 50 as seen from both eyes, which are not on a line with the virtual object 50.
  • The reference image generation unit 220 generates a reference image associated with a first eye through a ray tracing.
  • The reference image generation unit 220 associates the first eyeline with an eye ray used in a ray tracing to generate the reference image. That is, the reference image generation unit 220 may perform a ray tracing using the first eyeline (e.g., the left eye) as an eye ray to generate the reference image.
  • Ray tracing is a rendering technique involving a rendering object, a light source and an eyeline; it may simulate the reflection generated on the virtual object 50 by a ray from the light source to compute a pixel color of the virtual object 50.
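The eye-ray formulation above can be illustrated with a minimal sketch (the scene details here, a single sphere, one point light, and Lambertian shading, are hypothetical stand-ins, not the patent's implementation): a ray is cast from the eye position along an eyeline direction, intersected with an object, and shaded by the light source to produce a pixel value.

```python
import math

def trace_eye_ray(eye, direction, sphere_center, sphere_radius, light_pos):
    """Cast an eye ray and return a grayscale shade in [0, 1], or None on a miss.

    `direction` is assumed to be a unit vector.
    """
    # Ray-sphere intersection: solve |eye + t*d - c|^2 = r^2 for t.
    oc = [e - c for e, c in zip(eye, sphere_center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - sphere_radius ** 2
    disc = b * b - 4.0 * c  # quadratic coefficient a = 1 (unit direction)
    if disc < 0:
        return None  # the eye ray misses the object
    t = (-b - math.sqrt(disc)) / 2.0
    if t < 0:
        return None  # intersection lies behind the eye
    hit = [e + t * d for e, d in zip(eye, direction)]
    normal = [(h - sc) / sphere_radius for h, sc in zip(hit, sphere_center)]
    to_light = [l - h for l, h in zip(light_pos, hit)]
    norm = math.sqrt(sum(x * x for x in to_light))
    to_light = [x / norm for x in to_light]
    # Lambertian term: the simulated reflection that yields the pixel color.
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))
```

Running this for every display pixel, with the eyeline through that pixel as `direction`, would produce the reference image for the first eye.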
  • The temporary image generation unit 230 transforms the reference image to generate a temporary image for a second eye.
  • The temporary image generation unit 230 may change the point of view of the reference image from the first eye to the second eye to generate the temporary image.
  • The temporary image may be incomplete for the second eye because some regions are viewed only from the first eye (i.e., not viewed from the second eye).
  • The temporary image generation unit 230 may warp the reference image by reflecting the parallax generated by both eyes on the reference image to generate the temporary image.
  • The temporary image generation unit 230 may convert the reference image into the temporary image, using the parallax to change the point of view from the first eye to the second eye. This may involve calculating dimensional directions and light effects on the virtual object 50. That is, the temporary image generation unit 230 may perform an image warping on the reference image to generate the temporary image, which reflects distortions due to the parallax on the reference image.
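A minimal sketch of such a warp, assuming the standard stereo disparity model d = f * B / z (the patent does not spell out a particular formulation): each reference pixel is shifted horizontally according to its depth, and pixels that receive no value are left empty for the later compensation step.

```python
def warp_to_second_eye(reference, depth, focal, baseline):
    """Warp a first-eye image toward the second eye using per-pixel depth.

    reference: rows of pixel values; depth: per-pixel depth, same shape.
    Returns a same-sized image where unwritten pixels stay None (error pixels).
    """
    h, w = len(reference), len(reference[0])
    temporary = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Stereo disparity (assumed model): d = focal * baseline / z.
            disparity = int(round(focal * baseline / depth[y][x]))
            nx = x - disparity  # horizontal shift toward the second eye
            if 0 <= nx < w:
                temporary[y][nx] = reference[y][x]
    return temporary
```

The `None` entries correspond to the empty (anti-dimensional) regions that only the first eye could see; the ray tracing unit fills them afterward.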
  • The ray tracing unit 240 associates the second eyeline with an eye ray to perform a ray tracing on error pixels (i.e., below-threshold pixels) in the temporary image.
  • An error pixel may be a pixel inadequately reflecting a color value or a pixel inadequately reflecting a dimensional direction.
  • When the temporary image is generated by warping the reference image, error pixels may arise due to the parallax. The ray tracing unit 240 may then perform a ray tracing on pixels below a pre-determined threshold among the pixels in the temporary image to obtain pixel values, thereby compensating for the error pixels.
  • The error pixels due to the parallax may be plural and grouped into areas. One such area corresponds to an empty region, indicating an anti-dimensional region that is not visually displayed through the first eyeline when the warped reference image (i.e., the temporary image) is generated.
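The compensation step can be sketched as follows (a toy data model: `None` marks an empty warped pixel, and `trace_fn` is a hypothetical stand-in for a ray trace along the second eyeline, not an API from the patent):

```python
def fill_error_pixels(temporary, trace_fn, threshold=0.0):
    """Replace empty (None) or below-threshold pixels with ray-traced values.

    trace_fn(x, y) stands in for ray tracing with the second eyeline as eye ray.
    """
    out = [row[:] for row in temporary]
    for y, row in enumerate(out):
        for x, value in enumerate(row):
            if value is None or value < threshold:
                out[y][x] = trace_fn(x, y)  # retrace only the error pixel
    return out
```

Only the error pixels trigger a ray trace, which is the source of the claimed reduction in operation complexity relative to tracing the second eye's full image.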
  • The HMD display unit 250 may combine the reference image generated by the reference image generation unit 220 for the first eye with the ray-traced warped reference image generated by the ray tracing unit 240 to provide the virtual image to a user.
  • FIG. 3 is a block diagram illustrating another control unit in a head mounted display in FIG. 1.
  • FIG. 3 is an example implementation that first partitions a reference image into a plurality of sub-reference images and then warps each partitioned sub-reference image and compensates it through a ray tracing to generate the virtual image.
  • The HMD 100 includes a viewpoint obtaining unit 310, a reference image generation unit 320, a temporary image generation unit 330 including an image partitioning unit 332 and a temporary partition image generation unit 334, a ray tracing unit 340 and an HMD display unit 350.
  • The viewpoint obtaining unit 310, the reference image generation unit 320, the ray tracing unit 340 and the HMD display unit 350 may be substantially the same as the corresponding units in FIG. 2. For convenience, only the main differences between them will be explained.
  • The temporary image generation unit 330 may include the image partitioning unit 332 and the temporary partition image generation unit 334.
  • The image partitioning unit 332 partitions the reference image (e.g., the left-side image) into a plurality of sub-reference images.
  • The image partitioning unit 332 may variably partition the reference image into the plurality of the sub-reference images through a combination of a plurality of sub-windows, where the combination completes the reference window. For example, when the size of the reference window corresponds to 16x16 pixels, the sizes of the sub-windows may correspond to 4x4, 8x8 and 16x16 pixels.
  • In one embodiment, the image partitioning unit 332 may use one type of sub-window to partition the reference image. For example, the image partitioning unit 332 may use an 8x8 sub-window to partition the reference image into four sub-reference images. In another embodiment, the image partitioning unit 332 may use various types of sub-windows to partition the reference image. For example, the image partitioning unit 332 may use an 8x8 sub-window and a 4x4 sub-window to partition the reference image into three 8x8 sub-reference images and four 4x4 sub-reference images.
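The two partitioning schemes just described can be sketched as follows (the tile coordinates are an assumed layout; the patent does not fix one): a uniform partition using a single sub-window size, and the mixed case where three 8x8 and four 4x4 sub-windows together complete a 16x16 reference window.

```python
def partition(width, height, tile=8):
    """Uniform partition: aligned tile x tile sub-windows covering the window."""
    return [(x, y, tile) for y in range(0, height, tile)
                         for x in range(0, width, tile)]

def mixed_partition_16x16():
    """Variable partition from the text: three 8x8 tiles plus four 4x4 tiles.

    The 4x4 tiles fill the remaining 8x8 quadrant, so the combination
    completes the 16x16 reference window. (Tile placement is an assumption.)
    """
    tiles = [(0, 0, 8), (8, 0, 8), (0, 8, 8)]               # three 8x8 sub-windows
    tiles += [(x, y, 4) for y in (8, 12) for x in (8, 12)]  # four 4x4 sub-windows
    return tiles
```

Each tuple is (x, y, size); summing size * size over the tiles recovers the full window area, confirming the sub-windows tile the reference window without gaps.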
  • The temporary partition image generation unit 334 reflects the parallax generated by both eyes to warp the plurality of the sub-reference images, thereby generating the temporary image. That is, the temporary partition image generation unit 334 may perform an image warping on each of the plurality of the sub-reference images to generate the temporary image.
  • The ray tracing unit 340 associates the second eyeline with the eye ray to perform a ray tracing on an error pixel (i.e., a below-threshold pixel) in the plurality of the warped sub-reference images or on a warped sub-reference image containing the error pixel.
  • The ray tracing unit 340 may perform the ray tracing in parallel. Because the ray tracing procedure for an error pixel is described for the ray tracing unit 240, the ray tracing procedure for a warped sub-reference image containing an error pixel will be described here.
  • The ray tracing unit 340 may associate the second eyeline with an eye ray to perform a ray tracing on at least one sub-reference image including an error pixel or an empty region (e.g., a region including below-threshold pixels) to generate the ray-traced warped image.
  • The empty region may include an anti-dimensional region in the plurality of the warped sub-reference images, indicating an area viewed from one eye but not viewed from the other eye in the reference image.
  • The image partitioning unit 332 may classify a plurality of first sub-reference images with anti-dimensional regions and a plurality of second sub-reference images with dimensional regions, before or after partitioning the reference image into the plurality of the sub-reference images.
  • The temporary partition image generation unit 334 may reflect the parallax generated by both eyes to warp the plurality of the second sub-reference images, generating the temporary image.
  • The ray tracing unit 340 may associate the second eyeline with an eye ray used in a ray tracing to perform the ray tracing on the plurality of the first sub-reference images and may combine the plurality of the first sub-reference images and the plurality of the warped second sub-reference images to generate the ray-traced warped image.
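This classify-warp-trace-combine flow reduces to a small sketch (the tile representation and the helper functions are hypothetical stand-ins): sub-images flagged as containing an anti-dimensional region are handed to the ray tracer, the rest are warped, and the two groups are merged in place.

```python
def build_virtual_image(sub_images, has_hole, warp_fn, trace_fn):
    """Combine per-tile results into the second-eye image.

    sub_images: list of tiles; has_hole[i]: True if tile i contains an
    anti-dimensional region and must be ray traced rather than warped.
    """
    return [trace_fn(tile) if hole else warp_fn(tile)
            for tile, hole in zip(sub_images, has_hole)]
```

Because each tile is processed independently, the traced and warped tiles can also be computed in parallel, matching the parallel ray tracing mentioned for unit 340.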
  • FIG. 4 is a flowchart illustrating procedures for generating virtual reality on the head mounted display system in FIG. 1.
  • FIG. 5 is a conceptual diagram illustrating dimensional and anti-dimensional regions.
  • The reference image generation unit 220 or 320 generates a reference image associated with a first eye through a ray tracing (Step S410).
  • The temporary image generation unit 230 or 330 transforms the reference image to generate the temporary image associated with a second eye (Step S420).
  • The temporary image generation unit 230 or 330 may reflect the parallax generated by both eyes on the reference image and may warp the reference image to generate the temporary image.
  • The temporary image generation unit 230 or 330 may wholly or partly transform the reference image.
  • The ray tracing unit 240 or 340 performs a ray tracing by using the second eyeline as an eye ray on the error pixels (i.e., below-threshold pixels) in the temporary image (Step S430). In one embodiment, the ray tracing unit 240 or 340 performs a ray tracing on each error pixel in the temporary image or on each error sub-block therein, each sub-block indicating a partitioned and transformed portion of the reference image containing an error pixel. Finally, the ray tracing unit 240 or 340 generates the ray-traced transformed reference image.
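Steps S410 through S430 can be tied together in one sketch, using the same toy conventions as the fragments above (`None` marks an error pixel; all three helpers are hypothetical stand-ins for the units described, not the patent's interfaces):

```python
def generate_virtual_reality(trace_left, trace_right, warp):
    """Toy pipeline: reference image, parallax warp, then selective retracing."""
    reference = trace_left()              # S410: ray trace with the first eyeline
    temporary = warp(reference)           # S420: warp by the binocular parallax
    virtual = [[trace_right(x, y) if v is None else v
                for x, v in enumerate(row)]
               for y, row in enumerate(temporary)]  # S430: fill error pixels only
    return reference, virtual
```

The returned pair corresponds to the two display outputs: the reference image for one eye and the ray-traced transformed reference image for the other.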
  • The temporary image generation unit 230 or 330 transforms the reference image to generate the temporary image for the right eye.
  • A dimensional region 510 may include a region viewed from both eyes, which is adequately transformed.
  • An anti-dimensional region 520 may include a region viewed from the right eye but not viewed from the left eye, which is inadequately transformed.
  • The ray tracing unit 240 or 340 searches for the error pixels in the temporary image, which are included in the anti-dimensional region 520, and performs a ray tracing thereon.
  • In one embodiment, a sub-block including an error pixel may be processed in place of the individual error pixel.
  • The HMD display unit 250 or 350 generates the virtual image for both eyes. That is, the HMD display unit 250 or 350 respectively provides the reference image to one display unit (e.g., the left display unit 110) and the ray-traced transformed reference image to the other display unit (e.g., the right display unit 120).

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A method of generating virtual reality on a head mounted display includes (a) generating a reference image associated with a first eye through a ray tracing, (b) generating a temporary image associated with a second eye, the temporary image being generated by transforming the reference image and (c) performing a ray tracing on a below threshold pixel among pixels in the temporary image to generate a virtual image.

Description

METHOD OF GENERATING VIRTUAL REALITY ON HEAD MOUNTED DISPLAY AND HEAD MOUNTED DISPLAY PERFORMING THE SAME
[0001] This disclosure relates to a virtual reality technique and, more particularly, to a method of generating virtual reality on a head mounted display and a head mounted display capable of using a ray tracing to implement a virtual reality on a head mounted display.
[0002] Virtual Reality (VR), which can be referred to as immersive multimedia or computer-simulated life, replicates an environment that simulates physical presence in places in the real world or imagined worlds and lets the user interact in that world. Virtual reality artificially creates sensory experiences, which can include sight, hearing, touch, smell, and taste.
[0001] Most up-to-date virtual reality environments are displayed either on a computer screen or with special stereoscopic displays, and some simulations include additional sensory information and focus on real sound through speakers or headphones targeted towards VR users. Some advanced haptic systems now include tactile information, generally known as force feedback, in medical, gaming and military applications. Furthermore, virtual reality covers remote communication environments which provide virtual presence of users with the concepts of telepresence and telexistence or a virtual artifact (VA), either through the use of standard input devices such as a keyboard and mouse, or through multimodal devices such as a wired glove or omnidirectional treadmills. The simulated environment can be similar to the real world in order to create a lifelike experience (for example, in simulations for pilot or combat training), or it can differ significantly from reality, as in VR games.
[0002] Korean Patent Publication No. 10-0453225 relates to a three-dimensional VR (Virtual Reality) implementing client system and method. The client system comprises an applet, a communication module, a browser, and an ASP (Active Server Page) support module. The applet enables the client system to communicate X3D data with a server. The communication module, connected to the applet, enables communication between the client system and the server. The browser parses the X3D data transmitted by the server via the communication module, constructs a scene graph, renders the scene graph according to a preset type, and displays the rendered scene graph on a display. The ASP support module processes the data transmitted by the server on an external window according to a user event generated in the VR implemented by the browser.
[0003] One embodiment of the present invention proposes a head mounted display technology capable of using a ray tracing to implement a virtual reality on a head mounted display.
[0004] One embodiment of the present invention proposes a head mounted display technology capable of using one eyeline image as an eye ray in a ray tracing to generate another eyeline image, thereby decreasing an operation complexity. Herein, one eyeline image may be an image shown by one eye (e.g., a left eye) and another eyeline image may be an image shown by another eye (e.g., a right eye).
[0005] One embodiment of the present invention proposes a head mounted display technology capable of providing a real-time interface to reduce eyestrain.
[0006] In some embodiments, a method of generating virtual reality on an HMD (Head-Mounted Display) includes (a) generating a reference image associated with a first eye through a ray tracing, (b) generating a temporary image associated with a second eye, the temporary image being generated by transforming the reference image and (c) performing a ray tracing on a below threshold pixel among pixels in the temporary image to generate a virtual image.
[0007] The method may further include, before step (a), obtaining a viewpoint that is generated by both eyes and includes first and second eyelines and a parallax.
[0008] In one embodiment, the step (a) may include associating a first eyeline with an eye ray used in a ray tracing to generate the reference image. The step (b) may include reflecting a parallax generated by both eyes on the reference image to warp the reference image. The step (c) may include associating a second eyeline with an eye ray used in a ray tracing to perform a ray tracing on an empty region, the empty region corresponding to an anti-dimensional region that is not visually displayed through the first eyeline when the reference image is dimensionally displayed.
[0009] In another embodiment, the step (b) may include (b1) partitioning the reference image into a plurality of sub-reference images and (b2) reflecting a parallax generated by both eyes in each of the plurality of the sub-reference images to warp the reference image. The step (b1) may further include variably partitioning the reference image into the plurality of the sub-reference images through a combination of a plurality of sub-windows, the combination thereof completing the reference window. The step (c) may include (c1) associating a second eyeline with an eye ray used in a ray tracing to perform a ray tracing on at least one sub-reference image, the at least one sub-reference image including an empty region corresponding to an anti-dimensional region among the plurality of the warped sub-reference images. In one embodiment, the step (c1) may further include performing the ray tracing on the plurality of the warped sub-reference images in parallel.
[0010] In still another embodiment, the step (b) may include (b1) classifying a plurality of first sub-reference images with an anti-dimensional region and a plurality of second sub-reference images with a dimensional region, before or after partitioning the reference image into a plurality of sub-reference images and (b2) reflecting a parallax generated by both eyes in each of the plurality of the second sub-reference images to warp the plurality of the second sub-reference images. The step (c) may include (c1) associating a second eyeline with an eye ray used in a ray tracing to perform a ray tracing on the plurality of the first sub-reference images. The step (c) may further include (c2) combining the plurality of the first sub-reference images generated through the ray tracing and the plurality of the warped second sub-reference images to generate the virtual image.
[0011] In some embodiments, a head mounted display includes a viewpoint obtaining unit configured to obtain a viewpoint that is generated by both eyes and includes a first and a second eyelines and a parallax, a reference generation unit configured to generate a reference image associated with a first eye through a ray tracing, a temporary image generation unit configured to generate a temporary image associated with a second eye, the temporary image being generated by transforming the reference image and a ray tracing unit configured to perform a ray tracing on a below threshold pixel among pixels in the temporary image.
[0012] A head mounted display technology according to an example embodiment of the present invention may use a ray tracing to implement a virtual reality on a head mounted display.
[0013] A head mounted display technology according to an example embodiment of the present invention may use one eyeline image as an eye ray in a ray tracing to generate another eyeline image, thereby decreasing an operation complexity. Herein, one eyeline image may be an image shown by one eye (e.g., a left eye) and another eyeline image may be an image shown by another eye (e.g., a right eye).
[0014] A head mounted display technology according to an example embodiment of the present invention may provide a real-time interface to reduce eyestrain.
[0015] FIG. 1 is a conceptual diagram illustrating a head mounted display system according to an example embodiment of the present invention.
[0016] FIG. 2 is a block diagram illustrating one control unit in a head mounted display in FIG. 1.
[0017] FIG. 3 is a block diagram illustrating another control unit in a head mounted display in FIG. 1.
[0018] FIG. 4 is a flowchart illustrating procedures for generating virtual reality on a head mounted display system in FIG. 1.
[0019] FIG. 5 is a conceptual diagram illustrating dimensional and anti-dimensional regions.
[0020] The explanation of the present invention is merely an embodiment for structural or functional explanation, so the scope of the present invention should not be construed as limited to the embodiments explained herein. That is, since the embodiments may be implemented in several forms without departing from the characteristics thereof, it should also be understood that the described embodiments are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within the scope as defined in the appended claims. Therefore, various changes and modifications that fall within the scope of the claims, or equivalents of such scope, are intended to be embraced by the appended claims.
[0021] Terms described in the present disclosure may be understood as follows.
[0022] While terms such as "first" and "second," etc., may be used to describe various components, such components must not be understood as being limited to the above terms. The above terms are used to distinguish one component from another. For example, a first component may be referred to as a second component without departing from the scope of rights of the present invention, and likewise a second component may be referred to as a first component.
[0023] It will be understood that when an element is referred to as being "connected to" another element, it can be directly connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected to" another element, no intervening elements are present. In addition, unless explicitly described to the contrary, the word "comprise" and variations such as "comprises" or "comprising" will be understood to imply the inclusion of stated elements but not the exclusion of any other elements. Meanwhile, other expressions describing relationships between components, such as "between" and "immediately between" or "adjacent to" and "directly adjacent to," may be construed similarly.
[0024] Singular forms "a", "an" and "the" in the present disclosure are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that terms such as "including" or "having," etc., are intended to indicate the existence of the features, numbers, operations, actions, components, parts, or combinations thereof disclosed in the specification, and are not intended to preclude the possibility that one or more other features, numbers, operations, actions, components, parts, or combinations thereof may exist or may be added.
[0025] Identification letters (e.g., a, b, c, etc.) in respective steps are used for the sake of explanation and do not describe the order of the respective steps. The respective steps may be performed in a different order from the mentioned order unless a specific order is clearly described in context. Namely, the respective steps may be performed in the same order as described, may be performed substantially simultaneously, or may be performed in reverse order.
[0026] The present invention may be implemented as machine-readable codes on a machine-readable medium. The machine-readable medium includes any type of recording device for storing machine-readable data. Examples of the machine-readable recording medium include a read-only memory (ROM), a random access memory (RAM), a compact disk-read only memory (CD-ROM), a magnetic tape, a floppy disk, and optical data storage. The medium may also be carrier waves (e.g., Internet transmission). The machine-readable recording medium may be distributed among networked machine systems which store and execute machine-readable codes in a de-centralized manner.
[0027] The terms used in the present application are merely used to describe particular embodiments, and are not intended to limit the present invention. Unless otherwise defined, all terms used herein, including technical or scientific terms, have the same meanings as those generally understood by those with ordinary knowledge in the field of art to which the present invention belongs. Such terms as those defined in a generally used dictionary are to be interpreted to have meanings equal to the contextual meanings in the relevant field of art, and are not to be interpreted to have ideal or excessively formal meanings unless clearly defined in the present application.
[0028]
[0029] The present invention is supported by Seoul Industry Promotion Institute in 2013 as 「Technology Development Supporting Project For Patent Technology Commercialization」 and is a result of the project "Software Kit Development For Developing Ray Tracing Rendering Contents" from September 1, 2013 to August 31, 2014 by SILICONARTS, INC.
[0030] [National R&D Project Supporting the present invention]
[0031] [Project Serial Number] PA130017
[0032] [Department] Seoul Special City
[0033] [Research Management Organization] Seoul Industry Promotion Institute
[0034] [Research Project Name] Seoul Industry-academic Cooperative Project (Technology Development Supporting Project For Patent Technology Commercialization in 2013)
[0035] [Research Subject Name] Software Kit Development For Developing Ray Tracing Rendering Contents
[0036] [Contribution Ratio] 1/1
[0037] [Leading Organization] SILICONARTS, INC.
[0038] [Research Period] September 1, 2013 ~ August 31, 2014
[0039]
[0040] FIG. 1 is a conceptual diagram illustrating a head mounted display system according to an example embodiment of the present invention.
[0041] Referring to FIG. 1, a head mounted display (HMD) system 10 includes a virtual object 50 and an HMD 100.
[0042] The virtual object 50 may correspond to a virtual focusing region generated by the HMD 100. That is, the virtual object 50 may be a 3D object that is not real but is dimensionally drawn by the HMD 100. In one embodiment, the virtual object 50 may be totally generated by the HMD 100. In another embodiment, the virtual object 50 may be generated by combining a real object positioned in front of the HMD 100 and a partially drawn object generated by the HMD 100.
[0043] The HMD 100 includes a left display unit 110, a right display unit 120, a network interface unit 130 and a control unit 140. The HMD 100 may be worn on the head or as part of a helmet and has a small display optic in front of each eye (binocular HMD).
[0044] The left and right display units 110 and 120 may correspond to display optics in front of corresponding eyes. In one embodiment, the left display unit 110 may generate a left-side image for a left eye and the right display unit 120 may generate a right-side image for a right eye, so that the left and right display units 110 and 120 may virtually generate a dimensional image.
[0045] The network interface unit 130 may transmit left and right images to an external device. Also, the network interface unit 130 may receive a dimensional image or left and right images from an external device. In one embodiment, the network interface unit 130 may be implemented using Bluetooth, Wi-Fi and so on.
[0046] The control unit 140 may generate one side image for one eye as a reference image associated with a first eye through a ray tracing. The control unit 140 may generate a temporary image associated with a second eye by transforming the reference image and then may complete another side image for another eye through a ray tracing on an error pixel (i.e., a below threshold pixel) in the temporary image. Finally, the control unit 140 may combine the one side image and the other side image to generate a virtual image. The control unit 140 will be described with reference to FIGS. 2 and 3.
[0047] FIG. 2 is a block diagram illustrating one control unit in a head mounted display in FIG. 1.
[0048] Referring to FIG. 2, the HMD 100 includes a viewpoint obtaining unit 210, a reference image generation unit 220, a temporary image generation unit 230, a ray tracing unit 240 and an HMD display unit 250.
[0049] The viewpoint obtaining unit 210 obtains a viewpoint including first and second eyelines generated by both eyes and a parallax. For example, the first eyeline may correspond to a left eyeline and the second eyeline may correspond to a right eyeline. The parallax may correspond to an apparent displacement of the virtual object 50 as seen from both eyes that are not on a line with the virtual object 50.
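The parallax above is an apparent displacement between the two eyelines. As an illustrative sketch only (this first-order relation, the function name and all parameter values are assumptions, not taken from the patent), the pixel disparity of a scene point can be approximated from the eye baseline, a focal length and the point's depth:

```python
# Illustrative approximation (assumed, not the patent's formulation): the
# horizontal pixel shift between left- and right-eye views of a point.
def disparity(baseline_m, focal_px, depth_m):
    """Disparity in pixels for eye baseline (m), focal length (px), depth (m)."""
    return focal_px * baseline_m / depth_m

# A point 2 m away, with a 0.064 m eye separation and an 800 px focal length:
print(round(disparity(0.064, 800.0, 2.0), 2))  # 25.6
```

Nearer points thus shift more between the eyes, which is why regions visible to only one eye appear near depth discontinuities.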
[0050] The reference image generation unit 220 generates a reference image associated with a first eye through a ray tracing. In one embodiment, the reference image generation unit 220 associates the first eyeline with an eye ray used in a ray tracing to generate the reference image. That is, the reference image generation unit 220 may perform a ray tracing by using the first eyeline (e.g., left eye) as an eye ray to generate the reference image. The ray tracing is a rendering technique involving a rendering object, a light source and an eyeline, which may simulate a reflection generated on the virtual object 50 by a ray from the light source to generate a pixel color in the virtual object 50.
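A minimal sketch of such a ray tracing, assuming a single-sphere scene with Lambertian shading (the scene, all function names and parameters are illustrative and not the patent's implementation), casts one eye ray per pixel from the first-eye position:

```python
import math

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def _hit_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def trace_reference(width, height, eye, center, radius, light):
    # One eye ray per pixel; hits get a Lambertian shade, misses stay 0.0.
    image = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            dx = (x + 0.5) / width - 0.5
            dy = (y + 0.5) / height - 0.5
            d = _normalize((dx, dy, 1.0))      # image plane at z = 1
            t = _hit_sphere(eye, d, center, radius)
            if t is not None:
                p = tuple(e + t * di for e, di in zip(eye, d))
                n = _normalize(tuple(pi - ci for pi, ci in zip(p, center)))
                l = _normalize(tuple(li - pi for li, pi in zip(light, p)))
                image[y][x] = max(0.0, sum(ni * li for ni, li in zip(n, l)))
    return image

# Sphere centered 5 units ahead of the eye: the center pixel hits, a corner misses.
ref = trace_reference(8, 8, (0.0, 0.0, 0.0), (0.0, 0.0, 5.0), 1.0, (0.0, 5.0, 0.0))
print(ref[4][4] > 0.0, ref[0][0] == 0.0)  # True True
```

The resulting grid would stand in for the reference image that the later units warp and compensate.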
[0051] The temporary image generation unit 230 transforms the reference image to generate a temporary image for a second eye. For example, the temporary image generation unit 230 may change a point of view from the first eye into the second eye for the reference image to generate the temporary image. The temporary image may be incomplete for the second eye due to some regions viewed only from the first eye (i.e., not viewed from the second eye).
[0052] In one embodiment, the temporary image generation unit 230 may warp the reference image by reflecting the parallax generated by both eyes on the reference image to generate the temporary image. The temporary image generation unit 230 may convert the reference image into the temporary image, which may use the parallax for changing a point of view from the first eye to the second eye. This may lead to calculating dimensional directions and light effects on the virtual object 50. That is, the temporary image generation unit 230 may perform an image warping on the reference image to generate the temporary image, which may reflect distortions due to the parallax on the reference image.
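An illustrative one-row sketch of such a warp (assumed, not the patent's implementation; the per-pixel disparity values and names are hypothetical) shifts each reference pixel by its disparity and leaves unfilled target pixels marked, which is exactly where error pixels arise:

```python
# Hypothetical forward warp of one scanline: each reference pixel is shifted
# horizontally by its disparity; target pixels that receive no source pixel
# stay None and mark the empty (anti-dimensional) region.
def warp_reference(ref_row, disparity_row):
    out = [None] * len(ref_row)            # None marks an error/empty pixel
    for x, (color, d) in enumerate(zip(ref_row, disparity_row)):
        tx = x - d                          # shift toward the second eye
        if 0 <= tx < len(out):
            out[tx] = color
    return out

# Nearer pixels ('c', 'd') shift by 2 and occlude 'a'/'b'; holes appear behind them.
row = warp_reference(['a', 'b', 'c', 'd'], [0, 0, 2, 2])
print(row)  # ['c', 'd', None, None]
```

The `None` entries are the "below threshold" pixels that the ray tracing unit would later re-render.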
[0053] The ray tracing unit 240 associates the second eyeline with an eye ray to perform a ray tracing on an error pixel (i.e., below threshold pixel) in the temporary image. Herein, the error pixel may include a pixel inadequately reflecting a color value or a pixel inadequately reflecting a dimensional direction.
[0054] When the temporary image is generated by warping the reference image, error pixels may be generated due to the parallax. Then the ray tracing unit 240 may perform a ray tracing on a pixel below a pre-determined threshold among the pixels in the temporary image to obtain a pixel value, thereby compensating for the error pixel. In one embodiment, the error pixels due to the parallax may be plural and grouped into areas. One of the areas corresponds to an empty region, that is, an anti-dimensional region that is not visually displayed through the first eyeline when the warped reference image (i.e., the temporary image) is generated.
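The compensation step can be sketched as follows (illustrative only; `trace_pixel` stands in for a second-eye per-pixel ray tracer and is an assumption, as is the use of `None` as the below-threshold marker):

```python
# Hedged sketch of error-pixel compensation: only pixels flagged as invalid
# after warping are re-rendered with a per-pixel trace; valid warped pixels
# are kept, which is the source of the operation-complexity saving.
def compensate(warped, trace_pixel):
    return [trace_pixel(x) if c is None else c for x, c in enumerate(warped)]

filled = compensate(['c', 'd', None, None], lambda x: f'rt{x}')
print(filled)  # ['c', 'd', 'rt2', 'rt3']
```

Only two of the four pixels trigger a trace here, illustrating why warping first and tracing only the empty region reduces work versus tracing the full second-eye image.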
[0055] The HMD display unit 250 may combine the reference image generated by the reference image generation unit 220 for the first eye and the ray-traced warped reference image generated by the ray tracing unit 240 to provide the virtual image to a user.
[0056] FIG. 3 is a block diagram illustrating another control unit in a head mounted display in FIG. 1. FIG. 3 is an example implementation that firstly partitions a reference image into a plurality of sub-reference images and secondly warps each of the partitioned sub-reference images and compensates through a ray tracing to generate the virtual image.
[0057] Referring to FIG. 3, the HMD 100 includes a viewpoint obtaining unit 310, a reference image generation unit 320, a temporary image generation unit 330 including an image partitioning unit 332 and a temporary partition image generation unit 334, a ray tracing unit 340 and an HMD display unit 350.
[0058] The viewpoint obtaining unit 310, the reference image generation unit 320, the ray tracing unit 340 and the HMD display unit 350 may be substantially the same as the corresponding units in FIG. 2. For convenience's sake, the main differences therebetween will be explained.
[0059] In FIG. 3, the temporary image generation unit 330 may include the image partitioning unit 332 and the temporary partition image generation unit 334. The image partitioning unit 332 partitions the reference image (e.g., left-side image) into a plurality of sub-reference images. In one embodiment, the image partitioning unit 332 may variably partition the reference image into the plurality of the sub-reference images through a combination of a plurality of sub-windows, and the combination thereof may complete the reference window. For example, when a size of the reference window corresponds to 16X16 pixels, sizes of the plurality of the sub-windows may correspond to 4X4, 8X8 and 16X16 pixels.
[0060] In one embodiment, the image partitioning unit 332 may use one type of the plurality of the sub-windows to partition the reference image. For example, the image partitioning unit 332 may use an 8X8 sub-window to partition the reference image into four sub-reference images. In another embodiment, the image partitioning unit 332 may use various types of the plurality of the sub-windows to partition the reference image. For example, the image partitioning unit 332 may use an 8X8 sub-window and a 4X4 sub-window to partition the reference image into three 8X8 sub-reference images and four 4X4 sub-reference images.
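The single-size case above can be sketched as follows (an assumption-laden illustration, not the patent's implementation; mixed sub-window sizes would follow the same bookkeeping):

```python
# Illustrative partitioning of a reference window into fixed-size sub-windows.
# Each tuple (x, y, w, h) is one sub-reference image the later units would
# warp and, if needed, ray trace individually.
def partition(width, height, block):
    tiles = []
    for y in range(0, height, block):
        for x in range(0, width, block):
            tiles.append((x, y, block, block))
    return tiles

tiles = partition(16, 16, 8)
print(len(tiles))                          # 4 sub-reference images
print(sum(w * h for _, _, w, h in tiles))  # 256, i.e. the full 16X16 window
```

The second print confirms the combination of sub-windows completes the reference window, matching the 8X8 example in the paragraph above.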
[0061] The temporary partition image generation unit 334 reflects the parallax generated by both eyes to warp the plurality of the sub-reference images, thereby generating the temporary image. That is, the temporary partition image generation unit 334 may perform an image warping on each of the plurality of the sub-reference images to generate the temporary image.
[0062] The ray tracing unit 340 associates the second eyeline with the eye ray to perform a ray tracing on an error pixel (i.e., a below threshold pixel) in the plurality of the warped sub-reference images or on a warped sub-reference image with the error pixel. In one embodiment, the ray tracing unit 340 may perform a ray tracing in parallel. Because a ray tracing procedure for the error pixel is described in the ray tracing unit 240, a ray tracing procedure for the warped sub-reference image with the error pixel will be described.
[0063] In one embodiment, the ray tracing unit 340 may associate the second eyeline with an eye ray to perform a ray tracing on at least one of the sub-reference images including an error pixel or an empty region (e.g., a region including the below threshold pixel) to generate the ray-traced warped image. Herein, the empty region may include an anti-dimensional region in the plurality of the warped sub-reference images that indicates an area viewed from one eye and not viewed from the other eye in the reference image.
[0064] In another embodiment, the image partitioning unit 332 may classify a plurality of first sub-reference images with anti-dimensional regions and a plurality of second sub-reference images with dimensional regions before or after partitioning the reference image into the plurality of the sub-reference images. The temporary partition image generation unit 334 may reflect the parallax generated by both eyes to warp the plurality of the second sub-reference images to generate the temporary image. Then the ray tracing unit 340 may associate the second eyeline with an eye ray used in a ray tracing to perform the ray tracing on the plurality of the first sub-reference images and may combine the plurality of the first sub-reference images and the plurality of the warped second sub-reference images to generate the ray-traced warped image.
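The classify-warp-trace-combine flow of this paragraph can be sketched as follows (illustrative only; `ray_trace`, `warp` and the tile classifier are hypothetical stand-ins, not the patent's implementation):

```python
# Assumed sketch of paragraph [0064]: sub-images touching the anti-dimensional
# region are fully ray traced, the remainder are warped, and both groups are
# merged back by tile position to form the second-eye image.
def render_second_eye(tiles, is_anti_dimensional, ray_trace, warp):
    out = {}
    for tile in tiles:
        out[tile] = ray_trace(tile) if is_anti_dimensional(tile) else warp(tile)
    return out

result = render_second_eye(
    [(0, 0), (8, 0), (0, 8), (8, 8)],
    lambda t: t == (8, 8),                 # only one tile is occluded/empty
    lambda t: ('traced', t),
    lambda t: ('warped', t),
)
print(result[(8, 8)])  # ('traced', (8, 8))
```

Because the per-tile calls are independent, they could also run in parallel, consistent with the parallel ray tracing mentioned for the ray tracing unit 340.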
[0065] FIG. 4 is a flowchart illustrating procedures for generating virtual reality on a head mounted display system in FIG. 1 and FIG. 5 is a conceptual diagram illustrating dimensional and anti-dimensional regions.
[0066] In FIG. 4, after the viewpoint obtaining unit 210 or 310 obtains a viewpoint including first and second eyelines generated by both eyes and a parallax, the reference image generation unit 220 or 320 generates a reference image associated with a first eye through a ray tracing (Step S410).
[0067] The temporary image generation unit 230 or 330 transforms the reference image to generate the temporary image associated with a second eye (Step S420). In one embodiment, the temporary image generation unit 230 or 330 may reflect the parallax generated by both eyes on the reference image and may warp the reference image to generate the temporary image. The temporary image generation unit 230 or 330 may wholly or partly transform the reference image.
[0068] The ray tracing unit 240 or 340 performs a ray tracing by using the second eyeline as an eye ray on the error pixel (i.e., below threshold pixel) in the temporary image (Step S430). In one embodiment, the ray tracing unit 240 or 340 performs a ray tracing on each of error pixels in the temporary image or on each of error sub-blocks therein, each indicating a partitioned and transformed reference image with an error pixel. Finally, the ray tracing unit 240 or 340 generates the ray-traced transformed reference image.
[0069] In FIG. 5, assuming the reference image to be a left-side image viewed from a left eye, the temporary image generation unit 230 or 330 transforms the reference image to generate the temporary image for a right eye. A dimensional region 510 may include a region viewed from both eyes and be adequately transformed. An anti-dimensional region 520 may include a region viewed from the right eye and not viewed from the left eye and be inadequately transformed. The ray tracing unit 240 or 340 searches for the error pixel in the temporary image, which is included in the anti-dimensional region 520, and performs a ray tracing thereon. In one embodiment, the error pixel may be processed as a sub-block including the error pixel.
[0070] The HMD display unit 250 or 350 generates the virtual image for both eyes. That is, the HMD display unit 250 or 350 respectively provides the reference image to one display unit (e.g., the left display unit 110) and the ray-traced transformed reference image to another display unit (e.g., the right display unit 120).
[0071] While the disclosure has been described in terms of exemplary embodiments, those skilled in the art will recognize that the disclosure can be practiced with modifications in the spirit and scope of the appended claims.
100 : HEAD MOUNTED DISPLAY (HMD)
110 : LEFT DISPLAY UNIT
120 : RIGHT DISPLAY UNIT
130 : NETWORK INTERFACE UNIT
140 : CONTROL UNIT
210 : VIEWPOINT OBTAINING UNIT
220 : REFERENCE IMAGE GENERATION UNIT
230 : TEMPORARY IMAGE GENERATION UNIT
240 : RAY TRACING UNIT
250 : HMD DISPLAY UNIT

Claims (13)

  1. A method of generating virtual reality on an HMD (Head-Mounted Display) comprising:
    (a) generating a reference image associated with a first eye through a ray tracing;
    (b) generating a temporary image associated with a second eye, the temporary image being generated by transforming the reference image; and
    (c) performing a ray tracing on a below threshold pixel among pixels in the temporary image to generate a virtual image.
  2. The method of claim 1, further comprising:
    before the step (a), obtaining a viewpoint including a first and a second eyelines and a parallax, the viewpoint being generated by both eyes.
  3. The method of claim 1, wherein the step (a) includes associating a first eyeline with an eye ray used in a ray tracing to generate the reference image.
  4. The method of claim 1, wherein the step (b) includes reflecting a parallax generated by both eyes on the reference image to warp the reference image.
  5. The method of claim 4, wherein the step (c) includes associating a second eyeline with an eye ray used in a ray tracing to perform a ray tracing on an empty region, the empty region corresponding to anti-dimensional region being not visually displayed through the first eyeline when the reference image is dimensionally displayed.
  6. The method of claim 1, wherein the step (b) includes
    (b1) partitioning the reference image into a plurality of sub-reference images; and
    (b2) reflecting a parallax generated by both eyes in each of the plurality of the sub-reference images to warp the reference image.
  7. The method of claim 6, wherein the step (b1) further includes variably partitioning into the plurality of the sub-reference images through a combination of a plurality of sub-windows, the combination thereof completing the reference window.
  8. The method of claim 6, wherein the step (c) includes
    (c1) associating a second eyeline with an eye ray used in a ray tracing to perform a ray tracing on at least one sub-reference image, the at least one sub-reference image including an empty region corresponding to an anti-dimensional region among the plurality of the warped sub-reference images.
  9. The method of claim 8, wherein the step (c1) further includes performing the ray tracing on the plurality of the warped sub-reference images in parallel.
  10. The method of claim 1, wherein the step (b) includes
    (b1) classifying a plurality of first sub-reference images with an anti-dimensional region and a plurality of second sub-reference images with a dimensional region, before or after partitioning the reference images into a plurality of sub-reference images; and
    (b2) reflecting a parallax generated by both eyes in each of the plurality of the second sub-reference images to warp the plurality of the second sub-reference images.
  11. The method of claim 10, the step (c) includes
    (c1) associating a second eyeline with an eye ray used in a ray tracing to perform a ray tracing on the plurality of the first sub-reference images.
  12. The method of claim 11, the step (c) further includes
    (c2) combining the plurality of the first sub-reference images generated through the ray tracing and the plurality of the warped second sub-reference images to generate the virtual image.
  13. A head mounted display comprising:
    a viewpoint obtaining unit configured to obtain a viewpoint that is generated by both eyes and includes a first and a second eyelines and a parallax;
    a reference generation unit configured to generate a reference image associated with a first eye through a ray tracing;
    a temporary image generation unit configured to generate a temporary image associated with a second eye, the temporary image being generated by transforming the reference image; and
    a ray tracing unit configured to perform a ray tracing on a below threshold pixel among pixels in the temporary image.
PCT/KR2015/009162 2015-08-31 2015-08-31 Method of generating virtual reality on head mounted display and head mounted display performing the same WO2017039026A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/KR2015/009162 WO2017039026A1 (en) 2015-08-31 2015-08-31 Method of generating virtual reality on head mounted display and head mounted display performing the same
KR1020187008833A KR102101217B1 (en) 2015-08-31 2015-08-31 Method for creating virtual reality of head mounted display and head mounted display performing the same


Publications (1)

Publication Number Publication Date
WO2017039026A1 (en) 2017-03-09

Family

ID=58187702

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/009162 WO2017039026A1 (en) 2015-08-31 2015-08-31 Method of generating virtual reality on head mounted display and head mounted display performing the same

Country Status (2)

Country Link
KR (1) KR102101217B1 (en)
WO (1) WO2017039026A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020061131A1 (en) * 2000-10-18 2002-05-23 Sawhney Harpreet Singh Method and apparatus for synthesizing new video and/or still imagery from a collection of real video and/or still imagery
US20090128552A1 (en) * 2007-11-07 2009-05-21 Canon Kabushiki Kaisha Image processing apparatus for combining real object and virtual object and processing method therefor
US20110050853A1 (en) * 2008-01-29 2011-03-03 Thomson Licensing Llc Method and system for converting 2d image data to stereoscopic image data
KR20140090968A (en) * 2013-01-09 2014-07-18 엘지전자 주식회사 Head Mounted Display and controlling method for eye-gaze calibration
US20150061975A1 (en) * 2013-09-03 2015-03-05 Seiko Epson Corporation Virtual image display apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100453225B1 (en) 2001-12-26 2004-10-15 한국전자통신연구원 Client system for embodying 3-dimension virtual reality and method for embodying virtual reality using same
KR101334187B1 (en) * 2011-07-25 2013-12-02 삼성전자주식회사 Apparatus and method for rendering


Also Published As

Publication number Publication date
KR20180042419A (en) 2018-04-25
KR102101217B1 (en) 2020-04-17


Legal Events

121 (EP): The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 15903087; Country of ref document: EP; Kind code of ref document: A1.
NENP: Non-entry into the national phase. Ref country code: DE.
ENP: Entry into the national phase. Ref document number: 20187008833; Country of ref document: KR; Kind code of ref document: A.
32PN (EP): Public notification in the EP bulletin as the address of the addressee cannot be established. Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12/07/2018).
122 (EP): PCT application non-entry in European phase. Ref document number: 15903087; Country of ref document: EP; Kind code of ref document: A1.