WO2015107817A1 - Image display device and image display method, image output device and image output method, and image display system - Google Patents
Image display device and image display method, image output device and image output method, and image display system
- Publication number
- WO2015107817A1 (PCT/JP2014/082900, JP 2014082900 W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- viewing angle
- display device
- image display
- output device
- Prior art date
Links
Images
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
- H04N5/7475—Constructional details of television projection apparatus
- H04N5/7491—Constructional details of television projection apparatus of head mounted projectors
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0118—Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0123—Head-up displays characterised by optical features comprising devices increasing the field of view
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0147—Head-up displays characterised by optical features comprising a device modifying the resolution of the displayed image
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/068—Adjustment of display parameters for control of viewing angle adjustment
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/14—Solving problems related to the presentation of information to be displayed
Definitions
- The technology disclosed in this specification relates to an image display device and an image display method for displaying an image provided from an image output device, to an image output device and an image output method for providing an image to the image display device, and to an image display system including the image display device and the image output device, and more particularly to an image display device and image display method, an image output device and image output method, and an image display system for displaying an image with a wide viewing angle.
- An image display device that is worn on the head or face and used for viewing images, that is, a head-mounted display, is known.
- In a head-mounted display, for example, an image display unit is arranged for each of the left and right eyes, and an enlarged virtual image of the display image is formed by a virtual image optical system so that the user can observe a realistic image.
- Head-mounted displays are very popular. If mass production progresses in the future, they may become as widespread as mobile phones, smartphones, and portable game machines, and everyone may come to own one.
- When a first-person-view (FPV) image with a wide field of view (FOV) is presented on such a display, the observer may experience virtual reality (VR) sickness.
- For example, a VR sickness reduction method has been proposed in which a sickness response process that reduces the stimulus intensity of the virtual-space video presented by the VR system is performed in response to the user's subjective report of VR sickness or a determination of the user's VR sickness based on physiological index measurement.
- An object of the technology disclosed in this specification is to provide an excellent image display device and image display method, image output device and image output method, and image display system capable of displaying or outputting an image with a wide viewing angle while suppressing VR sickness.
- The technology described in claim 1 of the present application is an image display device comprising: a display unit that displays an image; an attribute information storage unit that stores attribute information including attributes of image display on the display unit; and a communication unit that communicates with an image output device, wherein the image display device transmits the attribute information to the image output device, receives an image converted by the image output device based on the attribute information, and displays the image on the display unit.
- According to the technology described in claim 2 of the present application, the image display device according to claim 1 is configured to be used by being worn on the face or head of an observer who observes the image displayed by the display unit.
- According to the technology described in claim 3 of the present application, the display unit of the image display device according to claim 2 includes a display panel that displays an image and a virtual image optical unit that enlarges and projects the image displayed on the display panel.
- According to the technology described in claim 4 of the present application, the attribute information storage unit of the image display device according to claim 1 stores information on a first viewing angle of the image displayed by the display unit. The image display device transmits the information on the first viewing angle to the image output device, receives an image converted by the image output device based on the first viewing angle, and displays it on the display unit.
- According to the technology described in claim 5 of the present application, the image display device according to claim 4 receives an image obtained by converting the original image on the image output device side based on the difference between the second viewing angle of the original image and the first viewing angle, and displays it on the display unit.
- According to the technology described in claim 6 of the present application, when the second viewing angle is larger than the first viewing angle, the image display device according to claim 5 receives an image obtained by the image output device cutting out the region of the first viewing angle from the original image, and displays it on the display unit.
- According to the technology described in claim 7 of the present application, when the first viewing angle is equal to or larger than the second viewing angle, the image display device according to claim 5 receives the original image from the image output device and displays it on the display unit.
- According to the technology described in claim 8 of the present application, the image display device according to claim 5 displays the image received from the image output device on the display unit based on the relationship between the first viewing angle and the second viewing angle.
- According to the technology described in claim 9 of the present application, in the image display device according to claim 8, when the second viewing angle is larger than the first viewing angle, the image obtained by cutting out the region of the first viewing angle from the original image is displayed on the full screen of the display unit.
- According to the technology described in claim 10 of the present application, when the image display device according to claim 8 displays an image having the second viewing angle smaller than the first viewing angle on the display unit, the surplus area is filled with black or a wallpaper is displayed there.
- According to the technology described in claim 11 of the present application, the image display device according to claim 8 stretches an image having the second viewing angle smaller than the first viewing angle up to the first viewing angle and displays it on the display unit.
- According to the technology described in claim 12 of the present application, when the image display device according to claim 8 receives an image having the second viewing angle smaller than the first viewing angle from the image output device, it displays the image on the display unit according to a method based on an instruction from the observer, attribute information of the observer, or an instruction added to the original image.
- The technology described in claim 13 of the present application is an image display method comprising: a step of transmitting attribute information including attributes of image display to an image output device; a step of receiving an image converted based on the attribute information from the image output device; and a step of displaying the received image.
- The technology described in claim 14 of the present application is an image output device comprising: a communication unit that communicates with an image display device; an image acquisition unit that acquires an original image to be provided to the image display device; and an image processing unit that processes the original image, wherein the image output device receives attribute information including attributes of image display from the image display device, the image processing unit converts the original image based on the attribute information, and the converted image is transmitted to the image display device.
- According to the technology described in claim 15 of the present application, the image output device according to claim 14 receives information on the first viewing angle of the image displayed by the image display device, and the image processing unit converts the original image based on the difference between the second viewing angle of the original image and the first viewing angle and transmits the converted image to the image display device.
- According to the technology described in claim 16 of the present application, the image processing unit of the image output device according to claim 15 cuts out the region of the first viewing angle from the original image when the second viewing angle is larger than the first viewing angle.
- According to the technology described in claim 17 of the present application, the image processing unit of the image output device according to claim 15 does not convert the original image based on the difference in viewing angle when the first viewing angle is equal to or larger than the second viewing angle.
- According to the technology described in claim 18 of the present application, the image display device stores the attribute information in accordance with EDID (Extended Display Identification Data) or another predetermined data format, and the image output device according to claim 14 acquires the attribute information from the image display device via the communication unit in accordance with the DDC (Display Data Channel) or another predetermined protocol.
- The technology described in claim 19 of the present application is an image output method comprising: a step of acquiring an original image to be provided to an image display device; a step of receiving attribute information including attributes of image display from the image display device; a step of converting the original image based on the attribute information; and a step of transmitting the converted image to the image display device.
- The technology described in claim 20 of the present application is an image display system comprising: an image display device having attribute information including image display attributes; and an image output device that outputs, to the image display device, an image converted based on the attribute information acquired from the image display device.
- The term "system" here refers to a logical collection of a plurality of devices (or functional modules that realize specific functions), and it does not matter whether or not each device or functional module is housed in a single housing.
- According to the technology disclosed in this specification, an excellent image display device and image display method, image output device and image output method, and image display system capable of displaying an image with a wide viewing angle while suppressing VR sickness can be provided.
- FIG. 1 is a diagram schematically illustrating a configuration example of an image display system 100 to which the technology disclosed in this specification is applied.
- FIG. 2 is a diagram schematically showing the internal configuration of a device that functions as the image output device 200 in the image display system 100.
- FIG. 3 is a diagram schematically illustrating an internal configuration of a device that functions as the image display device 300 in the image display system 100.
- FIG. 4 is a diagram for explaining a process of matching the viewing angle FOV O of the image provided by the image output device 200 with the viewing angle FOV D observed by the user of the image display device 300.
- FIG. 5 is a diagram for explaining a process of matching the viewing angle FOV O of the image provided by the image output device 200 with the viewing angle FOV D observed by the user of the image display device 300.
- FIG. 6 is a diagram for explaining processing for displaying an image sent from the image output apparatus 200 on the image display apparatus 300 side.
- FIG. 7 is a diagram for explaining processing for displaying an image sent from the image output device 200 on the image display device 300 side.
- FIG. 8 is a diagram for explaining processing for displaying an image sent from the image output apparatus 200 on the image display apparatus 300 side.
- FIG. 9 is a diagram illustrating an example of an operation sequence of the image display system 100 that transmits an image from the image output apparatus 200 and displays the image on the image display apparatus 300.
- FIG. 10 is a flowchart showing the procedure of the image format conversion processing executed in the image output device 200 in SEQ903 of FIG. 9.
- FIG. 11 is a flowchart showing the procedure of the image display processing executed in the image display device 300 in SEQ905 of FIG. 9.
- FIG. 12 is a diagram illustrating another operation sequence example of the image display system 100 that transmits an image from the image output device 200 and displays the image on the image display device 300.
- FIG. 13 is a flowchart illustrating the procedure of distortion correction table switching and image conversion processing executed in the image output device 200 in SEQ1203 of FIG. 12.
- FIG. 1 schematically shows a configuration example of an image display system 100 to which the technology disclosed in this specification is applied.
- The illustrated image display system 100 includes an image output device 200 that provides an image to an image display device 300 and the image display device 300 that displays the image provided from the image output device 200, and is assumed to provide a wide-viewing-angle image to the observer of the image display device 300.
- As the image output device 200 serving as the image supply source, an information terminal 200-1 such as a personal computer, smartphone, or tablet, a media playback device 200-2 that plays back images from media such as Blu-ray (registered trademark), and a set-top box or TV tuner 200-3 are assumed.
- As the image display device 300 that is the destination of the provided image, a display device that displays a first-person-view image with its screen fixed to the observer's face or head, such as a head-mounted display or a head-up display, is assumed. However, a general display device such as a large-screen display may also be included in the image display device 300.
- The image output device 200 and the image display device 300 are interconnected by, for example, cables 400-1, 400-2, and 400-3 conforming to interface standards such as DVI (Digital Visual Interface), HDMI (registered trademark) (High-Definition Multimedia Interface), and DisplayPort.
- Instead of a wired cable, the image output device 200 and the image display device 300 may be interconnected by wireless communication such as Wi-Fi (registered trademark), Bluetooth (registered trademark) communication, or Bluetooth (registered trademark) Low Energy (BLE) communication.
- the image output device 200 and the image display device 300 may be interconnected using both wired and wireless.
- Uncompressed image data is transmitted from the image playback device 200 to the image display device 300. Alternatively, image data may be transmitted from the image playback device 200 to the image display device 300 in a compressed format according to an algorithm such as H.264, VC-1, MPEG-2 (Moving Picture Experts Group 2), or JPEG (Joint Photographic Experts Group).
- FIG. 2 schematically shows an internal configuration of a device that functions as the image output device 200 in the image display system 100.
- the image output device 200 is a supplier of images to the image display device 300.
- As the image output device 200, an information terminal 200-1 such as a personal computer, smartphone, or tablet, a media playback device 200-2 that plays back images from media such as Blu-ray (registered trademark), or a set-top box or TV tuner 200-3 is assumed.
- the image output apparatus 200 shown in FIG. 2 includes a control unit 201, an attribute information storage unit 202, an image acquisition unit 203, an image processing unit 204, and a communication unit 205.
- the control unit 201 includes, for example, a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and comprehensively controls operations in the image output apparatus 200.
- the attribute information storage unit 202 is configured by a nonvolatile memory such as an EEPROM (Electrically Erasable Programmable ROM), and stores attribute information.
- the attribute information includes image resolution and frame rate handled by the image output device 200, manufacturer identification information (VendorID), product identification information (ProductID), and the like.
- Although EDID (Extended Display Identification Data) can be used as the file format for storing such attribute information, the format is not particularly limited to this. In the present embodiment, it is assumed that the EDID is extended to store the attribute information listed below.
- the attribute information related to image display includes the viewing angle (FOV O ) of the original image sent from the image output apparatus 200 to the image display apparatus 300.
- (1) Attribute information related to image display
- Resolution
- Image shape and aspect ratio
- Frame rate (frequency)
- Viewing angle (both eyes, per eye, including overlap)
- Distortion information
- Interocular distance and visual acuity
- Color gamut, luminance, gamma
- Whether to stretch the image when the field of view of the original image differs from the field of view observed on the image display device 300
- Information about text display: display position, area, font, size, color, ...
- Delay time from signal input to display (not limited to video)
- (2) Attribute information related to audio output
- Headphones or speakers
- Number of headphones or speakers
- the image acquisition unit 203 acquires image data to be provided to the image display device 300.
- When the image output device 200 is an information terminal such as a personal computer, smartphone, or tablet, the image acquisition unit 203 receives image content, for example by streaming, from a content server on the Internet. When the image output device 200 is a media playback device, the image acquisition unit 203 reads image data from media such as Blu-ray (registered trademark). When the image output device 200 is a set-top box or TV tuner, the image acquisition unit 203 selects and receives broadcast content.
- the viewing angle of the original image acquired by the image acquisition unit 203 for provision to the image display device 300 is hereinafter referred to as FOV O.
- The image processing unit 204 performs processing for converting the image data acquired by the image acquisition unit 203 into a format suitable for display on the image display device 300. The conversion processing here includes adjusting the viewing angle, the details of which will be described later.
- the communication unit 205 transmits the image data processed by the image processing unit 204 to the image display device 300 via the cable 400.
- the configuration of the communication unit 205 is arbitrary.
- the communication unit 205 can be configured according to a communication protocol applied to communication with the image display apparatus 300 that is a communication partner.
- it is assumed that the communication unit 205 is configured in accordance with interface standards such as DVI, HDMI (registered trademark), and display port.
- FIG. 3 schematically shows an internal configuration of a device that functions as the image display device 300 in the image display system 100.
- The image display device 300 displays the image provided from the image output device 200. It is assumed to be a display device whose screen is fixed to the observer's face or head, such as a head-mounted display or a head-up display, and which displays a first-person-view image.
- the control unit 301 includes a ROM (Read Only Memory) 301A and a RAM (Random Access Memory) 301B.
- the ROM 301A stores program codes executed by the control unit 301 and various data.
- the control unit 301 executes the program loaded to the RAM 301B, thereby starting the image display control and comprehensively controlling the operation of the entire image display apparatus 300.
- Examples of the programs and data stored in the ROM 301A include an image display control program for, for example, reproducing moving image content, and a communication control program for communicating with an external device, such as the image output device 200 serving as the image provider, according to a predetermined communication protocol.
- The input operation unit 302 includes one or more operation elements on which the user performs input operations, such as keys, buttons, and switches (none of which are shown); it accepts user instructions via these operation elements and outputs them to the control unit 301. The input operation unit 302 also accepts user instructions in the form of remote control commands received by the remote control reception unit 303 and outputs them to the control unit 301.
- the status information acquisition unit 304 is a functional module that acquires status information of the image display device 300 main body or a user (an observer of the display image) wearing the image display device 300.
- The state information acquisition unit 304 may itself be equipped with various sensors for detecting the state information, or it may acquire the state information via the communication unit 305 (described later) from an external device equipped with some or all of these sensors, such as a smartphone, wristwatch, or other multifunction terminal worn by the user. In addition, the user may directly designate or input such information.
- In order to track the user's head movement, the state information acquisition unit 304 acquires, for example, information on the position and posture of the user's head. To this end, the state information acquisition unit 304 includes, for example, a sensor capable of detecting a total of nine axes: a three-axis gyro sensor, a three-axis acceleration sensor, and a three-axis geomagnetic sensor.
- the state information acquisition unit 304 may use any one or more sensors such as a GPS (Global Positioning System) sensor, a Doppler sensor, an infrared sensor, and a radio wave intensity sensor.
- Furthermore, in order to acquire position and posture information, the state information acquisition unit 304 may additionally use, in combination, information provided from various infrastructures, such as mobile phone base station information and PlaceEngine (registered trademark) information (radio field strength measurement information from wireless LAN access points).
- The state information acquisition unit 304 for head-movement tracking is assumed to be built into a head-mounted display serving as the image display device 300, but it may instead be configured as an accessory part externally attached to the head-mounted display. In the latter case, the externally attached state information acquisition unit 304 expresses the head posture information in the form of, for example, a rotation matrix, and sends it to the head-mounted display via wireless communication such as Bluetooth (registered trademark) communication or via a high-speed wired interface such as USB (Universal Serial Bus).
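- A minimal sketch of how such an externally attached state acquisition unit might serialize the head posture as a rotation matrix and send it to the head-mounted display is shown below. Python with NumPy, a length-prefixed message layout, and a generic socket standing in for a concrete Bluetooth (registered trademark) or USB transport are all illustrative assumptions, not part of the disclosed configuration.

```python
# Hypothetical sketch: sending a 3x3 head-posture rotation matrix to the
# head-mounted display. The message layout (big-endian length prefix followed
# by nine float32 values) is an assumption for illustration only.
import socket
import struct
import numpy as np

def send_head_pose(sock: socket.socket, rotation: np.ndarray) -> None:
    """rotation: 3x3 rotation matrix describing the head posture."""
    assert rotation.shape == (3, 3)
    payload = struct.pack(">9f", *rotation.astype(np.float32).flatten())
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed")
        buf += chunk
    return buf

def recv_head_pose(sock: socket.socket) -> np.ndarray:
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    data = _recv_exact(sock, length)
    return np.array(struct.unpack(">9f", data), dtype=np.float32).reshape(3, 3)
```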
- In addition, the state information acquisition unit 304 acquires, as state information of the user wearing the image display device 300, for example, the user's working state (whether the image display device 300 is worn), the user's action state (moving state such as standing still, walking, or running, hand and fingertip gestures, eyelid open/closed state, gaze direction, pupil size), mental state (degree of excitement, arousal, emotion, or affect, such as whether the user is absorbed in or concentrating on observing the display image), visual acuity (diopter), and physiological state.
- To acquire this state information from the user, the state information acquisition unit 304 may include a wearing sensor such as a mechanical switch, an inner camera that photographs the user's face, a gyro sensor, an acceleration sensor, a speed sensor, a pressure sensor, a temperature sensor that detects body temperature or ambient temperature, a sweat sensor, a pulse sensor, a myoelectric sensor, an electro-oculogram sensor, an electroencephalogram sensor, a breath sensor, a gas/ion concentration sensor, and various timers (none of which are shown).
- the communication unit 305 performs communication processing with an external device, modulation / demodulation of communication signals, and encoding / decoding processing.
- An example of the external device is an image output device 200 that supplies an image.
- the control unit 301 transmits transmission data to the external device from the communication unit 305.
- the configuration of the communication unit 305 is arbitrary.
- the communication unit 305 can be configured according to a communication protocol applied to communication with the image output apparatus 200 that is a communication partner. In the present embodiment, it is assumed that the communication unit 305 is configured in accordance with an interface standard such as DVI, HDMI (registered trademark), or Display port.
- The attribute information storage unit 306 is configured by a nonvolatile memory such as an EEPROM (Electrically Erasable Programmable ROM) and stores attribute information. The attribute information includes the resolution and frame rate at which the image display device 300 displays an image, manufacturer identification information (VendorID), product identification information (ProductID), and the like. Although EDID can be used as the file format for storing such attribute information, the format is not particularly limited to this.
- In the present embodiment, it is assumed that the EDID is extended so that attribute information related to image display, attribute information related to audio output, attribute information about the user of the image display device 300, and attribute information related to the sensors included in the image display device 300, as listed below, are stored in the attribute information storage unit 306. The attribute information related to image display also includes the viewing angle (FOV D) of the image displayed by the image display device 300.
- (1) Attribute information related to image display
- Resolution
- Image shape and aspect ratio
- Frame rate (frequency)
- Viewing angle (both eyes, per eye, including overlap)
- Distortion information
- Interocular distance and visual acuity
- Color gamut, luminance, gamma
- Whether to stretch the image when the field of view of the original image differs from the field of view observed on the image display device 300
- Information about text display: display position, area, font, size, color, ...
- Delay time from signal input to display (not limited to video)
- (2) Attribute information related to audio output
- Headphones or speakers
- Number of headphones or speakers
- (3) Attribute information about the user
- Interocular distance and visual acuity
- Hearing ability, ear shape
- Height, weight, body build, gait
- User identification information (encrypted authentication password)
- Susceptibility to VR sickness
- Age (for example, no stereoscopic output for children)
- (4) Attribute information about the sensors
- Microphone, inner camera, outer camera, motion sensor, eye tracking
- Presence and number of each sensor
- Information such as position, orientation, sampling rate, and accuracy
- Camera angle of view, frame rate, brightness, number of pixels, distortion, color, gamma, ...
- Position information of parts other than the head (hands, feet, waist, gun controller, remote control, ...)
- Any method, whether markers or image recognition, may be used
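- The disclosure does not define a concrete byte layout for this extended attribute information, so the following is only a hypothetical sketch of how a display-side attribute block containing the viewing angle FOV D might be packed and unpacked; the class name, field set, and format string are assumptions for illustration.

```python
# Hypothetical sketch: packing display attributes (including FOV D) into a
# vendor-specific extension block, in the spirit of an extended EDID.
import struct
from dataclasses import dataclass

@dataclass
class DisplayAttributes:
    width_px: int          # horizontal resolution of the display panel
    height_px: int         # vertical resolution of the display panel
    frame_rate_hz: int     # refresh rate (frequency)
    fov_deg: float         # viewing angle FOV D of the enlarged virtual image
    stretch_allowed: bool  # whether a smaller image may be stretched to FOV D

    _FMT = ">HHHfB"        # big-endian: resolution, frame rate, FOV, flag

    def pack(self) -> bytes:
        return struct.pack(self._FMT, self.width_px, self.height_px,
                           self.frame_rate_hz, self.fov_deg,
                           int(self.stretch_allowed))

    @classmethod
    def unpack(cls, blob: bytes) -> "DisplayAttributes":
        w, h, fr, fov, stretch = struct.unpack(cls._FMT, blob)
        return cls(w, h, fr, fov, bool(stretch))

# The image display device would store attrs.pack() in its attribute
# information storage unit; the image output device would read the block
# (for example over DDC) and call unpack() to obtain FOV D.
attrs = DisplayAttributes(1920, 1080, 60, 45.0, True)
assert DisplayAttributes.unpack(attrs.pack()) == attrs
```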
- the image processing unit 307 further performs signal processing such as image quality correction on the image signal output from the control unit 301 and converts the image signal to a resolution that matches the screen of the display unit 309.
- the display driving unit 308 sequentially selects the pixels of the display unit 309 for each row and performs line sequential scanning, and supplies a pixel signal based on the image signal subjected to signal processing.
- The display unit 309 includes a display panel (not shown) configured by, for example, a micro display such as an organic EL (Electro-Luminescence) element or a liquid crystal display, or a laser-scanning display such as a retinal direct-drawing display.
- The virtual image optical unit 310 enlarges and projects the display image of the display unit 309 and causes the user to observe it as an enlarged virtual image having a suitable angle of view. The virtual image optical unit 310 includes, for example, a wide-field optical system and forms an enlarged virtual image having a 45-degree angle of view on the user's pupil (see, for example, Patent Document 1), reproducing a sense of realism comparable to viewing from the best seat in a movie theater.
- the viewing angle of the enlarged virtual image observed by the user is assumed to be FOV D.
- The sound processing unit 311 further performs signal processing such as sound quality correction and sound amplification on the sound signal output from the control unit 301, and also processes an input sound signal. The sound input/output unit 312 outputs the processed sound to the outside and takes in sound from a microphone (not shown).
- The outer camera 313 is arranged, for example, at approximately the center of the front surface of the head-mounted display main body (not shown) and can photograph the surroundings. More preferably, the outer camera 313 is composed of a plurality of cameras so that three-dimensional information of the surrounding image can be acquired using parallax information. Even with a single camera, it is possible to photograph while moving the camera, using SLAM (Simultaneous Localization and Mapping) image recognition, to calculate parallax information from a plurality of frame images that succeed one another in time (see, for example, Patent Document 2), and to acquire three-dimensional information of the surrounding image from the calculated parallax information.
- The following methods can be exemplified for acquiring the visual acuity (diopter) information in the state information acquisition unit 304.
- (1) The user inputs the information manually.
- (2) The user inputs information (such as a model number) about a vision correction lens prepared in advance, and the information is collated with a separately prepared database.
- (3) Instead of the user manually inputting information such as the model number of the vision correction lens, the state information acquisition unit 304 reads the information written on the lens side. The information can be written and read by mechanical (shape or number of notches, etc.), electrical, or optical (barcode pattern, etc.) methods.
- (4) Calibration is performed with the lens mounted to estimate the diopter and create a distortion correction table. A known calibration pattern such as a square grid is displayed on the display panel and photographed by a camera placed at the user's eyeball position to obtain the distortion pattern, and an appropriate distortion correction table is created by calculating its inverse mapping (see the sketch below).
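- As an illustration of method (4), the sketch below estimates a distortion correction table from corresponding grid points. It assumes that OpenCV and SciPy are available, that the displayed grid points and their observed positions have already been matched, and that the observed points are expressed in panel coordinates; the function names and the interpolation choice are assumptions, not details of the disclosure.

```python
# A minimal sketch: build a remap-style distortion correction table from a
# square-grid calibration pattern photographed through the optics.
import numpy as np
import cv2
from scipy.interpolate import griddata

def build_correction_table(ideal_pts, observed_pts, panel_w, panel_h):
    """ideal_pts:    Nx2 panel coordinates where grid points were displayed.
       observed_pts: Nx2 panel coordinates where those points appeared when
                     viewed through the lens (and any vision-correction lens).
       Returns (map_x, map_y) in the cv2.remap convention: for every pixel of
       the pre-distorted image to be displayed, the pixel of the original
       image to sample, so that the displayed result cancels the distortion."""
    grid_y, grid_x = np.mgrid[0:panel_h, 0:panel_w]
    # Interpolate the measured ideal -> observed correspondence over the panel.
    map_x = griddata(ideal_pts, observed_pts[:, 0], (grid_x, grid_y),
                     method="cubic", fill_value=-1).astype(np.float32)
    map_y = griddata(ideal_pts, observed_pts[:, 1], (grid_x, grid_y),
                     method="cubic", fill_value=-1).astype(np.float32)
    return map_x, map_y

def apply_correction(image, map_x, map_y):
    # Pre-distort the image so that it appears undistorted through the optics.
    return cv2.remap(image, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```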
- A lens attachment may also be used to correct the user's diopter. In this case, the state information acquisition unit 304 may automatically identify the presence and type of the lens attachment by mechanical, electrical, optical, or other means, acquire information on the user's diopter, and select an appropriate distortion correction table according to the diopter. The same effect can be obtained when the user sets the diopter information himself or herself.
- A mismatch between the viewing angle of the image provided by the image output device 200 and the viewing angle at which the user of the image display device 300 observes the image is considered to be a main cause of VR sickness. If the image is displayed while such a mismatch remains, distortion remains and VR sickness is likely to occur.
- Therefore, in the present embodiment, attribute information related to image display is exchanged between the image output device 200 and the image display device 300, and the image output device 200 outputs an image converted into a format suitable for display on the image display device 300 side, thereby preventing VR sickness. The image conversion processing is performed by the image processing unit 204 in the image output device 200.
- the image format conversion includes, for example, processing for matching the viewing angle FOV O of the image provided by the image output device 200 with the viewing angle FOV D observed by the user of the image display device 300.
- As shown in FIG. 4, when the viewing angle FOV O of the original image provided from the image output device 200 is larger than the viewing angle FOV D of the image observed by the user wearing the image display device 300 (FOV O > FOV D), the image output device 200 cuts out a region 42 of the viewing angle FOV D from the original image 41 and sends it to the image display device 300.
- As shown in FIG. 5, when the viewing angle FOV O of the original image is equal to or smaller than the viewing angle FOV D observed by the user (FOV O ≤ FOV D), the image output device 200 sends the original image 51 to the image display device 300 as it is.
- On the image display device 300 side, the received image is displayed according to the relationship between the viewing angle FOV T of the transmitted image and the viewing angle FOV D observed by the user. When the viewing angle FOV T of the transmitted image is equal to FOV D, the image 61 sent from the image output device 200 is displayed on the display panel 309 in full screen, as shown in FIG. 6.
- When the viewing angle FOV T of the transmitted image is smaller than the viewing angle FOV D, the surplus area 72 that appears when the transmitted image 71 of the viewing angle FOV T is displayed within the range of the viewing angle FOV D may be filled with black, or a wallpaper may be displayed there, as shown in FIG. 7.
- Alternatively, as shown in FIG. 8, the transmitted image 81 of the viewing angle FOV T may be enlarged up to the viewing angle FOV D indicated by reference numeral 82 and displayed on the display panel 309.
- Which processing method is used when the transmitted image of the viewing angle FOV T is displayed within the range of the viewing angle FOV D (filling the surplus area with black or displaying a wallpaper as shown in FIG. 7, or enlarging the transmitted image of the viewing angle FOV T up to the viewing angle FOV D as shown in FIG. 8) may be designated by the user from the input operation unit 302 or the like, may be automatically selected by the image display device 300 based on the user's attribute information, or may be defined in the metadata of the original image.
- In order to perform the above image conversion, it is necessary to exchange the attribute information stored in the respective attribute information storage units 202 and 306 between the image output device 200 and the image display device 300. For example, when the DDC (Display Data Channel) protocol defined for transmitting EDID is available, the attribute information may be exchanged using this protocol. However, the exchange of attribute information is not limited to the DDC; the attribute information may be exchanged via the cable 400 using another protocol.
- FIG. 9 shows an operation sequence example of the image display system 100 in which an image is transmitted from the image output device 200 and displayed on the image display device 300.
- When the image display device 300 is connected to the image output device 200 using a DVI, HDMI (registered trademark), or DisplayPort cable 400, power is supplied to the image display device 300 via the cable 400 (SEQ901), and the image display device 300 is activated.
- the image output apparatus 200 acquires attribute information related to image display such as the viewing angle FOV D observed by the user, the screen shape of the display panel 309, and the resolution from the image display apparatus 300 (SEQ902).
- Attribute information relating to image display is stored in the attribute information storage unit 306 in the image display device 300.
- the attribute information is described in, for example, EDID format, and the image output apparatus 200 can acquire necessary attribute information according to, for example, the DDC protocol.
- the file format for describing the attribute information is not limited, and the image output apparatus 200 may acquire the attribute information using another protocol.
- The image output device 200 converts, in the image processing unit 204, the format of the image to be provided to the image display device 300, based on the attribute information acquired from the image display device 300 (SEQ903). The image output device 200 then sends the format-converted image to the image display device 300 (SEQ904).
- the image display device 300 performs processing corresponding to the format on the image received from the image output device 200 and displays the image on the display panel 309 (SEQ905).
- FIG. 10 shows, in the form of a flowchart, the procedure of the image format conversion processing executed in the image output device 200 in SEQ903 of FIG. 9.
- The image format conversion is processing for matching the viewing angle FOV O of the image provided by the image output device 200 with the viewing angle FOV D observed by the user of the image display device 300, and is executed by the image processing unit 204.
- First, the viewing angle FOV D observed by the user of the image display device 300 is acquired from the image display device 300 via the communication unit 205 (step S1001). Then, the image processing unit 204 compares the viewing angle FOV O of the image provided by the image output device 200 with the viewing angle FOV D observed by the user of the image display device 300 (step S1002).
- When the viewing angle FOV O of the original image provided by the image output device 200 is larger than the viewing angle FOV D of the image observed by the user wearing the image display device 300 (Yes in step S1002), the image processing unit 204 cuts out the region of the viewing angle FOV D from the original image (step S1003). On the other hand, when the viewing angle FOV D is equal to or larger than the viewing angle FOV O (No in step S1002), the image processing unit 204 does not process the original image. Then, the image is sent from the communication unit 205 to the image display device 300 via the cable 400 (step S1004).
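- A minimal sketch of this conversion on the image output device 200 side is shown below, assuming the original image is a NumPy array. Cropping both dimensions by the same ratio and mapping viewing angle linearly to pixels are simplifying assumptions for illustration, not details given in the disclosure.

```python
# Sketch of the FIG. 10 procedure: crop the original image to FOV D when
# FOV O > FOV D, otherwise pass the original image through unchanged.
import numpy as np

def convert_for_display(original: np.ndarray, fov_o: float, fov_d: float) -> np.ndarray:
    """fov_o: viewing angle of the original image (degrees).
       fov_d: viewing angle observed by the user of the image display device."""
    if fov_o <= fov_d:
        # Step S1002 "No": the display can show the whole image; send as is.
        return original
    # Step S1003: cut out the central region corresponding to FOV D.
    h, w = original.shape[:2]
    crop_w = int(round(w * fov_d / fov_o))
    crop_h = int(round(h * fov_d / fov_o))
    x0 = (w - crop_w) // 2
    y0 = (h - crop_h) // 2
    return original[y0:y0 + crop_h, x0:x0 + crop_w]
```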
- FIG. 12 shows another operation sequence example of the image display system 100 in which an image is sent from the image output device 200 and displayed on the image display device 300.
- When the image display device 300 is connected to the image output device 200 using a DVI, HDMI (registered trademark), or DisplayPort cable 400, power is supplied to the image display device 300 via the cable 400 (SEQ1201), and the image display device 300 is activated.
- the image display apparatus 300 transmits the information of the user's visual acuity (diopter) acquired by the state information acquisition unit 304 to the image output apparatus 200 (SEQ1202).
- the method for acquiring information on visual acuity has already been described.
- the image output device 200 switches the distortion correction table according to the visual acuity information acquired from the image display device 300, and the image processing unit 204 performs conversion processing of an image provided to the image display device 300 based on the distortion correction table. (SEQ1203).
- the image output device 200 sends the converted image to the image display device 300 (SEQ1204), and the image display device 300 displays the image received from the image output device 200 on the display panel 309 (SEQ1205).
- FIG. 13 shows, in the form of a flowchart, the procedure of distortion correction table switching and image conversion processing executed in the image output device 200 in SEQ1203 of FIG. 12.
- When the user's visual acuity information can be acquired from the image display device 300 (Yes in step S1301), the distortion correction table corresponding to that visual acuity information is read (step S1302). Then, image conversion processing is performed using the read distortion correction table (step S1303), and the converted image is sent to the image display device 300 (step S1304).
- On the other hand, when the user's visual acuity information cannot be acquired from the image display device 300 (No in step S1301), the image output device 200 sends the image to the image display device 300 without performing conversion processing according to the user's visual acuity (step S1304).
- A head-mounted display applied as the image display device 300 is usually equipped with an eyepiece optical system (such as the virtual image optical unit 310 in FIG. 3). When the user wears vision correction glasses or a lens attachment, the distortion correction table contains information for canceling the distortion of the entire combined optical system, that is, the distortion caused by the eyepiece lens of the head-mounted display together with the distortion caused by the corrective lens.
- When the user does not wear vision correction glasses, image conversion may be performed using a distortion correction table that corrects only the distortion caused by the eyepiece lens of the head-mounted display.
- Alternatively, a distortion correction table for the distortion caused by the eyepiece lens of the head-mounted display and a distortion correction table for the distortion caused by the vision correction glasses may be stored separately, an appropriate combination of distortion correction tables may be read for the image conversion processing, and the image conversion processing may be performed using a distortion correction table obtained by combining them (see the sketch below).
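- The sketch below illustrates one way two such distortion correction tables might be combined into a single table, assuming each table is expressed as a pair of remap arrays in the cv2.remap convention; the composition order is an assumption for illustration.

```python
# Combine two remap-style correction tables into one equivalent table.
import cv2

def combine_tables(map_a, map_b):
    """map_a, map_b: (map_x, map_y) float32 pairs in the cv2.remap convention.
       Returns a table C with C(p) = B(A(p)); applying C to an image is
       equivalent to remapping the image with table B and then with table A."""
    ax, ay = map_a
    bx, by = map_b
    # Evaluate table B at the coordinates produced by table A.
    cx = cv2.remap(bx, ax, ay, interpolation=cv2.INTER_LINEAR)
    cy = cv2.remap(by, ax, ay, interpolation=cv2.INTER_LINEAR)
    return cx, cy
```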
- FIG. 11 shows, in the form of a flowchart, the procedure of image display processing executed in the image display device 300 in SEQ905 in FIG. 9 (or SEQ1205 in FIG. 12).
- the illustrated process is executed by the control unit 301 or the image processing unit 307.
- the viewing angle FOV O of the image provided by the image output device 200 is acquired from the image output device 200 via the communication unit 305 (step S1101).
- information indicating what format conversion processing is applied to the output image may be acquired.
- the viewing angle FOV O of the original image provided from the image output device 200 is compared with the viewing angle FOV D of the image observed by the user wearing the image display device 300 (step S1102).
- When the viewing angle FOV O of the original image provided by the image output device 200 is larger than the viewing angle FOV D of the image observed by the user wearing the image display device 300 (Yes in step S1102), it is understood that the image output device 200 has cut out the region of the viewing angle FOV D from the original image before transmission, that is, that the viewing angle FOV T of the transmitted image is equal to the viewing angle FOV D of the image observed by the user of the image display device 300. Therefore, as illustrated in FIG. 6, the control unit 301 instructs the image processing unit 307 to display the image sent from the image output device 200 on the display panel 309 as it is (step S1103).
- On the other hand, when the viewing angle FOV O of the original image does not exceed the viewing angle FOV D (No in step S1102), the viewing angle FOV T of the transmitted image does not exceed the viewing angle FOV D observed by the user. In this case, the control unit 301 instructs the image processing unit 307 to execute the processing designated by the user, the processing automatically selected by the image display device 300, or the processing designated in the metadata of the original image, and the image processed by the image processing unit 307 is displayed on the display panel 309 (step S1104).
- In step S1104, the image processing unit 307 either fills with black or displays a wallpaper in the surplus area that appears when the transmitted image of the viewing angle FOV T is displayed within the range of the viewing angle FOV D, as shown in FIG. 7, or, as shown in FIG. 8, enlarges the transmitted image of the viewing angle FOV T up to the viewing angle FOV D and displays it on the display panel 309.
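- A minimal sketch of this display-side handling is shown below, assuming NumPy image arrays, a three-channel frame, and a panel-sized wallpaper; how the fill-versus-stretch mode is chosen (user instruction, the user's attribute information, or metadata of the original image) is outside the sketch.

```python
# Sketch of the FIG. 11 step S1103/S1104 handling on the display side.
import cv2
import numpy as np

def compose_for_panel(received, fov_t, fov_d, panel_w, panel_h,
                      mode="fill", wallpaper=None):
    if fov_t >= fov_d:
        # FIG. 6: the transmitted image already matches FOV D; show full screen.
        return cv2.resize(received, (panel_w, panel_h))
    if mode == "stretch":
        # FIG. 8: enlarge the FOV T image up to FOV D.
        return cv2.resize(received, (panel_w, panel_h))
    # FIG. 7: show the FOV T image at its angular size and fill the surplus
    # area with black (or a panel-sized wallpaper, if given).
    canvas = np.zeros((panel_h, panel_w, 3), received.dtype) if wallpaper is None \
        else wallpaper.copy()
    inner_w = int(round(panel_w * fov_t / fov_d))
    inner_h = int(round(panel_h * fov_t / fov_d))
    inner = cv2.resize(received, (inner_w, inner_h))
    x0, y0 = (panel_w - inner_w) // 2, (panel_h - inner_h) // 2
    canvas[y0:y0 + inner_h, x0:x0 + inner_w] = inner
    return canvas
```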
- As described above, according to the image display system 100 of the present embodiment, the mismatch between the viewing angle FOV O of the image provided on the image output device 200 side and the viewing angle FOV D of the image displayed on the image display device 300 is corrected, so that a first-person-view image or a wide-viewing-angle image can be displayed while suppressing VR sickness.
- In addition, the user does not need to check the viewing angle FOV D of the image display device 300 that he or she wears and set it in the image output device 200, and can always view the image sent from the image output device 200 at an appropriate wide viewing angle. As a result, the user's VR sickness can be greatly reduced.
- Furthermore, by providing, from the image output device 200 side, an image corrected so as to cancel both the distortion caused by the eyepiece lens mounted in the image display device 300 and the distortion caused by the user's vision correction glasses, it is also possible to display a first-person-view image or a wide-viewing-angle image while suppressing VR sickness.
- In this specification, the embodiment in which the image display device 300 is configured as a head-mounted display has mainly been described, but the gist of the technology disclosed in this specification is not limited to this. The technology disclosed in this specification can be applied in the same way even when the image display device 300 is any of various other display devices, such as a head-up display or a large-screen display.
- The technology disclosed in this specification can also be configured as follows.
- (1) An image display device comprising: a display unit that displays an image; an attribute information storage unit that stores attribute information including attributes of image display on the display unit; and a communication unit that communicates with an image output device, wherein the image display device transmits the attribute information to the image output device, receives an image converted by the image output device based on the attribute information, and displays the image on the display unit.
- (2) The image display device according to (1), which is used by being worn on the face or head of an observer who observes the image displayed by the display unit.
- (3) The image display device according to (2), wherein the display unit includes a display panel that displays an image and a virtual image optical unit that enlarges and projects the image displayed on the display panel.
- (4) The image display device according to (1), wherein the attribute information storage unit stores information on a first viewing angle of the image displayed by the display unit, and the image display device transmits the information on the first viewing angle to the image output device, receives an image converted by the image output device based on the first viewing angle, and displays it on the display unit.
- (5) The image display device according to (4), which receives an image obtained by converting an original image on the image output device side based on the difference between a second viewing angle of the original image and the first viewing angle, and displays it on the display unit.
- (6) The image display device according to (5), which, when the second viewing angle is larger than the first viewing angle, receives an image obtained by the image output device cutting out a region of the first viewing angle from the original image, and displays it on the display unit.
- (7) The image display device according to (5), which, when the first viewing angle is equal to or larger than the second viewing angle, receives the original image from the image output device and displays it on the display unit.
- (8) The image display device according to (5), which displays the image received from the image output device on the display unit based on the relationship between the first viewing angle and the second viewing angle.
- (9) The image display device according to (8), wherein the second viewing angle is larger than the first viewing angle, and an image obtained by cutting out a region of the first viewing angle from the original image is displayed on the full screen of the display unit.
- (10) The image display device according to (8), which, when displaying an image having the second viewing angle smaller than the first viewing angle on the display unit, fills the surplus area with black or displays a wallpaper there.
- (11) The image display device according to (8), which stretches an image having the second viewing angle smaller than the first viewing angle up to the first viewing angle and displays it on the display unit.
- (12) The image display device according to (8), which, when an image having the second viewing angle smaller than the first viewing angle is received from the image output device, displays it on the display unit according to a method based on an instruction from the observer, attribute information of the observer, or an instruction added to the original image.
- (13) An image display method comprising: a step of transmitting attribute information including attributes of image display to an image output device; a step of receiving an image converted based on the attribute information from the image output device; and a step of displaying the received image.
- (14) An image output device comprising: a communication unit that communicates with an image display device; an image acquisition unit that acquires an original image to be provided to the image display device; and an image processing unit that processes the original image, wherein the image output device receives attribute information including attributes of image display from the image display device, the image processing unit converts the original image based on the attribute information, and the converted image is transmitted to the image display device.
- (15) The image output device according to (14), which receives information on a first viewing angle of the image displayed by the image display device, wherein the image processing unit converts the original image based on the difference between a second viewing angle of the original image and the first viewing angle, and the converted image is transmitted to the image display device.
- (16) The image output device according to (15), wherein the image processing unit cuts out a region of the first viewing angle from the original image when the second viewing angle is larger than the first viewing angle.
- (17) The image output device according to (15), wherein the image processing unit does not convert the original image based on the difference in viewing angle when the first viewing angle is equal to or larger than the second viewing angle.
- (18) The image output device according to (14), wherein the image display device stores the attribute information in accordance with EDID or another predetermined data format, and the image output device acquires the attribute information from the image display device via the communication unit in accordance with DDC or another predetermined protocol.
- (19) An image output method comprising: a step of acquiring an original image to be provided to an image display device; a step of receiving attribute information including attributes of image display from the image display device; a step of converting the original image based on the attribute information; and a step of transmitting the converted image to the image display device.
- (20) An image display system comprising: an image display device having attribute information including attributes of image display; and an image output device that outputs, to the image display device, an image converted based on the attribute information acquired from the image display device.
- DESCRIPTION OF REFERENCE NUMERALS: 100... Image display system; 200... Image output device; 201... Control unit; 202... Attribute information storage unit; 203... Image acquisition unit; 204... Image processing unit; 205... Communication unit; 300... Image display device; 301... Control unit; 301A... ROM; 301B... RAM; 302... Input operation unit; 303... Remote control reception unit; 304... State information acquisition unit; 305... Communication unit; 306... Attribute information storage unit; 307... Image processing unit; 308... Display drive unit; 309... Display unit; 310... Virtual image optical unit; 311... Audio processing unit; 312... Audio input/output unit; 313... Outer camera
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Computer Graphics (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
An image display device comprising: a display unit that displays an image; an attribute information storage unit that stores attribute information including attributes of image display on the display unit; and a communication unit that communicates with an image output device, wherein the image display device transmits the attribute information to the image output device, receives an image converted by the image output device based on the attribute information, and displays the image on the display unit.
An image display method comprising: a step of transmitting attribute information including attributes of image display to an image output device; a step of receiving an image converted based on the attribute information from the image output device; and a step of displaying the received image.
An image output device comprising: a communication unit that communicates with an image display device; an image acquisition unit that acquires an original image to be provided to the image display device; and an image processing unit that processes the original image, wherein the image output device receives attribute information including attributes of image display from the image display device, the image processing unit converts the original image based on the attribute information, and the converted image is transmitted to the image display device.
An image output method comprising: a step of acquiring an original image to be provided to an image display device; a step of receiving attribute information including attributes of image display from the image display device; a step of converting the original image based on the attribute information; and a step of transmitting the converted image to the image display device.
An image display system comprising: an image display device having attribute information including attributes of image display; and an image output device that outputs, to the image display device, an image converted based on the attribute information acquired from the image display device.
(1) Attribute information related to image display
- Resolution
- Image shape and aspect ratio
- Frame rate (frequency)
- Viewing angle (both eyes, per eye, including overlap)
- Distortion information
- Interocular distance and visual acuity
- Color gamut, luminance, gamma
- Whether to stretch the image when the field of view of the original image differs from the field of view observed on the image display device 300
- Information about text display
- Display position, area, font, size, color, ...
- Delay time from signal input to display (not limited to video)
(2) Attribute information related to audio output
- Headphones or speakers
- Number of headphones or speakers
(3) Attribute information about the user
- Interocular distance and visual acuity
- Hearing ability, ear shape
- Height, weight, body build, gait
- User identification information (encrypted authentication password)
- Susceptibility to VR sickness
- Age (for example, no stereoscopic output for children)
(4) Attribute information about the sensors
- Microphone, inner camera, outer camera, motion sensor, eye tracking
- Presence and number of each sensor
- Information such as position, orientation, sampling rate, and accuracy
- Camera angle of view, frame rate, brightness, number of pixels, distortion, color, gamma, ...
- Position information of parts other than the head (hands, feet, waist, gun controller, remote control, ...)
- Any method, whether markers or image recognition, may be used
(1) The user inputs the information manually.
(2) The user inputs information (such as a model number) about a vision correction lens prepared in advance, and the information is collated with a separately prepared database.
(3) Instead of the user manually inputting information such as the model number of the vision correction lens, the state information acquisition unit 304 reads the information written on the lens side. The information can be written and read by mechanical (shape or number of notches, etc.), electrical, or optical (barcode pattern, etc.) methods.
(4) Calibration is performed with the lens mounted to estimate the diopter and create a distortion correction table. A known calibration pattern such as a square grid is displayed on the display panel and photographed by a camera placed at the user's eyeball position to obtain the distortion pattern, and an appropriate distortion correction table is created by calculating its inverse mapping.
(1)画像を表示する表示部と、
前記表示部における画像表示の属性を含む属性情報を記憶する属性情報記憶部と、
画像出力装置と通信する通信部と、
を具備し、
前記画像出力装置に前記属性情報を送信するとともに、前記画像出力装置が前記属性情報に基づいて変換した画像を受信して、前記表示部で表示する、
画像表示装置。
(2)前記表示部が表示する画像を観察する観察者の顔又は頭部に装着して用いられる、
上記(1)に記載の画像表示装置。
(3)前記表示部は、画像を表示する表示パネルと、前記表示パネルに表示した画像を拡大投影する虚像光学部を備える、
上記(2)に記載の画像表示装置。
(4)前記属性情報記憶部は、前記表示部が表示する画像が持つ第1の視野角の情報を記憶し、
前記画像出力装置に前記第1の視野角の情報を送信するとともに、前記画像出力装置が前記第1の視野角に基づいて変換した画像を受信して、前記表示部で表示する、
上記(1)に記載の画像表示装置。
(5)前記画像出力装置側で原画像の第2の視野角と前記第1の視野角との相違に基づいて原画像を変換した画像を受信して、前記表示部で表示する、
上記(4)に記載の画像表示装置。
(6)前記第2の視野角が前記第1の視野角よりも大きいときに、前記画像出力装置が前記原画像から前記第1の視野角の領域を切り出した画像を受信して、前記表示部で表示する、
上記(5)に記載の画像表示装置。
(7)前記第1の視野角が前記第2の視野角以上のときに、前記画像出力装置から前記原画像を受信して、前記表示部で表示する、
上記(5)に記載の画像表示装置。
(8)前記画像出力装置から受信した画像を、前記第1の視野角と前記第2の視野角の関係に基づいて前記表示部で表示する、
上記(5)に記載の画像表示装置。
(9)前記第2の視野角が前記第1の視野角よりも大きく、前記原画像から前記第1の視野角の領域を切り出された画像を、前記表示部で全画面表示する、
上記(8)に記載の画像表示装置。
(10)前記第1の視野角よりも小さい前記第2の視野角からなる画像を前記表示部で表示するときに、余剰の領域を黒で塗り潰し又は壁紙を表示する、
上記(8)に記載の画像表示装置。
(11)前記第1の視野角よりも小さい前記第2の視野角からなる画像を前記第1の視野角まで引き伸ばして前記表示部で表示する、
上記(8)に記載の画像表示装置。
(12)記第1の視野角よりも小さい前記第2の視野角からなる画像を前記画像出力装置から受信したときに、観察者からの指示、観察者の属性情報、又は原画像に付加された指示に基づいた方法に従って前記表示部で表示する、
上記(8)に記載の画像表示装置。
(13) An image display method including:
a step of transmitting attribute information including attributes of image display to an image output device;
a step of receiving, from the image output device, an image converted on the basis of the attribute information; and
a step of displaying the received image.
(14) An image output device including:
a communication unit that communicates with an image display device;
an image acquisition unit that acquires an original image to be provided to the image display device; and
an image processing unit that processes the original image,
wherein the image output device receives attribute information including attributes of image display from the image display device, the image processing unit converts the original image on the basis of the attribute information, and the converted image is transmitted to the image display device.
(15) The image output device according to (14) above, which receives information on a first viewing angle of an image displayed by the image display device, wherein the image processing unit converts the original image on the basis of the difference between a second viewing angle of the original image and the first viewing angle, and the converted image is transmitted to the image display device.
(16) The image output device according to (15) above, wherein the image processing unit cuts out a region of the first viewing angle from the original image when the second viewing angle is larger than the first viewing angle.
(17) The image output device according to (15) above, wherein the image processing unit does not convert the original image on the basis of the difference in viewing angle when the first viewing angle is equal to or larger than the second viewing angle.
(18) The image output device according to (14) above, wherein the image display device stores the attribute information in accordance with EDID or another predetermined data format, and the image output device acquires the attribute information from the image display device via the communication unit in accordance with DDC or another predetermined protocol.
(19) An image output method including:
a step of acquiring an original image to be provided to an image display device;
a step of receiving attribute information including attributes of image display from the image display device;
a step of converting the original image on the basis of the attribute information; and
a step of transmitting the converted image to the image display device.
(20) An image display system including:
an image display device holding attribute information including attributes of image display; and
an image output device that outputs, to the image display device, an image converted on the basis of the attribute information acquired from the image display device.
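Items (9) through (12) above describe how the display side can present an image whose second viewing angle is smaller than the display's first viewing angle: keep it at its native angular size and fill the surplus area with black or wallpaper, or stretch it up to the first viewing angle. The sketch below implements those two fit modes with plain NumPy; the function names, the nearest-neighbour resize, and the assumption of a uniform pixels-per-degree mapping are illustrative choices, not details from the publication.

```python
import numpy as np
from typing import Optional

def _resize(img: np.ndarray, w: int, h: int) -> np.ndarray:
    """Nearest-neighbour resize with plain NumPy indexing (keeps the sketch dependency-free)."""
    src_h, src_w = img.shape[:2]
    ys = (np.arange(h) * src_h // h).clip(0, src_h - 1)
    xs = (np.arange(w) * src_w // w).clip(0, src_w - 1)
    return img[ys[:, None], xs[None, :]]

def fit_to_panel(image: np.ndarray, image_fov_deg: float,
                 panel_fov_deg: float, panel_size: tuple,
                 mode: str = "letterbox",
                 wallpaper: Optional[np.ndarray] = None) -> np.ndarray:
    """Present a received image on the display panel.

    mode = "letterbox": keep the image at its native angular size and fill the
                        surplus area with black (or a wallpaper image).
    mode = "stretch":   enlarge the image up to the display's viewing angle.
    """
    panel_w, panel_h = panel_size
    canvas = np.zeros((panel_h, panel_w, 3), dtype=np.uint8)   # black background
    if wallpaper is not None:
        canvas[:] = _resize(wallpaper, panel_w, panel_h)

    # Full-screen display when the image already covers the display's viewing
    # angle, or when stretching has been requested.
    if mode == "stretch" or image_fov_deg >= panel_fov_deg:
        return _resize(image, panel_w, panel_h)

    # Letterbox: the image occupies image_fov / panel_fov of the panel extent.
    scale = image_fov_deg / panel_fov_deg
    w = max(1, int(round(panel_w * scale)))
    h = max(1, int(round(panel_h * scale)))
    patch = _resize(image, w, h)
    x0, y0 = (panel_w - w) // 2, (panel_h - h) // 2
    canvas[y0:y0 + h, x0:x0 + w] = patch
    return canvas

# usage: a 45-degree image letterboxed onto a 90-degree, 1280x720 panel
frame = np.full((360, 640, 3), 200, dtype=np.uint8)
panel = fit_to_panel(frame, image_fov_deg=45.0, panel_fov_deg=90.0, panel_size=(1280, 720))
```

Which branch to take could, as item (12) suggests, be selected from an instruction given by the observer, from the observer's attribute information, or from an instruction attached to the original image.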
200… Image output device
201… Control unit, 202… Attribute information storage unit, 203… Image acquisition unit
204… Image processing unit, 205… Communication unit
300… Image display device
301… Control unit, 301A… ROM, 301B… RAM
302… Input operation unit, 303… Remote control receiving unit
304… State information acquisition unit, 305… Communication unit, 306… Attribute information storage unit
307… Image processing unit, 308… Display drive unit
309… Display unit, 310… Virtual image optical unit
311… Audio processing unit, 312… Audio input/output unit, 313… Outward-facing camera
Claims (20)
- An image display device comprising: a display unit that displays an image; an attribute information storage unit that stores attribute information including attributes of image display on the display unit; and a communication unit that communicates with an image output device, wherein the image display device transmits the attribute information to the image output device, receives an image converted by the image output device on the basis of the attribute information, and displays the received image on the display unit.
- The image display device according to claim 1, which is used while worn on the face or head of an observer who observes the image displayed by the display unit.
- The image display device according to claim 2, wherein the display unit includes a display panel that displays an image and a virtual image optical unit that enlarges and projects the image displayed on the display panel.
- The image display device according to claim 1, wherein the attribute information storage unit stores information on a first viewing angle of the image displayed by the display unit, and the image display device transmits the information on the first viewing angle to the image output device, receives an image converted by the image output device on the basis of the first viewing angle, and displays the received image on the display unit.
- The image display device according to claim 4, which receives an image obtained by converting an original image on the image output device side on the basis of the difference between a second viewing angle of the original image and the first viewing angle, and displays the received image on the display unit.
- The image display device according to claim 5, which, when the second viewing angle is larger than the first viewing angle, receives an image in which the image output device has cut out a region of the first viewing angle from the original image, and displays the received image on the display unit.
- The image display device according to claim 5, which, when the first viewing angle is equal to or larger than the second viewing angle, receives the original image from the image output device and displays it on the display unit.
- The image display device according to claim 5, which displays the image received from the image output device on the display unit on the basis of the relationship between the first viewing angle and the second viewing angle.
- The image display device according to claim 8, which, when the second viewing angle is larger than the first viewing angle, displays in full screen on the display unit an image in which a region of the first viewing angle has been cut out from the original image.
- The image display device according to claim 8, which, when displaying on the display unit an image having the second viewing angle smaller than the first viewing angle, fills the surplus area with black or displays wallpaper there.
- The image display device according to claim 8, which stretches an image having the second viewing angle smaller than the first viewing angle up to the first viewing angle and displays it on the display unit.
- The image display device according to claim 8, which, when receiving from the image output device an image having the second viewing angle smaller than the first viewing angle, displays the image on the display unit according to a method based on an instruction from the observer, attribute information of the observer, or an instruction added to the original image.
- An image display method comprising: a step of transmitting attribute information including attributes of image display to an image output device; a step of receiving, from the image output device, an image converted on the basis of the attribute information; and a step of displaying the received image.
- An image output device comprising: a communication unit that communicates with an image display device; an image acquisition unit that acquires an original image to be provided to the image display device; and an image processing unit that processes the original image, wherein the image output device receives attribute information including attributes of image display from the image display device, the image processing unit converts the original image on the basis of the attribute information, and the converted image is transmitted to the image display device.
- The image output device according to claim 14, which receives information on a first viewing angle of an image displayed by the image display device, wherein the image processing unit converts the original image on the basis of the difference between a second viewing angle of the original image and the first viewing angle, and the converted image is transmitted to the image display device.
- The image output device according to claim 15, wherein the image processing unit cuts out a region of the first viewing angle from the original image when the second viewing angle is larger than the first viewing angle.
- The image output device according to claim 15, wherein the image processing unit does not convert the original image on the basis of the difference in viewing angle when the first viewing angle is equal to or larger than the second viewing angle.
- The image output device according to claim 14, wherein the image display device stores the attribute information in accordance with EDID (Extended Display Identification Data) or another predetermined data format, and the image output device acquires the attribute information from the image display device via the communication unit in accordance with DDC (Display Data Channel) or another predetermined protocol.
- An image output method comprising: a step of acquiring an original image to be provided to an image display device; a step of receiving attribute information including attributes of image display from the image display device; a step of converting the original image on the basis of the attribute information; and a step of transmitting the converted image to the image display device.
- An image display system comprising: an image display device holding attribute information including attributes of image display; and an image output device that outputs, to the image display device, an image converted on the basis of the attribute information acquired from the image display device.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/031,351 US10437060B2 (en) | 2014-01-20 | 2014-12-11 | Image display device and image display method, image output device and image output method, and image display system |
EP14878787.2A EP3057089A4 (en) | 2014-01-20 | 2014-12-11 | Image display device and image display method, image output device and image output method, and image display system |
KR1020167009580A KR102233223B1 (ko) | 2014-01-20 | 2014-12-11 | 화상 표시 장치 및 화상 표시 방법, 화상 출력 장치 및 화상 출력 방법과, 화상 표시 시스템 |
CA2928248A CA2928248C (en) | 2014-01-20 | 2014-12-11 | Image display device and image display method, image output device and image output method, and image display system |
CN201480058413.3A CN105684074B (zh) | 2014-01-20 | 2014-12-11 | 图像显示装置和图像显示方法、图像输出装置和图像输出方法与图像显示系统 |
JP2015557741A JPWO2015107817A1 (ja) | 2014-01-20 | 2014-12-11 | 画像表示装置及び画像表示方法、画像出力装置及び画像出力方法、並びに画像表示システム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014007472 | 2014-01-20 | ||
JP2014-007472 | 2014-01-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015107817A1 true WO2015107817A1 (ja) | 2015-07-23 |
Family
ID=53542720
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/082900 WO2015107817A1 (ja) | 2014-01-20 | 2014-12-11 | 画像表示装置及び画像表示方法、画像出力装置及び画像出力方法、並びに画像表示システム |
Country Status (7)
Country | Link |
---|---|
US (1) | US10437060B2 (ja) |
EP (1) | EP3057089A4 (ja) |
JP (1) | JPWO2015107817A1 (ja) |
KR (1) | KR102233223B1 (ja) |
CN (1) | CN105684074B (ja) |
CA (1) | CA2928248C (ja) |
WO (1) | WO2015107817A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017023705A1 (en) | 2015-07-31 | 2017-02-09 | Google Inc. | Unique reflective lenses to auto-calibrate eye tracking system |
JP2018137609A (ja) * | 2017-02-22 | 2018-08-30 | 株式会社東芝 | 部分画像処理装置 |
WO2019207728A1 (ja) * | 2018-04-26 | 2019-10-31 | 株式会社ソニー・インタラクティブエンタテインメント | 画像提示装置、画像提示方法、記録媒体及びプログラム |
JP2021532681A (ja) * | 2018-07-30 | 2021-11-25 | アップル インコーポレイテッドApple Inc. | 補助レンズを備える電子デバイスシステム |
JP2022500700A (ja) * | 2018-09-24 | 2022-01-04 | アップル インコーポレイテッドApple Inc. | 交換可能なレンズを備えたディスプレイシステム |
JPWO2022070701A1 (ja) * | 2020-09-30 | 2022-04-07 | ||
WO2022244340A1 (ja) * | 2021-05-20 | 2022-11-24 | ソニーグループ株式会社 | 送信装置、送信方法および映像表示システム |
JP2023052278A (ja) * | 2018-06-25 | 2023-04-11 | マクセル株式会社 | ヘッドマウントディスプレイ |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015190877A1 (ko) * | 2014-06-12 | 2015-12-17 | 엘지전자(주) | Hdmi를 사용하여 데이터를 송수신하기 위한 방법 및 장치 |
US11198063B2 (en) * | 2016-09-14 | 2021-12-14 | Square Enix Co., Ltd. | Video display system, video display method, and video display program |
US20180144554A1 (en) | 2016-11-18 | 2018-05-24 | Eyedaptic, LLC | Systems for augmented reality visual aids and tools |
EP3602174A4 (en) * | 2017-03-21 | 2020-04-08 | Magic Leap, Inc. | METHOD AND SYSTEM FOR WAVE GUIDE PROJECTOR WITH A WIDE FIELD OF VIEW |
US20190012841A1 (en) * | 2017-07-09 | 2019-01-10 | Eyedaptic, Inc. | Artificial intelligence enhanced system for adaptive control driven ar/vr visual aids |
GB2565628B (en) * | 2017-08-18 | 2021-09-08 | Adobe Inc | Collaborative interaction with virtual reality video |
US10803642B2 (en) | 2017-08-18 | 2020-10-13 | Adobe Inc. | Collaborative virtual reality anti-nausea and video streaming techniques |
US10613703B2 (en) | 2017-08-18 | 2020-04-07 | Adobe Inc. | Collaborative interaction with virtual reality video |
KR101951406B1 (ko) * | 2017-10-11 | 2019-02-22 | 한양대학교 산학협력단 | 가상 멀미 저감을 위한 헤드 마운티드 디스플레이 및 그 동작 방법 |
US10984508B2 (en) | 2017-10-31 | 2021-04-20 | Eyedaptic, Inc. | Demonstration devices and methods for enhancement for low vision users and systems improvements |
KR102233099B1 (ko) * | 2017-12-05 | 2021-03-29 | 한국전자통신연구원 | 기계학습에 기반한 가상 현실 콘텐츠의 사이버 멀미도 예측 모델 생성 및 정량화 조절 장치 및 방법 |
US10832483B2 (en) | 2017-12-05 | 2020-11-10 | Electronics And Telecommunications Research Institute | Apparatus and method of monitoring VR sickness prediction model for virtual reality content |
US11563885B2 (en) | 2018-03-06 | 2023-01-24 | Eyedaptic, Inc. | Adaptive system for autonomous machine learning and control in wearable augmented reality and virtual reality visual aids |
CN117631307A (zh) | 2018-05-29 | 2024-03-01 | 爱达扩视眼镜公司 | 用于低视力用户的混合式透视增强现实系统和方法 |
JP7163656B2 (ja) * | 2018-07-30 | 2022-11-01 | 株式会社リコー | 配信システム、受信クライアント端末、配信方法 |
CN110868581A (zh) | 2018-08-28 | 2020-03-06 | 华为技术有限公司 | 一种图像的显示方法、装置及系统 |
EP3856098A4 (en) | 2018-09-24 | 2022-05-25 | Eyedaptic, Inc. | IMPROVED AUTONOMOUS HANDS-FREE CONTROL OF ELECTRONIC VISION AIDS |
KR20210014819A (ko) * | 2019-07-30 | 2021-02-10 | 삼성디스플레이 주식회사 | 표시 장치 및 이를 포함하는 가상 현실 표시 시스템 |
EP4104034A4 (en) | 2020-02-10 | 2024-02-21 | Magic Leap, Inc. | POSITIONING BODY-CENTRIC CONTENT RELATIVE TO A THREE-DIMENSIONAL CONTAINER IN A MIXED REALITY ENVIRONMENT |
US20240347020A1 (en) * | 2023-04-12 | 2024-10-17 | Motorola Mobility Llc | Electronic Devices and Corresponding Methods for Delivering an Extended Display Identification (EDID) Extension Identifying a Device Type and Field of View or Pixel Density Metrics |
US20240353681A1 (en) * | 2023-04-24 | 2024-10-24 | Motorola Mobility Llc | Electronic Devices and Corresponding Methods for Delivering an Extended Display Identification (EDID) Extension Identifying Interpupillary Distance Metrics |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000339490A (ja) | 1999-05-28 | 2000-12-08 | Mitsubishi Electric Corp | Vr酔い低減方法 |
JP2001154144A (ja) * | 1999-11-30 | 2001-06-08 | Shimadzu Corp | 頭部装着型表示システム |
JP2008304268A (ja) | 2007-06-06 | 2008-12-18 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
JP2009103908A (ja) * | 2007-10-23 | 2009-05-14 | Sharp Corp | 画像表示装置および画像表示方法 |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0821975A (ja) * | 1994-07-06 | 1996-01-23 | Olympus Optical Co Ltd | 頭部装着型映像表示システム |
US6623428B2 (en) * | 2001-10-11 | 2003-09-23 | Eastman Kodak Company | Digital image sequence display system and method |
JP4186766B2 (ja) * | 2003-09-12 | 2008-11-26 | セイコーエプソン株式会社 | 眼鏡レンズの製造システム及び眼鏡レンズの製造方法 |
EP1811493B1 (en) * | 2004-11-12 | 2013-04-24 | Nikon Corporation | Video display |
JP4693552B2 (ja) * | 2005-08-30 | 2011-06-01 | キヤノン株式会社 | 表示装置、制御装置、及び制御方法 |
JP5228305B2 (ja) * | 2006-09-08 | 2013-07-03 | ソニー株式会社 | 表示装置、表示方法 |
FR2906899B1 (fr) * | 2006-10-05 | 2009-01-16 | Essilor Int | Dispositif d'affichage pour la visualisation stereoscopique. |
CN103607585A (zh) * | 2009-01-21 | 2014-02-26 | 株式会社尼康 | 图像处理装置、图像处理方法 |
JP2012141461A (ja) * | 2010-12-29 | 2012-07-26 | Sony Corp | ヘッド・マウント・ディスプレイ |
JP5919624B2 (ja) * | 2011-02-04 | 2016-05-18 | セイコーエプソン株式会社 | 頭部装着型表示装置および頭部装着型表示装置の制御方法 |
JP5901122B2 (ja) * | 2011-02-24 | 2016-04-06 | キヤノン株式会社 | 表示制御装置および表示制御方法 |
US8729216B2 (en) * | 2011-03-18 | 2014-05-20 | Prc Desoto International, Inc. | Multifunctional sulfur-containing polymers, compositions thereof and methods of use |
WO2012132289A1 (ja) * | 2011-03-25 | 2012-10-04 | パナソニック株式会社 | 表示装置 |
WO2013006739A2 (en) * | 2011-07-05 | 2013-01-10 | Zia Shlaimoun | Method and system for video messaging |
US20130147686A1 (en) * | 2011-12-12 | 2013-06-13 | John Clavin | Connecting Head Mounted Displays To External Displays And Other Communication Networks |
CN103190883B (zh) * | 2012-12-20 | 2015-06-24 | 苏州触达信息技术有限公司 | 一种头戴式显示装置和图像调节方法 |
JP2014215604A (ja) * | 2013-04-30 | 2014-11-17 | ソニー株式会社 | 画像処理装置および画像処理方法 |
GB201310367D0 (en) * | 2013-06-11 | 2013-07-24 | Sony Comp Entertainment Europe | Head-mountable apparatus and systems |
WO2014203440A1 (ja) * | 2013-06-19 | 2014-12-24 | パナソニックIpマネジメント株式会社 | 画像表示装置および画像表示方法 |
JP2015125502A (ja) * | 2013-12-25 | 2015-07-06 | ソニー株式会社 | 画像処理装置及び画像処理方法、表示装置及び表示方法、コンピューター・プログラム、並びに画像表示システム |
US9465237B2 (en) * | 2013-12-27 | 2016-10-11 | Intel Corporation | Automatic focus prescription lens eyeglasses |
GB2523740B (en) * | 2014-02-26 | 2020-10-14 | Sony Interactive Entertainment Inc | Image encoding and display |
US10139902B2 (en) * | 2015-09-16 | 2018-11-27 | Colopl, Inc. | Method and apparatus for changing a field of view without synchronization with movement of a head-mounted display |
JP6092437B1 (ja) * | 2016-02-02 | 2017-03-08 | 株式会社コロプラ | 仮想空間画像提供方法、及びそのプログラム |
JP6087453B1 (ja) * | 2016-02-04 | 2017-03-01 | 株式会社コロプラ | 仮想空間の提供方法、およびプログラム |
2014
- 2014-12-11 KR KR1020167009580A patent/KR102233223B1/ko active IP Right Grant
- 2014-12-11 CN CN201480058413.3A patent/CN105684074B/zh active Active
- 2014-12-11 CA CA2928248A patent/CA2928248C/en active Active
- 2014-12-11 EP EP14878787.2A patent/EP3057089A4/en not_active Withdrawn
- 2014-12-11 JP JP2015557741A patent/JPWO2015107817A1/ja active Pending
- 2014-12-11 WO PCT/JP2014/082900 patent/WO2015107817A1/ja active Application Filing
- 2014-12-11 US US15/031,351 patent/US10437060B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000339490A (ja) | 1999-05-28 | 2000-12-08 | Mitsubishi Electric Corp | Vr酔い低減方法 |
JP2001154144A (ja) * | 1999-11-30 | 2001-06-08 | Shimadzu Corp | 頭部装着型表示システム |
JP2008304268A (ja) | 2007-06-06 | 2008-12-18 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
JP2009103908A (ja) * | 2007-10-23 | 2009-05-14 | Sharp Corp | 画像表示装置および画像表示方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3057089A4 |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3329671A4 (en) * | 2015-07-31 | 2019-03-20 | Google LLC | UNIQUE REFLECTIVE LENS FOR SELF-CALIBRATING AN EYES TRACKING SYSTEM |
CN112181140A (zh) * | 2015-07-31 | 2021-01-05 | 谷歌有限责任公司 | 自动校准穿戴式眼睛跟踪系统的独特反射镜片 |
WO2017023705A1 (en) | 2015-07-31 | 2017-02-09 | Google Inc. | Unique reflective lenses to auto-calibrate eye tracking system |
JP2018137609A (ja) * | 2017-02-22 | 2018-08-30 | 株式会社東芝 | 部分画像処理装置 |
US11599192B2 (en) | 2018-04-26 | 2023-03-07 | Sony Interactive Entertainment Inc. | Image presentation apparatus, image presentation method, recording medium, and program |
WO2019207728A1 (ja) * | 2018-04-26 | 2019-10-31 | 株式会社ソニー・インタラクティブエンタテインメント | 画像提示装置、画像提示方法、記録媒体及びプログラム |
JPWO2019207728A1 (ja) * | 2018-04-26 | 2020-12-03 | 株式会社ソニー・インタラクティブエンタテインメント | 画像提示装置、画像提示方法、記録媒体及びプログラム |
JP2023052278A (ja) * | 2018-06-25 | 2023-04-11 | マクセル株式会社 | ヘッドマウントディスプレイ |
JP7379734B2 (ja) | 2018-06-25 | 2023-11-14 | マクセル株式会社 | ヘッドマウントディスプレイ |
JP2021532681A (ja) * | 2018-07-30 | 2021-11-25 | アップル インコーポレイテッドApple Inc. | 補助レンズを備える電子デバイスシステム |
JP7494336B2 (ja) | 2018-07-30 | 2024-06-03 | アップル インコーポレイテッド | 補助レンズを備える電子デバイスシステム |
JP2022500700A (ja) * | 2018-09-24 | 2022-01-04 | アップル インコーポレイテッドApple Inc. | 交換可能なレンズを備えたディスプレイシステム |
JPWO2022070701A1 (ja) * | 2020-09-30 | 2022-04-07 | ||
WO2022070701A1 (ja) * | 2020-09-30 | 2022-04-07 | 富士フイルム株式会社 | 電子ビューファインダーおよび光学装置 |
JP7436699B2 (ja) | 2020-09-30 | 2024-02-22 | 富士フイルム株式会社 | 電子ビューファインダーおよび光学装置 |
US12055838B2 (en) | 2020-09-30 | 2024-08-06 | Fujifilm Corporation | Electronic view finder and optical apparatus |
WO2022244340A1 (ja) * | 2021-05-20 | 2022-11-24 | ソニーグループ株式会社 | 送信装置、送信方法および映像表示システム |
Also Published As
Publication number | Publication date |
---|---|
US20160246057A1 (en) | 2016-08-25 |
CA2928248A1 (en) | 2015-07-23 |
CA2928248C (en) | 2023-03-21 |
KR20160110350A (ko) | 2016-09-21 |
CN105684074B (zh) | 2020-01-07 |
CN105684074A (zh) | 2016-06-15 |
US10437060B2 (en) | 2019-10-08 |
JPWO2015107817A1 (ja) | 2017-03-23 |
EP3057089A1 (en) | 2016-08-17 |
EP3057089A4 (en) | 2017-05-17 |
KR102233223B1 (ko) | 2021-03-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015107817A1 (ja) | 画像表示装置及び画像表示方法、画像出力装置及び画像出力方法、並びに画像表示システム | |
KR102420313B1 (ko) | 정보 처리 장치 및 정보 처리 방법, 컴퓨터 프로그램, 그리고 화상 처리 시스템 | |
US10534428B2 (en) | Image processing device and image processing method, display device and display method, and image display system | |
US10692300B2 (en) | Information processing apparatus, information processing method, and image display system | |
EP2889873B1 (en) | Image display device and image display method, information communication terminal and information communication method, and image display system | |
US20150355463A1 (en) | Image display apparatus, image display method, and image display system | |
KR20210130206A (ko) | 헤드 마운트 디스플레이를 위한 이미지 디스플레이 방법 및 장치 | |
US20160306173A1 (en) | Display control device, display control method, and computer program | |
US20170070730A1 (en) | Imaging apparatus and imaging method | |
WO2016013269A1 (ja) | 画像表示装置及び画像表示方法、並びにコンピューター・プログラム | |
JP6391423B2 (ja) | 画像生成装置、画像抽出装置、画像生成方法、および画像抽出方法 | |
US20150370067A1 (en) | Devices and Systems For Real-Time Experience Sharing | |
US20210063746A1 (en) | Information processing apparatus, information processing method, and program | |
CN108989784A (zh) | 虚拟现实设备的图像显示方法、装置、设备及存储介质 | |
US20220053179A1 (en) | Information processing apparatus, information processing method, and program | |
WO2020129115A1 (ja) | 情報処理システム、情報処理方法およびコンピュータプログラム | |
US20210058611A1 (en) | Multiviewing virtual reality user interface | |
US11343567B1 (en) | Systems and methods for providing a quality metric for media content | |
WO2019056577A1 (zh) | 在vr一体机显示高清图像的方法及vr一体机 | |
US20230236442A1 (en) | Multimedia system and method for AR/VR Smart Contact Lens |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14878787; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2015557741; Country of ref document: JP; Kind code of ref document: A |
ENP | Entry into the national phase | Ref document number: 20167009580; Country of ref document: KR; Kind code of ref document: A |
ENP | Entry into the national phase | Ref document number: 2928248; Country of ref document: CA |
REEP | Request for entry into the european phase | Ref document number: 2014878787; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2014878787; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 15031351; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |