US20180184042A1 - Video display device and control method
- Publication number
- US20180184042A1 (application US 15/579,295)
- Authority
- US
- United States
- Prior art keywords
- video
- video display
- unit
- user
- movement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/01—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
- H04N7/0127—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
- H04N7/0132—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter the field or frame frequency of the incoming video signal being multiplied by a positive integer, e.g. for flicker reduction
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/34—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
- G09G3/3406—Control of illumination source
- G09G3/3413—Details of control of colour illumination sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3102—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] using two-dimensional electronic spatial light modulators
- H04N9/3111—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] using two-dimensional electronic spatial light modulators for displaying the colours sequentially, e.g. by using sequentially activated light sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3182—Colour adjustment, e.g. white balance, shading or gamut
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0118—Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2310/00—Command of the display device
- G09G2310/02—Addressing, scanning or driving the display screen or processing steps related thereto
- G09G2310/0235—Field-sequential colour display
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0242—Compensation of deficiencies in the appearance of colours
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/10—Special adaptations of display systems for operation with variable images
- G09G2320/106—Determination of movement vectors or equivalent parameters within the image
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
- G09G2340/0435—Change or adaptation of the frame rate of the video stream
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/02—Graphics controller able to handle multiple formats, e.g. input or output formats
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/16—Use of wireless transmission of display information
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Controls And Circuits For Display Device (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Liquid Crystal Display Device Control (AREA)
Abstract
Provided is a video display device wearable on the head of a user. The video display device includes a video display unit capable of switching between two or more display methods, a control unit for indicating a display method to the video display unit, a first detection unit for detecting the motion of the head of the user, a second detection unit for detecting the motion of the viewpoint of the user, and a motion determination unit for determining the motional state of the device user by using the output from the first detection unit and the output from the second detection unit. The control unit indicates a change of display method to the video display unit in accordance with the determination result of the motion determination unit.
Description
- The invention of the present application relates to a video display device which is wearable on the head of a user and displays a video before the eyes of the user, and also to a method of controlling the same.
- User-wearable video display devices are becoming lighter and smaller and are expected to become less cumbersome for device users. Video display devices of this type have the advantage of allowing a user to obtain information with both hands free, and are therefore expected to be used for a variety of purposes.
-
- PATENT LITERATURE 1: JP5,228,305B
- For example, as in the above patent literature, devices have been proposed that use a glasses-type or head-mounted mount unit and that start and stop displaying a video on a display unit placed directly before the eyes of the user.
- With a wearable video display device, it is hard for the device user to look away from the video, so it is desirable to lessen, while continuing to display the video, any discomfort that may result from instability in the video or the like. Depending on the purpose of the device, however, there are cases where it is important to keep displaying the video, and ending the display at the device's own discretion, as in Patent Literature 1, may work against the device user's interests in such a case.
- A conceivable cause of instability in a video lies in the display scheme employed by the video display device. For example, in a video display device that uses liquid crystals, the instability may be flicker on the screen caused by updating the video information on the screen, and in a video display device employing a field sequential scheme (color time-division scheme), the instability may be color breakup and the like. With a wearable video display device, the device user may perceive such video instabilities as a greater discomfort when moving. Furthermore, a display method employed to reduce instability in a video in turn increases power consumption.
- To solve the above problem, a video display device wearable on a head of a user includes a video display unit capable of switching a display method between two or more display methods, a control unit that instructs the video display unit which of the display methods to employ, a first detection unit that detects a movement of the head of the user, a second detection unit that detects a movement of a viewpoint of the user, and a motion determination unit that determines a motional state of the user of the device based on an output from the first detection unit and an output from the second detection unit. The control unit instructs the video display unit to change the display method according to a result of the determination made by the motion determination unit.
- The present invention can achieve, with low power consumption, reduction in a discomfort that the user of a wearable video display device may feel from a video.
- The other objectives, characteristics, and advantages of the present invention will become clear from the following descriptions of embodiments of the present invention based on the accompanying drawings.
-
FIG. 1A is a block diagram of a video display device 10 according to Embodiment 1.
- FIG. 1B is a diagram illustrating an example of the outer appearance of the video display device 10.
- FIG. 1C is a diagram illustrating the video display device 10 shown in FIG. 1B being worn.
- FIG. 1D is a diagram illustrating another example of the outer appearance of the video display device 10.
- FIG. 1E is a diagram illustrating the video display device 10 shown in FIG. 1D being worn.
- FIG. 1F is a diagram illustrating yet another example of the video display device 10 being worn.
- FIG. 2 is a block diagram illustrating an embodiment of a video display unit 1001 in the video display device 10.
- FIG. 3 is a circuit diagram illustrating an embodiment of a light source element 2003 in the video display unit 1001.
- FIG. 4A is a timing chart illustrating an example of a normal-speed display operation of the video display unit 1001 in the video display device 10.
- FIG. 4B is a timing chart illustrating an example of a double-speed display operation of the video display unit 1001 in the video display device 10.
- FIG. 4C is a timing chart illustrating an example of a double-speed display operation of the video display unit 1001 in the video display device 10, in which frame interpolation is performed.
- FIG. 4D is a timing chart illustrating an example of a triple-speed display operation of the video display unit 1001 in the video display device 10.
- FIG. 5 is a table illustrating determination processing performed by a motion determination processing unit 1006.
- FIG. 6 is a flowchart of determination processing performed by a control unit 1003 and the motion determination processing unit 1006.
- FIG. 7A is an example of the surroundings.
- FIG. 7B is an example of a virtual video displayed by the video display device 10.
- FIG. 7C is an example of how a virtual video is superimposed over the surroundings.
- FIG. 8A is an example of a video displayed by the video display device 10.
- FIG. 8B is an example of a displayed video the contrast of which has been lowered by a video processing unit 1008.
- FIG. 8C is an example of a displayed video the sharpness of which has been lowered by the video processing unit 1008.
- FIG. 9A is an example of a video targeted by a video determination processing unit 1009.
- FIG. 9B is another example of a video targeted by the video determination processing unit 1009.
- FIG. 10A is a timing chart illustrating an example of a light source control stop operation of the video display unit in the video display device 10.
- FIG. 10B is a timing chart illustrating an example of a ferroelectric liquid crystal update stop operation of the video display unit in the video display device 10.
- FIG. 11A is a diagram illustrating an embodiment of a second sensor.
- FIG. 11B is a diagram illustrating the embodiment of the second sensor in detail.
- FIG. 12A is an example of a video displayed for initialization of the video display device.
- FIG. 12B is a diagram illustrating a range in which to display a video for initialization of the video display device.
- FIG. 13 is a block diagram illustrating a video display device 130 according to Embodiment 2.
- FIG. 14 is a block diagram illustrating a video display device 140 according to Embodiment 2.
- FIG. 15 is a block diagram illustrating a video display device 150 according to Embodiment 2.
- FIG. 16 is a block diagram illustrating a video display device 160 according to Embodiment 3.
- FIG. 17 is a block diagram illustrating an example of how video display devices according to Embodiment 3 are controlled.
- Embodiments of the present invention are described below with reference to the drawings.
- A description is given of an embodiment of the present invention based on the accompanying drawings.
- 1. Outline of the Device
-
FIG. 1A is a block diagram illustrating a video display device which is wearable on a user and displays a video before the eyes of the user.
- A video display device 10 includes a video display unit 1001, a display control unit 1002, a control unit 1003, a first sensor 1004, a second sensor 1005, a motion determination processing unit 1006, a video information source 1007, a video processing unit 1008, a video determination processing unit 1009, a storage unit 1010, a frequency determination processing unit 1011, and a power supply unit 1012.
- The video display device 10 displays a video by transmitting video information acquired from the video information source 1007 to the video display unit 1001 via the video processing unit 1008. A representative example of the video display unit 1001 is a display using a liquid crystal element or a mirror array element.
- The video information source 1007 selects appropriate video information data stored in a storage device (not shown), performs processing such as decryption or removal of encryption on the data if necessary, and transmits the video information to the video processing unit 1008. The video information source 1007 may transmit chronological moving images chronologically, or may transmit still images successively.
- The control unit 1003 is connected to a controller 1020 outside the video display device 10 in a wired or wireless manner. By operating the controller 1020, a device user 20 can turn the video display device 10 on and off or make various video-related settings. The controller 1020 may be a controller dedicated to the video display device 10, or a smartphone in which a particular application program is installed to enable the smartphone to be used as a controller. The power supply unit 1012 is provided with a power on/off switch besides the one on the controller.
- The display control unit 1002, the control unit 1003, the motion determination processing unit 1006, the video information source 1007, the video processing unit 1008, the video determination processing unit 1009, and the frequency determination processing unit 1011 are mounted on the video display device 10 as independent pieces of hardware. Alternatively, they may be implemented by one or more arithmetic processors or microprocessors together with software or firmware. They may be implemented as functional blocks in part of an integrated circuit, or by a programmable logic device such as an FPGA (Field-Programmable Gate Array).
- The storage unit 1010 likewise does not need to be implemented as a separate component, and may be implemented as a functional block in part of the integrated circuit.
- FIG. 1B is a diagram illustrating an example of the outer appearance of the video display device 10. The main components are housed in the junction between a lens and a temple, and a video is projected onto the lens parts, which are half mirrors. The controller 1020 and the main body of the device are connected to each other with a cable.
- FIG. 1C is a diagram illustrating the video display device 10 of FIG. 1B being worn.
- FIG. 1D is a diagram illustrating another example of the outer appearance of the video display device 10. The main components are housed in the junction between a lens and a temple, and a prism serves as a projection unit (2007 in FIG. 2) of the video display unit 1001.
- FIG. 1E is a diagram illustrating the video display device 10 of FIG. 1D being worn.
- FIG. 1F is a diagram illustrating yet another example of the outer appearance of the video display device 10. A prism before an eye serves as a projection unit (2007 in FIG. 2) of the video display unit 1001, and the other components are housed separately in a helmet and in a temple.
- 2. Displaying a Video
- FIG. 2 illustrates an example configuration of the video display unit 1001. The video display unit 1001 includes a video signal processing unit 2001, a light source element power supply control unit 2002, a light source element 2003, a light source driver 2004, a modulator 2005, a modulator driver 2006, a projection unit 2007, and a settings control unit 2008.
- Video information from the video processing unit 1008 is sent to the video signal processing unit 2001, and the video signal processing unit 2001 determines, for the received video information, the intensity of the light source, the timing to drive the light source, and a modulator driving pattern.
- The information on the intensity of the light source is transmitted to the light source element power supply control unit 2002. The light source element power supply control unit 2002 controls the voltage supplied to the light source element 2003 according to the intensity information received.
- The timing to drive the light source is sent to the light source driver 2004. The light source driver 2004 controls the light source element 2003 according to the received timing.
- The light source element power supply control unit 2002 and the light source driver 2004 may be mounted on the same element.
- FIG. 3 is a diagram illustrating the configurations of the light source element power supply control unit 2002, the light source element 2003, and the light source driver 2004 in detail. The light source element 2003 is formed by LEDs of the three primary colors: a red LED 3001, a green LED 3002, and a blue LED 3003. The three LEDs are connected in series with current limiting resistances and to the light source element power supply control unit 2002. The potentials VLEDr, VLEDg, and VLEDb can be set to any values by the light source element power supply control unit 2002, and light emission by the red LED 3001, the green LED 3002, and the blue LED 3003 can thereby be controlled. The other terminals of the current limiting resistances are connected to the light source driver 2004. By changing the potentials at the terminals CTRLr, CTRLg, and CTRLb, the light source driver 2004 controls the amounts of light emission, the durations of light emission, and the like of the red LED 3001, the green LED 3002, and the blue LED 3003, respectively, and the modulator 2005 thus performs modulation for all the pixels. The display control unit 1002 sends the settings control unit 2008 of the video display unit 1001 control signals specifying the intensities of the light sources.
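- As a concrete illustration of the drive scheme of FIG. 3, the following is a minimal sketch in Python. The functions set_vled and set_ctrl are hypothetical placeholders standing in for the light source element power supply control unit 2002 and the light source driver 2004; they are not interfaces defined in this disclosure, and the print statements merely substitute for real register writes.

```python
# Minimal sketch of the per-color LED drive of FIG. 3 (illustrative only).
import time

LEDS = ("red", "green", "blue")  # red LED 3001, green LED 3002, blue LED 3003

def set_vled(color, potential):
    # Placeholder for the power supply control unit 2002 setting VLEDr/VLEDg/VLEDb.
    print(f"VLED[{color}] <- {potential} V")

def set_ctrl(color, on):
    # Placeholder for the light source driver 2004 toggling CTRLr/CTRLg/CTRLb.
    print(f"CTRL[{color}] <- {'on' if on else 'off'}")

def set_led_intensities(intensities):
    """Supply side: a higher VLED potential allows a brighter emission."""
    for color in LEDS:
        set_vled(color, intensities[color])

def emit(color, duration_s):
    """Driver side: the CTRL terminal gates the amount and duration of emission."""
    set_ctrl(color, on=True)
    time.sleep(duration_s)
    set_ctrl(color, on=False)
```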
-
- FIG. 2 is referenced again.
- The modulator driving pattern is transmitted to the modulator driver 2006. The modulator driver 2006 drives the modulator 2005 according to the modulator driving pattern.
- Examples of the modulator 2005 include a transmissive liquid crystal element, an LCOS (Liquid Crystal On Silicon), a DMD (Digital Mirror Device), and the like.
- The modulator 2005 and the modulator driver 2006 may be configured as a single element component.
- The following description assumes that the modulator 2005 is of the LCOS type.
- Light emitted from the light source element 2003 is modulated by the modulator 2005 and projected onto the projection unit 2007.
- Examples of the projection unit 2007 are a reflective object such as a mirror, a scattering object such as a screen, a prism, a half mirror, a lens, and the like. Any of these may be used in combination.
- Depending on the structure of the projection unit 2007 and on the addition of other components, the video display device 10 of the present invention may be in the opaque, goggle-like form in which the view of the device user 20 is covered, or in the transparent form in which the device user 20 can see the surroundings and recognize a video in part of their view. The following description assumes the transparent form.
- In the transparent form, the device user 20 sees a video as illustrated in FIGS. 7A, 7B, and 7C. Specifically, the device user 20 can see a video like the one in FIG. 7C, in which a virtual video 7001 like the one in FIG. 7B is superimposed over the surroundings like the one in FIG. 7A.
- The video display device 10 is in either a form in which a video is projected to both of the eyes of the device user 20 or a form in which a video is projected to only one of the eyes. Although FIG. 2 illustrates a case where a video is projected to only one of the eyes, it is possible to project a video to both of the eyes by configuring the video display device 10 to include two projection units 2007 so that light from the modulator 2005 may be incident on both the left and right eyes. Further, the video display device 10 may be configured with two video display units 1001, one for the right eye and one for the left eye, to display a three-dimensional video by projecting parallax images onto the respective video display units 1001.
- By observing the light from the projection unit 2007, the device user 20 recognizes light representing the input video information as a video.
- The settings control unit 2008 can receive a control signal and change the settings of the video signal processing unit 2001.
- The video display unit 1001 has two or more display methods for displaying a video. FIGS. 4A, 4B, 4C, and 4D illustrate display methods by way of example. For example, video information is handled as moving image information in which, on average, N still images per second are arranged in a predetermined order (N is a positive number of 1 or larger). Here, N is called a frame rate, and the number of still images per second is expressed in frames per second (fps). At present, typical frame rates are, for example, 30 fps and 60 fps. In the examples in FIG. 4, the number of times of display by the video display unit 1001 is changed with respect to the frame rate of the video information.
- FIG. 4A illustrates a first display method by way of example. In the first display method, the video display unit 1001 changes the displayed video every 1/N second with respect to the frame rate N. A displayed video is referred to as a frame.
- The following description assumes as an example that the present embodiment employs the field sequential scheme (color time-division scheme). Specifically, for each frame to display, the video display unit 1001 divides the video information into primary color components of red, green, and blue, further divides the 1/N second into three time slots, and displays the videos of the color components in the respective time slots separately.
- For example, in the first time slot of a frame 1, a setting (1R) is made such that the modulator 2005, which is an LCOS, displays the red component of the divided video information of the frame 1, causing the red LED 3001 to emit light for a predetermined period of time (a period 1R) shorter than 1/(3N) second. Next, a setting (1G) is made such that the modulator 2005 displays the green component of the frame 1, causing the green LED 3002 to emit light for a predetermined period of time (a period 1G) shorter than 1/(3N) second. Further, a setting (1B) is made such that the modulator 2005 displays the blue component of the frame 1, causing the blue LED 3003 to emit light for a predetermined period of time (a period 1B) shorter than 1/(3N) second. By displaying the color components sequentially at a high speed, the observer wearing the device sees full-color images in which the three primary color components are mixed together.
- Such display of each color component is performed similarly for the subsequent frames. Although FIGS. 4A to 4D depict only up to the frame 3, similar processing is repeatedly performed for the rest of the frames as well.
FIG. 4B illustrates a second display method by way of example. In the second display method, the period of time themodulator 2005 displays each color component in one frame is further halved compared to the first display method ofFIG. 4A , and therefore, one color component is displayed in a time slot which is one sixth of a 1/N second of the frame rate N for a displayed video. This second display method is called double-speed driving because it drives themodulator 2005 at an update speed twice as high as the first display method. - It is also possible to achieve a display method with triple-speed, quadruple-speed, or higher driving by increasing the driving speed of the
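- The time-slot structure of the first and second display methods can be sketched as follows. This is only an illustration: load_component and pulse_led are assumed callables standing in for the modulator 2005 and the light source driver 2004, and a speed_multiplier of 1 corresponds to normal-speed driving (FIG. 4A) while 2 corresponds to double-speed driving (FIG. 4B).

```python
# Sketch of field-sequential (color time-division) scheduling, assuming
# hypothetical load_component()/pulse_led() callables (not defined here).
def show_frame(frame, frame_rate_n, speed_multiplier, load_component, pulse_led):
    """Display one frame as sequential R/G/B sub-fields.

    Each sub-field occupies at most 1/(3 * N * speed_multiplier) seconds of
    the 1/N-second frame period.
    """
    slot = 1.0 / (3 * frame_rate_n * speed_multiplier)
    for _ in range(speed_multiplier):        # the R/G/B cycle repeats within one frame
        for color in ("R", "G", "B"):
            load_component(frame, color)     # e.g. the 1R/1G/1B settings on the LCOS
            pulse_led(color, slot * 0.9)     # emit for slightly less than one sub-slot
```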
modulator 2005 and thelight source element 2003. To change the driving speed of themodulator 2005 and thelight source element 2003 is to change the intervals of updating information on the pixels of the liquid crystal element or the mirror array element.FIG. 4D shows a timing chart of triple-speed driving. -
FIG. 4C illustrates a modification regarding double-speed driving. The intervals of driving themodulator 2005 and thelight source element 2003 are the same as those in the display method ofFIG. 4B . An intermediate frame 1.5 is generated from the video of theframes frame 1, the frame 1.5, and theframe 2, so that the observer sees the video as smooth images. The intermediate frames are generated by thevideo processing unit 1008. - Generation of intermediate frames is possible even when, for example, the update speed is higher than double speed, such as triple speed or quadruple speed like the case in
FIG. 4D , and is not limited to double speed. - By raising the speed from normal-speed display to double-speed display or to triple-speed display, or by generating intermediate frames, the motion of the images becomes smoother, and the
device user 20 is less likely to sense color breakup and therefore feel less uncomfortable. However, more power is consumed. - Via the
settings control unit 2008 of thevideo display unit 1001, thedisplay control unit 1002 sends a control signal commanding a switch to a different display method. - The
video processing unit 1008 is capable of making a change to video information inputted from thevideo information source 1007 and outputting the changed video information to thevideo display unit 1001. - For example, video information is H×V-pixel data containing H pixels vertically (where H is an integer of 1 or larger) and V pixels horizontally (where V is an integer of 1 or larger) per frame.
- Changing the contrast of an image involves processing to change the differences between color tones, or in particular, processing to obtain a pixel value to output to the
video display unit 1001 by multiplying a pixel value inputted from thevideo information source 1007 by a proportionality coefficient larger than 1. - Changing the brightness of an image involves processing to obtain an output pixel value by increasing a pixel value by a designated value, or in particular, processing to obtain a pixel value to output to the
video display unit 1001 by adding any value to a pixel value inputted from thevideo information source 1007. - According to a signal from the
control unit 1003, thevideo processing unit 1008 performs processing such as increasing or decreasing the contrast of an image and/or increasing or decreasing the brightness of an image. - The
video processing unit 1008 may change other image-related parameters, such as sharpness, saturation, and hue, according to a signal from the control unit. - Also, according to a signal from the
control unit 1003, thevideo processing unit 1008 may be switched to transmit video information to thevideo display unit 1001 without subjecting the video information to any of the above processing. - For example, decreasing the contrast of the video illustrated in
FIG. 8A yields the video inFIG. 8B . Meanwhile, decreasing the sharpness of the video inFIG. 8A yields the video inFIG. 8C . As will be described later, when the viewpoint of thedevice user 20 is not positioned on an image video being displayed, its contrast or sharpness may be decreased so that eye strain of thedevice user 20 can be reduced. - 3. Movement of the Device User and Control of Video
- The
first sensor 1004 is a sensor for detecting the turning of the head of thedevice user 20, and is for example a gyro sensor. Thefirst sensor 1004 outputs a three-dimensional motion vector indicating a head movement over a predetermined period of time. Thesecond sensor 1005 is a line-of-sight sensor for detecting the movement of the line of sight or the position of the point of gaze, and outputs a two-dimensional or three-dimensional vector indicating the movement of the line of sight of thedevice user 20 over a predetermined period of time. - The motion
determination processing unit 1006 determines the motional state of thedevice user 20 based on the outputs from thefirst sensor 1004 and thesecond sensor 1005. Specifically, the motional state includes three items: a movement of the head, a movement of the line of sight, and the directions of the movement of the head and the movement of the line of sight. A description will be given later as to the processing to determine these three items from the sensor outputs. Further, the motiondetermination processing unit 1006 determines whether the point of gaze is on thevirtual video 7001 being displayed by the video display device. A detailed description for this processing will be given later, as well. - Based on the determination results obtained by the motion
determination processing unit 1006, thecontrol unit 1003 determines which processing to perform in accordance withFIG. 5 and commands a display speed for thevideo display unit 1001 to thedisplay control unit 1002, and commands processing related to video parameters to thevideo processing unit 1008. -
Pattern 1 inFIG. 5 is a case where there is no movement of the head of thedevice user 20, there is no movement of the line of sight, and the point of gaze is on the virtual video. In this case, thedevice user 20 is still, and it is unlikely that the device is shaken or the positional shift occurs between the device and thedevice user 20. Also, there is no movement in the eyes. A phenomenon such as color breakup that may give a discomfort to thedevice user 20 is not likely to occur. In such a case, thecontrol unit 1003 commands processing A, since there is barely need to pay consideration to the effect on discomfort. - In processing A, the display method of the
video display unit 1001 is normal-speed driving, and thevideo processing unit 1008 passes a video from the video information source to thevideo display unit 1001 without making any change to the video by performing video processing thereon. This processing consumes the least power. -
Pattern 2 inFIG. 5 differs frompattern 1 in that the point of gaze detected by thesecond sensor 1005 is not on the virtual video. In this case, thedevice user 20 is not in motion and not viewing thevirtual video 7001, and therefore, the importance in visibility is low. Then, thecontrol unit 1003 commands processing B. - In processing B, the display method of the
video display unit 1001 is normal-speed driving, and thevideo processing unit 1008 lowers the contract and brightness of the video by a prescribed amount. -
Patterns FIG. 5 are cases where the movement of either the head or the line of sight of thedevice user 20 is detected, and the point of gaze is on the virtual video. - In these patterns, the
device user 20 is not still, and there is an increased possibility that thedevice user 20 of thevideo display device 10 feels uncomfortable, increasing the need to pay consideration to discomfort. Thus, thecontrol unit 1003 commands processing C to reduce discomfort. - In processing C, the display method of the
video display unit 1001 is set to a higher update speed, triple speed, and the video processing performed by thevideo processing unit 1008 is initialized. -
Patterns device user 20 is detected, and the point of gaze is not on the virtual video. - In these patterns, the
device user 20 is not still, and there is an increased possibility that thedevice user 20 of thevideo display device 10 feels uncomfortable. However, since thedevice user 20 is not viewing the virtual video 6001, the importance in visibility is low. Thus, thecontrol unit 1003 commands processing D. - In processing D, the display method of the
video display unit 1001 is set to a higher update speed, triple speed, and thevideo processing unit 1008 lowers the contract and brightness of the video by a prescribed amount. - In
pattern 7 inFIG. 5 , both the movement of the head and the movement of the line of sight of thedevice user 20 are detected. If the direction of the movement of the head substantially matches the direction of the movement of the line of sight, thedevice user 20 is likely making an eye movement called saccades, and viewing neither the surroundings nor thevirtual video 7001. While consideration still needs to be paid to thedevice user 20 for the discomfort that may be felt from the video, the importance in visibility of the virtual video is low. Thus, thecontrol unit 1003 commands processing D described above. -
FIG. 6 is a flowchart for implementing the pattern-based control illustrated in FIG. 5, which is performed by the control unit 1003 and the motion determination processing unit 1006.
- Processing starts in Step S010 when the power switch on the device main body is turned on, or when the device user 20 issues an instruction.
- In Step S020, the control unit 1003 performs processing for initialization and processing for energization of the first sensor 1004 and the second sensor 1005. In the initialization processing, the control unit 1003 sets the display method of the video display unit 1001 to double speed, and initializes various parameters of the video processing unit 1008 to prescribed default values.
- In Step S030, the motion determination processing unit 1006 acquires a sensor output from the first sensor 1004.
- In Step S040, the motion determination processing unit 1006 acquires a sensor output from the second sensor 1005.
- In Step S050, the motion determination processing unit 1006 determines, based on the output from the first sensor, whether or not the magnitude of the turning speed of the head is higher than or equal to a prescribed value A1 (A1 is a positive value), and handles the determination result as X. X is true ('1') when the detection result is higher than or equal to the prescribed value A1, and false ('0') when the detection result is lower than the prescribed value A1.
- In Step S060, the motion determination processing unit 1006 determines, based on the output from the second sensor, whether or not the magnitude of the motional speed of the line of sight is higher than or equal to a prescribed value S1 (S1 is a positive value), and handles the determination result as Y. Y is true ('1') when the detection result is higher than or equal to the prescribed value S1, and false ('0') when the detection result is lower than the prescribed value S1.
- In Step S070, the motion determination processing unit 1006 determines, based on the output from the second sensor, whether the two-dimensional coordinates of the position of the line of sight (the point of gaze) are on the virtual image (inside a predetermined range), and handles the determination result as Z. Z is true ('1') when the position of the point of gaze is inside the predetermined range, and false ('0') when it is outside the predetermined range.
- Refer to FIG. 5 for X, Y, and Z and their values.
- In Step S080, the motion determination processing unit 1006 performs conditional branching based on the logical OR of the determination result X and the determination result Y. Processing proceeds to Step S110 if X+Y=0, and proceeds to Step S090 if X+Y=1.
- In Step S090, the motion determination processing unit 1006 performs conditional branching based on the exclusive OR (XOR) of the determination result X and the determination result Y. Processing proceeds to Step S111 if X XOR Y=1, and proceeds to Step S100 if X XOR Y=0.
- In Step S110, if Z is true, processing proceeds to Step S120, in which the control unit 1003 commands processing A; if Z is false, processing proceeds to Step S130, in which the control unit 1003 commands processing B.
- In Step S100, the motion determination processing unit 1006 compares the direction of the motional speed of the head, outputted from the first sensor 1004, with the direction of the movement of the line of sight, outputted from the second sensor 1005, and determines whether the directions of the motion vectors substantially match each other. A method for this determination will be described later.
- In Step S111, the motion determination processing unit 1006 performs conditional branching based on the determination result Z. If Z is true, processing proceeds to Step S140, in which the control unit 1003 commands processing C; if Z is false, processing proceeds to Step S150, in which the control unit 1003 commands processing D.
- In Step S160, it is determined whether a setting has been made to repeat the processing in this flowchart continuously. This setting may be made or changed by the device user 20 or may be set by default. If the setting is enabled, the processing proceeds to a standby step S170; if it is disabled, the processing proceeds to a termination step S180.
- In Step S170, the processing stands by for a predetermined period of time (approximately 300 milliseconds to 10 seconds), and then returns to Step S030.
- In Step S180, power-off processing or an idle setting is performed for the first sensor 1004 and the second sensor 1005.
- The processing ends at the end step S190.
- The video display device 10 has the storage unit 1010. After changing the display method of the video display unit 1001 or the processing method of the video processing unit 1008, the control unit 1003 records the history, the time, and the like of the change in the storage unit 1010.
- When the number of changes to certain processing recorded in the storage unit 1010 exceeds a predetermined number within a predetermined period of time, e.g., when five changes are made in three days, the frequency determination processing unit 1011 requests the video information source 1007 to change the video information settings according to the change history. The request to change the video information is issued, for example, in the termination processing in Step S180 of FIG. 6; changing the video information means changing parameters such as image contrast, sharpness, saturation, hue, image brightness, and the like.
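- A minimal sketch of this frequency check is given below; the log format and the limit of five changes in three days are taken from the example above, and the rest is an assumption for illustration.

```python
# Sketch of the frequency determination processing unit 1011 (illustrative).
from datetime import datetime, timedelta

def settings_change_needed(change_times, limit=5, window=timedelta(days=3), now=None):
    """change_times: datetimes at which the display or processing method changed."""
    now = now or datetime.now()
    recent = [t for t in change_times if now - t <= window]
    return len(recent) >= limit   # e.g. five changes within three days
```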
- An example of the
second sensor 1005 as a line-of-sight sensor is now described. - As illustrated in
FIG. 11A , thesecond sensor 1005 includes asensor A element 1101, asensor B element 1102, and adetection controller 1103. When thesensor A element 1101 detects a movement, thedetection controller 1103 activates thesensor 2element 1102 and instructs the sensor B element to output more precise data. -
- FIG. 11B illustrates the configuration more specifically. This configuration is suitable for mounting on a frame part of a glasses-type device like the ones in FIGS. 1B and 1D. The line-of-sight sensor is capable of detecting the movement of an eye 1111 of the device user 20, and includes a first light emitter 1112 and a second light emitter 1113 that emit infrared light, a first light receiver 1114 and a second light receiver 1115, a comparator 1116, a first camera 1117 and a second camera 1118, a current control unit 1119, a movement detection processing unit 1120, and an idleness control unit 1121.
- Infrared light emitted by the first light emitter 1112 and the second light emitter 1113 is projected onto and reflected by the eye 1111 of the device user 20. The reflected infrared light is incident on the first light receiver 1114 and the second light receiver 1115. The first light receiver 1114 and the second light receiver 1115 are installed in different directions, on the left side and the right side of the eye 1111, and receive varying amounts of light depending on the position of the iris and the position of the white of the eye. Since the first light receiver 1114 and the second light receiver 1115 are placed on the left side and the right side of the eye 1111, a change in the amount of light received due to a displacement of the eye 1111 is different for each light receiver. A movement of the eye 1111 can be detected when the comparator 1116 obtains the difference between the amount of light received by the first light receiver 1114 and the amount of light received by the second light receiver 1115. This detection method is called a scleral reflection method.
- When an output from the comparator 1116 is larger than or equal to a predetermined value, the movement detection processing unit 1120 determines that a movement of the eye 1111 is detected, and outputs a movement detected signal. When an output from the comparator 1116 is smaller than the predetermined value, the movement detection processing unit 1120 determines that the eye 1111 has not moved, and outputs a movement undetected signal.
- Upon receipt of a movement detected signal from the movement detection processing unit 1120, the idleness control unit 1121 brings the first camera 1117 and the second camera 1118 into an imaging state in which they are capable of capturing video, and these cameras image the eye 1111 using the infrared light reflected by the eye 1111. An image processing unit (not shown) performs image processing on the videos captured by the first camera 1117 and the second camera 1118 to estimate a detailed movement of the line of sight and the position of the viewpoint. A dark pupil method, a corneal reflection method, or the like is used for the image processing.
- Upon receipt of a movement undetected signal from the movement detection processing unit 1120, the idleness control unit 1121 brings the first camera and the second camera into an idle state in which part of the functions of the first and second cameras is stopped to reduce power consumption.
- The movement detection processing unit 1120 changes the power supplied to the first light emitter 1112 and the second light emitter 1113 by sending a movement detected signal or a movement undetected signal to the current control unit 1119. The current control unit 1119 performs control such that the amount of current in the imaging state is larger than the amount of current in the idle state.
- Although two light emitters are used in the present embodiment by way of example, the number of light emitters is not limited to two. The line-of-sight sensor may have more light emitters. Further, the line-of-sight sensor may be so configured that the light emitters are controlled to emit light at different timings, and that the light receivers or camera elements acquire data to coincide with the light emission by the respective light emitters.
- Also, although two camera elements and two light receivers are used in the above example, their numbers are not limited to such numbers. Further, a light receiver and a camera element may be configured as a single element, and for example, part of the pixels of a camera element may be configured as a light receiver.
- 5. Viewpoint Deviation Correction and Detection of the Position of a Point of Gaze
- In the initialization step S020 illustrated in
FIG. 6 , initialization processing for thesecond sensor 1005 may be performed. When thesecond sensor 1005 is a sensor that detects the line of sight of thedevice user 20, deviation occurs between the point of gaze of thedevice user 20 and the position of the second sensor each time the device is used. - To correct this deviation, during the initialization processing the motion
determination processing unit 1006 displays avirtual video 1201 of diagonal lines in a display region as illustrated inFIG. 12A , and displays text prompting thedevice user 20 to gaze at the intersection of the diagonal lines. When thesecond sensor 1005 detects that the line of sight is steadily located at the position of the point of gaze for predetermined seconds, the motiondetermination processing unit 1006 determines that thedevice user 20 is gazing at the intersection in thevirtual video 1201, and sets the position of the point of gaze as a reference position p0 (h0, v0) for thesecond sensor 1005. - If the full video display range of the
video display unit 1001 is, like thevirtual video 1201, a square surrounded by P1 (Hmin, Vmin), P2 (Hmax, Vmin), P3 (Hmin, Vmax), and P4 (Hmax, Vmax), then h0=(Hmax−Hmin)/2, and v0=(Vmax−Vmin)/2. - In the point-of-gaze determination step S070, it is determined, using the reference position p0 (h0, v0) as the reference, whether the detection result of the
second sensor 1005 is on the video displayed on the video display unit 1001 (i.e., whether Z is ‘1’). Assume a case where the position of the point of gaze obtained by thesecond sensor 1005 is p (h, v) when, as illustrated inFIG. 12B , anyvirtual display video 1202 is displayed in a region surrounded by Q1 (Hl, Vd), Q2 (Hl, Vu), Q3 (Hr, Vd), and Q4 (Hr, Vu) (Hmin<=Hl<=Hmax, Hmin<=Hr<=Hmax, Vmin<=Vd<=Vmax, and Vmin<=Vu<=Vmax). In this case, when p (h, v) is inside the square Q1Q2Q3Q4, the motiondetermination processing unit 1006 determines that the position of the point of gaze is on the virtual video and that Z in Step S070 is true. - The reference position may be detected not in the initialization processing, but in a time designated by the
device user 20 through thecontroller 1020. Thedevice user 20 may command detection timing to thesecond sensor 1005 by purposely blinking for a particular length or in a particular order before or after gazing at a designated point. - The
virtual display video 1202 does not have to be square, but may be in other shapes such as a triangle or a circle. - 6. Directions of the Movement of the Head and the Movement of the Line of Sight
- The
movement determination unit 1006 detects, based on an motion vector output from thefirst sensor 1004 and a motion vector output from thesecond sensor 1005, whether the movement of the head and the movement of the line of sight match in direction. - Assume that an output from the
first sensor 1004 can be expressed by a three-dimensional vector A0. With thedevice user 20 being within the range to recognize thevirtual video 7001 and S denoting a virtual plane containing the four corners of thevirtual video 7001, an orthographic projection vector A of the three-dimensional vector A0 with respect to the plane S is obtained. When an output from thesecond sensor 1005 is a three-dimensional vector, similar vector transformation processing is performed. - The range in which the
device user 20 recognizes thevirtual video 7001 is determined by the optical configuration of theprojection unit 2007 in thevideo display unit 1001 and the like, and thedevice user 20 uses its focusing function of the eyeball to recognize thevirtual video 7001 at a location at a predetermined distance. - If the
second sensor 1005 outputs a two-dimensional vector B0, the two-dimensional vector B0 is transformed into a three-dimensional vector B1 on a plane T which is in three-dimensional space and contains a detection axis of thesecond sensor 1005, and an orthographic projection vector B of the three-dimensional vector Bi with respect to the plane S is obtained. When an output from thefirst sensor 1004 is a two-dimensional vector, similar vector transformation processing may be performed. - When the directions of the movements from the detection sensors are both expressed as vectors on a single plane with A being the motion vector outputted from the
first sensor 1004 and B being the motion vector outputted from thesecond sensor 1005, it is determined whether the directions of the movements substantially match, based on a comparison between the absolute value of an angle θ formed by these vectors, ANGLEθ=ANGLE(A−B), and any value α (α is a positive value). - Specifically, it is determined that the directions of the movements substantially match if θ<=α, and do not match if θ>α.
- 7. Modifications
- (1) The
first sensor 1004 and thesecond sensor 1005 may be an acceleration sensor, a geomagnetic sensor, a GPS, a camera that captures the user, a camera that captures a video of the surroundings seen from the user, a sensor that measures user's pulse, a sensor that measures user's blood flow, a watch, or the like. Further, each of thefirst sensor 1004 and thesecond sensor 1005 may include a filter, an amplifier, a level shifter, and/or the like. Also, each of thefirst sensor 1004 and thesecond sensor 1005 may include a comparator and be configured to transmit, along with the vector value, a binary result indicating whether a detection result is higher or lower than a threshold. Also, thefirst sensor 1004 and thesecond sensor 1005 may be configured to output a signal indicating that a movement is detected, when one of the following conditions is met: when the duration time of a movement, being a detection result, exceeds a predetermined period of time, when the speed of a movement exceeds a predetermined speed, and when the displacement of a movement exceeds a predetermined displacement. - (2) Instead of the determination processing that the motion
determination processing unit 1006 performs using an output from thefirst sensor 1004 or thesecond sensor 1005, the videodetermination processing unit 1009 may determine image features. - The video
determination processing unit 1009 determines whether video information can cause a display discomfort to thedevice user 20. For example, in a case of a video that moves continuously on the screen like the one illustrated inFIG. 9A , thedevice user 20 is expected to follow the moving video by moving their eyeballs. Also, for example, in a case of a video with an on-screen content, such as text, that thedevice user 20 can see its meaning by recognizing it vertically, horizontally, or diagonally like the one illustrated inFIG. 9B , thedevice user 20 is expected to move their line of sight along the video by moving their eyeballs. - For such an image, the video
determination processing unit 1009 can detect a movement in advance by performing video analysis on digital images and referring to the amount in difference data between image frames. - For videos, like the ones in
- For videos, like the ones in FIGS. 9A and 9B, for which the device user 20 is expected in advance to move their line of sight, the video determination processing unit 1009 outputs a control signal to the control unit 1003 so that the processing C may be employed.
- When the projection unit 2007 of the video display device 10 is a transparent type, the video determination processing unit 1009 may determine whether a displayed video is a video related to the surroundings, e.g., an augmented reality (AR) video. The video determination processing unit 1009 can determine the type of a video based on metadata on the video or additional information attached to the video. For example, the virtual video 7001 in FIG. 7B displays information related to the surroundings in FIG. 7A. When it is determined that a video is related to the surroundings, the visibility of the displayed image remains highly important even if the point of gaze is detected at a position outside the virtual video 7001 (Z is false). Thus, the video determination processing unit 1009 commands the control unit 1003 to employ the processing C.
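- A minimal sketch of this metadata-based determination; the metadata key and values are illustrative assumptions, and the gaze-based selection of FIG. 5 for other videos is not modeled here:

```python
def force_processing_c(video_metadata: dict) -> bool:
    """Return True when the displayed video is related to the surroundings
    (e.g. an AR overlay), in which case processing C is employed regardless
    of whether the point of gaze is on the virtual video (Z)."""
    return video_metadata.get("type") in ("AR", "surroundings")

print(force_processing_c({"type": "AR"}))     # True: keep visibility high
print(force_processing_c({"type": "movie"}))  # False: use the usual selection
```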
- (3) The display method illustrated in FIG. 10A may be employed as the display method of the video display unit 1001. In the display method illustrated in FIG. 10A, the driving speed is double-speed as in FIG. 4B, and the light-source control is stopped so that the light emission periods of the red LED 3001, the green LED 3002, and the blue LED 3003 of the light source element 2003 are substantially synchronized. Thereby, light from the three LEDs is mixed in color, and the device user 20 recognizes the video as a black and white image, so that color breakup does not occur in principle. This may be applied to the processing C or the processing D of FIG. 5, since the processing C and the processing D are employed when there is a high need to pay consideration to the discomfort that may be caused by the video.
- Further, when the images in the video information in a plurality of successive frames are substantially the same, frame update by the modulator 2005 may be stopped. For example, as in FIG. 10B, when frames 1, 2, and 3 are substantially the same images, the modulator 2005 keeps displaying the same frame 1 for the period corresponding to these frames.
- Cases where images are substantially the same include: a case where, when video information can be represented as, for example, H×V pieces of pixel information (both H and V being positive integers), the number of pixels whose information changes between successive frames is sufficiently smaller than the value H×V; and a case where, when color information on each pixel can be represented by R, G, and B primary color information (e.g., R, G, and B are all positive integers from 0 to 255), the changes in the R, G, and B values between successive frames are sufficiently small.
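- As a minimal sketch of the "substantially the same" test just described, for H×V frames of (R, G, B) tuples; the threshold values are illustrative assumptions, not figures from the specification:

```python
def frames_substantially_same(frame_a, frame_b,
                              max_changed_fraction=0.01,
                              max_channel_delta=4):
    """A pixel counts as changed when any of its R, G, B values (0-255)
    differs by more than max_channel_delta; the frames are substantially
    the same when the number of changed pixels is a sufficiently small
    fraction of the H x V total."""
    total = 0
    changed = 0
    for row_a, row_b in zip(frame_a, frame_b):
        for (r1, g1, b1), (r2, g2, b2) in zip(row_a, row_b):
            total += 1
            if (abs(r1 - r2) > max_channel_delta
                    or abs(g1 - g2) > max_channel_delta
                    or abs(b1 - b2) > max_channel_delta):
                changed += 1
    return total > 0 and changed / total <= max_changed_fraction

# Two 3x4 frames that differ by at most 1 per channel: frame update could be skipped.
print(frames_substantially_same([[(10, 10, 10)] * 4] * 3,
                                [[(11, 10, 9)] * 4] * 3))  # True
```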
- Stopping frame update in this way reduces image flicker. Further, when the modulator 2005 uses ferroelectric liquid crystals, displaying the same frame a plurality of times consumes power for erasing and re-writing the information, so stopping frame update by the modulator 2005 leads to a further reduction in power consumption. This may be applied to the processing C and the processing D in FIG. 5.
- (4) The video information source 1007 may be configured to acquire video information externally. For example, the video information source 1007 may be a receiver conforming to video transmission standards such as DVI, HDMI (registered trademark), or DisplayPort, a receiver employing a general method for electric signal transmission, such as SPI, I2C, RS232, or USB, a receiver of a wired network such as Ethernet (registered trademark), or a receiver of a wireless network such as a wireless LAN or Bluetooth (registered trademark).
- The video information source 1007 may include a decoder that receives and expands compressed information to obtain video information, or may include a function to receive and decrypt encrypted video information.
- (5) The power supply unit 1012 supplies power to the video display device 10. As a power source, the power supply unit 1012 includes at least one of a rechargeable battery that can be charged by an external power source, a power source circuit that takes a desired amount of power from a replaceable primary battery, a converter that connects to an external power source such as an electrical outlet to take a predetermined amount of power therefrom, and a power stabilization circuit. Further, the power supply unit 1012 may include, in addition to the power source, an integrated circuit for power control to control charging and power supply and to monitor the power source.
- The control unit 1003 acquires information on the level of power remaining in the power source from the power supply unit 1012, and performs control such that the video processing unit 1008 performs video processing only when the remaining power level exceeds a predetermined value.
- The control unit 1003 may also be configured to change the display method of the video display unit 1001 to shorten the display intervals only when the level of power remaining in the power supply unit 1012 exceeds a predetermined value.
- The control unit 1003 may also be configured to change the display method of the video display unit 1001 to extend the display intervals when the level of power remaining in the power supply unit 1012 falls below a predetermined value.
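- A minimal sketch of this power-aware selection of the display interval; the thresholds, the interval values, and the use of a 0-1 remaining-power ratio are hypothetical choices, not values from the specification:

```python
def choose_display_interval_ms(remaining_ratio: float,
                               normal_ms: float = 16.7,
                               short_ms: float = 8.3,
                               long_ms: float = 33.3,
                               high_threshold: float = 0.5,
                               low_threshold: float = 0.2) -> float:
    """Shorten the display interval only when plenty of power remains,
    extend it when the remaining power falls below a low threshold."""
    if remaining_ratio > high_threshold:
        return short_ms   # enough power: allow faster display updates
    if remaining_ratio < low_threshold:
        return long_ms    # low power: extend the interval to save energy
    return normal_ms

print(choose_display_interval_ms(0.8))  # 8.3
print(choose_display_interval_ms(0.1))  # 33.3
```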
- FIG. 13 is a diagram illustrating Embodiment 2. Only points different from those in FIG. 1 are described.
- A first sensor 1304 and a second sensor 1305 are provided separately from the casing of a video display device 130. Each of the sensors detects an action of the device user 20, as the sensors in Embodiment 1 do.
- In the present embodiment, the first sensor 1304 and the video display device 130 exchange information via a communication unit 1013. The communication unit 1013 and the first sensor 1304 may communicate with each other using electrical signals on a conductor physically connecting them to each other, or via wireless communication such as a wireless LAN, Bluetooth (registered trademark), or Zigbee (registered trademark). The first sensor 1304 may include a communication unit (not shown). When wireless communication is used, the first sensor 1304 may be supplied with power from a power source different from the one for the video display device 130.
- Like the first sensor 1304, the second sensor 1305 may exchange information with the communication unit 1013 using the wired or wireless communication described above. The second sensor 1305 may include a communication unit (not shown). Further, when wireless communication is used, the second sensor 1305 may be supplied with power from a power source different from the one for the video display device 130.
- The motion determination processing unit 1006 receives an output from the first sensor 1304 and an output from the second sensor 1305 via the communication unit 1013.
- FIGS. 14 and 15 illustrate modifications of the present embodiment. In the modification illustrated in FIG. 14, a first sensor 1404, a second sensor 1405, and a motion determination processing unit 1406 are provided separately from the casing of a video display device 140. The motion determination processing unit 1406 and the control unit 1003 exchange information via the communication unit 1013 using the wired or wireless communication described above. The first sensor 1404, the second sensor 1405, and the motion determination processing unit 1406 may be contained in the same casing. Further, the motion determination processing unit 1406 may include a communication unit (not shown).
- In the modification illustrated in FIG. 15, the second sensor 1305 is provided separately from a video display device 150. The second sensor 1305 and the motion determination processing unit 1006 may exchange information via the communication unit 1013 using a wired or wireless communication as described above.
- The separately-provided first sensor 1304 and second sensor 1305 do not need to be worn by the device user 20. The first sensor 1304 and the second sensor 1305 only have to detect a movement of the head of the device user 20, a movement of an eye, and the like, and may be, for example, sensors using a camera and image processing. Such a case includes a situation where the device user 20 is at a fixed location and performs certain work while viewing a video, with the first sensor 1304 and the second sensor 1305, in the form of cameras, placed on the work table.
- In a case where the device user 20 uses the video display device 130, the video display device 140, or the video display device 150 while standing or sitting at a fixed position, a detection sensor such as a pressure distribution measurement device that measures the displacement of the center of gravity may be provided under the device user 20 and used as the first sensor 1304 or the second sensor 1305.
- FIG. 16 is a diagram illustrating Embodiment 3. Only points different from those in FIG. 1 are described.
- In the present embodiment, a storage unit 1610 and a frequency determination processing unit 1611 are provided in a server 1601 separately from a video display device 160. The storage unit 1610 and the frequency determination processing unit 1611 operate in the same manners as the storage unit 1010 and the frequency determination processing unit 1011 in Embodiment 1 do.
- The storage unit 1610 and the control unit 1003 exchange information via the communication unit 1013 of the video display device 160 and a communication unit 1612 of the server 1601. The communication unit 1013 and the communication unit 1612 may communicate using electric signals on a conductor physically connecting them to each other, or via wireless communication such as a wireless LAN, Bluetooth (registered trademark), or Zigbee (registered trademark).
- Similarly, the frequency determination processing unit 1611 and the video information source 1007 may exchange information via the communication unit 1013 and the communication unit 1612 using a wired or wireless communication as described above.
- FIG. 17 illustrates a control method for a system including a plurality of video display devices 160. Each of a first video display device 1711, a second video display device 1712, a third video display device 1713, and a fourth video display device 1714 can communicate with the server 1601 via a network 1730.
- The first video display device 1711 is used by a first user 1721, the second video display device 1712 is used by a second user 1722, the third video display device 1713 is used by a third user 1723, and the fourth video display device 1714 is used by a fourth user 1724.
- The first video display device 1711, the second video display device 1712, the third video display device 1713, and the fourth video display device 1714 have the same capabilities as the video display device 160.
- The server 1601 has the storage unit 1610 and the frequency determination processing unit 1611. When the display method of the video display unit 1001 or the processing method of the video processing unit 1008 is changed, the server 1601 receives a history and a time of the change from the control unit 1003 of the corresponding one of the first video display device 1711, the second video display device 1712, the third video display device 1713, and the fourth video display device 1714 via the network 1730.
- The server 1601 extracts information common to the pieces of change information transmitted from the respective video display devices, and the frequency determination processing unit 1611 requests the video information sources 1007 to change video information when processing that is recorded in the storage unit 1610 and common to the video display devices occurs more than a predetermined number of times within a predetermined period of time. To change video information is to change parameters such as image contrast, sharpness, saturation, hue, or image brightness.
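- As a minimal server-side sketch of this rule; the per-device record format, the thresholds, and the time window below are assumptions made for illustration:

```python
from collections import Counter

def common_changes_to_request(change_logs, window_s, now_s, min_count=3):
    """change_logs maps a device ID to a list of (timestamp_s, change_name)
    records reported by that device. A change is returned (to be forwarded
    to the video information source) when it was recorded on every device
    within the last window_s seconds and, counted across all devices,
    occurred at least min_count times in that window."""
    recent_per_device = [
        {name for (t, name) in log if now_s - t <= window_s}
        for log in change_logs.values()
    ]
    common = set.intersection(*recent_per_device) if recent_per_device else set()
    counts = Counter(
        name
        for log in change_logs.values()
        for (t, name) in log
        if now_s - t <= window_s and name in common
    )
    return [name for name, n in counts.items() if n >= min_count]

logs = {
    "device1": [(10, "lower_contrast"), (40, "lower_contrast")],
    "device2": [(20, "lower_contrast"), (50, "lower_brightness")],
}
print(common_changes_to_request(logs, window_s=60, now_s=60))  # ['lower_contrast']
```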
- The number of video display devices 160 connected to the network 1730 is not limited to the number shown in the present embodiment. It suffices that at least one video display device 160 is connected.
- Although the present invention has been described using the embodiments, the present invention is not limited to those embodiments and, as is apparent to those skilled in the art, can be changed and altered variously without departing from the spirit of the present invention and the scope of the appended claims.
- Reference Signs List
- 10 video display device
- 1001 video display unit
- 1002 display control unit
- 1003 control unit
- 1004 first sensor
- 1005 second sensor
- 1006 motion determination processing unit
- 1007 video information source
- 1008 video processing unit
- 1009 video determination processing unit
- 1010 storage unit
- 1011 frequency determination processing unit
- 1012 power supply unit
- 1013 communication unit
- 1020 controller
- 20 device user
- 2001 video signal processing unit
- 2002 light source element power supply control unit
- 2003 light source element
- 2004 light source driver
- 2005 modulator
- 2006 modulator driver
- 2007 projection unit
- 2008 settings control unit
- 3001 red LED
- 3002 green LED
- 3003 blue LED
- 3004, 3005, 3006 current limiting resistance
- 7001 virtual video
- 1101 sensor A
- 1102 sensor B
- 1103 detection control unit
- 1111 eye
- 1112, 1113 light emitter
- 1114, 1115 light receiver
- 1116 comparator
- 1117, 1118 camera
- 1119 current control unit
- 1120 movement detection processing unit
- 1121 idleness control unit
- 1201 virtual video
- 1202 virtual display video
- 130 video display device
- 1304 first sensor
- 1305 second sensor
- 140 video display device
- 1404 first sensor
- 1405 second sensor
- 1406 motion determination processing unit
- 150 video display device
- 160 video display device
- 1601 server
- 1610 storage unit
- 1611 frequency determination processing unit
- 1612 communication unit
- 1711 first video display device
- 1712 second video display device
- 1713 third video display device
- 1714 fourth video display device
- 1721 first user
- 1722 second user
- 1723 third user
- 1724 fourth user
- 1730 network
Claims (15)
1. A video display device wearable on a head of a user, comprising:
a video display unit capable of switching a display method between two or more display methods;
a control unit that designates the display method for the video display unit;
a first detection unit that detects a movement of the head of the user;
a second detection unit that detects a movement of a viewpoint of the user; and
a motion determination unit that determines a motional state of the user of the device based on an output from the first detection unit and an output from the second detection unit, wherein
the control unit instructs the video display unit to change the display method according to a result of the determination made by the motion determination unit.
2. The video display device according to claim 1 , wherein
the video display unit is a display using a liquid crystal element or a mirror array element, and
the two or more display methods are different from each other in at least one of an interval for updating information on pixels of the liquid crystal element or the mirror array element and an interval for driving a light source.
3. The video display device according to claim 1 , comprising video processing means capable of changing at least one of video parameters which are contrast, sharpness, saturation, hue, and image brightness, wherein
the two or more display methods are methods in which at least one of the contrast, the sharpness, the saturation, the hue, and the image brightness is made different by the video processing means.
4. The video display device according to claim 2 , comprising video processing means capable of changing at least one of video parameters which are contrast, sharpness, saturation, hue, and image brightness, wherein
the two or more display methods are methods in which at least one of the contrast, the sharpness, the saturation, the hue, and the image brightness is made different by the video processing means.
5. The video display device according to claim 2 , wherein
based on the output from the first detection unit and the output from the second detection unit, the motion determination unit determines, as the motional state of the user, whether there is a movement of the head of the user, whether there is a movement of a line of sight of the user, whether the viewpoint of the user is on a video being displayed by the video display unit, and whether the movement of the head and the movement of the line of sight match each other.
6. The video display device according to claim 5 , wherein
the larger the movement of the line of sight of the user relative to the movement of the head of the user, the more the control unit extends at least one of the interval for updating information on the pixels of the liquid crystal element or the mirror array element and the interval for driving the light source for the video display unit, and
the control unit lowers at least one of the contrast and the brightness of the video when the viewpoint of the user is not on the video being displayed by the video display unit.
7. The video display device according to claim 1 , further comprising:
a storage unit that stores a change of any of the video parameters commanded by the control unit; and
a frequency determination unit that determines content of the change stored in the storage unit and the number of times the change has been made within a prescribed period of time, wherein
the frequency determination unit notifies a video information source of the change of the video parameter that has been changed many times.
8. The video display device according to claim 5 , wherein
the video display unit is a field sequential display, and can perform black and white display by substantially synchronizing intervals of light emissions of all of the light sources for R, G, and B, and
when the motion determination unit determines that there is at least one of the movement of the head of the user and the movement of the line of sight of the user, the control unit instructs the video display unit to perform the black and white display.
9. The video display device according to claim 5 , wherein
the motion determination unit transforms an output vector from the first detection unit and an output vector from the second detection unit into two-dimensional vectors by orthographically projecting the output vectors onto a video display plane, and determines that the movement of the head and the movement of the line of sight match in direction when an angle between the two transformed two-dimensional vectors falls below a prescribed value.
10. The video display device according to claim 1 , wherein
the first detection unit includes one of a gyro sensor, an acceleration sensor, a geomagnetic sensor, a GPS, and a camera.
11. The video display device according to claim 1 , wherein
the control unit acquires information on an amount of power remaining in a power source and limits the change of the display method of the video display unit.
12. A method for displaying a video on a video display unit wearable on a head of a user and capable of switching a display method between two or more display methods, the method comprising:
causing a first detection unit to detect a movement of the head of the user;
causing a second detection unit to detect a movement of a viewpoint of the user;
causing a motion determination unit to determine a motional state of the user of the device based on an output from the first detection unit and an output from the second detection unit; and
instructing the video display unit to change the display method according to a result of the determination made by the motion determination unit.
13. A method for detecting a movement of a line of sight or a viewpoint, comprising:
detecting a movement of a line of sight using a first sensor element; and
in response to detection of the movement of the line of sight by the first sensor element, switching a status of a second sensor element between an operation state capable of detecting the line of sight and an idle state, the second sensor element being capable of detecting a movement of the line of sight more precisely than the first sensor element.
14. A method for viewpoint deviation correction performed by a video display device wearable on a head of a user, comprising:
displaying a prescribed point on the video display device;
prompting the user to gaze at the prescribed point;
determining a position of a point of gaze of the user using a viewpoint detection sensor;
setting the thus-determined position of the point of gaze as a reference point; and
based on a distance between the reference point and the viewpoint of the user detected by the viewpoint detection sensor, determining whether the viewpoint of the user is on a video being displayed.
15. A line-of-sight detector comprising:
at least two light emitters that irradiate different positions on an eye with light;
two light receivers that are located at different positions and receive light reflected by the eye irradiated with the light;
a movement detector that detects a movement of an eyeball based on outputs from the two light receivers; and
a camera that captures an image of at least one eye and operates while the movement detector is detecting the movement of the eyeball.