WO2010095347A1 - Head mounted display - Google Patents

Head mounted display

Info

Publication number
WO2010095347A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
output value
output
line
value
Prior art date
Application number
PCT/JP2010/000181
Other languages
French (fr)
Japanese (ja)
Inventor
矢田裕紀
Original Assignee
ブラザー工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ブラザー工業株式会社 (Brother Industries, Ltd.)
Publication of WO2010095347A1 publication Critical patent/WO2010095347A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/14Beam splitting or combining systems operating by reflection only
    • G02B27/145Beam splitting or combining systems operating by reflection only having sequential partially reflecting surfaces
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10Scanning systems
    • G02B26/101Scanning systems with both horizontal and vertical deflecting means, e.g. raster or XY scanners
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/1006Beam splitting or combining systems for splitting or combining different wavelengths
    • G02B27/102Beam splitting or combining systems for splitting or combining different wavelengths for generating a colour image from monochromatic image signal sources
    • G02B27/104Beam splitting or combining systems for splitting or combining different wavelengths for generating a colour image from monochromatic image signal sources for use with scanning systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0112Head-up displays characterised by optical features comprising device for genereting colour display
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems

Definitions

  • the present disclosure relates to a head-mounted display that presents a content image indicated by content data to a user's eyes so that the user can recognize the content image.
  • A technology has been proposed for a head-mounted display that presents an image indicated by content data to a user's eye so that the user can view and recognize the image. For example, in a video display device having a head-mounted display body, which is a head-worn video display device used in the vicinity of the user's eyeball, and a control unit, when an image determination unit determines that the image displayed on the display unit is a line image and a display color determination unit determines that the display color of the background of the line image is white, a display control unit controls the operation of a black-and-white inversion unit so that the display color of the background becomes black, the display color of the characters and lines becomes white, and their brightness is thus inverted (see, for example, Patent Document 1).
  • Amid the demand for power saving in various electrical products, power consumption also needs to be reduced in head-mounted displays. For example, a head-mounted display may be used while the user is moving, in which case it is driven by a battery. Head-mounted displays driven by various power sources, including batteries, are expected to provide long display times, which requires power saving. In reducing power consumption, however, the visibility of the content image indicated by the content data must not be impaired; presenting a content image with adequate visibility and reducing power consumption must be achieved at the same time.
  • This disclosure is intended to provide a new head-mounted display capable of reducing power consumption while ensuring the visibility of content images presented to the user's eyes.
  • The head-mounted display of the present disclosure determines whether a partial image included in a content image is a line image or a non-line image, and emits the image light for a non-line image, generated in units of the pixels forming the content image, at an output value corresponding to an emission output value lower than the emission output value used when emitting image light for a line image.
  • The head-mounted display of the present disclosure targets light-emitting image elements (for example, organic EL (Organic Electroluminescence) elements or LED elements in which each pixel is an individual LED) or a scanning image-forming method; it does not cover configurations that use a non-light-emitting image element such as a liquid crystal panel together with a light source that illuminates it.
  • According to one aspect of the present disclosure, there is provided a head-mounted display that allows a user to visually recognize a content image indicated by content data, comprising: an image presentation unit that emits light in units of the pixels forming the content image and outputs image light representing a partial image included in the content image; a determination unit that determines whether the partial image is a line image, represented by a line drawing, or a non-line image, represented by something other than a line drawing; and an image presentation control unit that, when the determination unit determines that the partial image is a line image, controls the image presentation unit to emit the image light representing the line image at an output value corresponding to a first emission output value, and, when the determination unit determines that the partial image is a non-line image, controls the image presentation unit to emit the image light representing the non-line image at an output value corresponding to a second emission output value lower than the first emission output value.
  • In such a head-mounted display, the emission output value of the image light representing non-line images can be reduced. For example, compared with emitting the entire content image at an output value corresponding to the first emission output value, the total emission output can be reduced. At the same time, line images, such as character information, which carry greater importance within the content image, are emitted at a relatively high output and remain bright, so their visibility is also ensured.
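  • As a minimal illustration, the control rule above might be sketched as follows; the data structure, helper names, and numeric output levels are assumptions for illustration only and are not specified by the disclosure.

        # Sketch: image light for a line image is emitted at a first emission output
        # value, image light for a non-line image at a lower second value.
        from dataclasses import dataclass
        from typing import List, Tuple

        FIRST_EMISSION_OUTPUT = 1.0    # relative output used for line images (assumed)
        SECOND_EMISSION_OUTPUT = 0.4   # lower relative output for non-line images (assumed)

        @dataclass
        class PartialImage:
            is_line: bool                          # result of the determination unit
            pixels: List[Tuple[int, int, int]]     # RGB pixels forming the partial image

        def emission_output_for(partial: PartialImage) -> float:
            """Select the emission output value for one partial image."""
            return FIRST_EMISSION_OUTPUT if partial.is_line else SECOND_EMISSION_OUTPUT

        def emitted_levels(content: List[PartialImage]) -> List[float]:
            """Per-pixel drive levels the image presentation unit would be given."""
            levels = []
            for partial in content:
                k = emission_output_for(partial)
                levels.extend(k * max(px) / 255.0 for px in partial.pixels)
            return levels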
  • According to another aspect of the present disclosure, while the image presentation control unit is emitting image light representing a non-line image at the output value corresponding to the second emission output value and then emits image light representing a line image, it sets the emission output value to the output value corresponding to the first emission output value; conversely, while it is emitting image light representing a line image at the output value corresponding to the first emission output value and then emits image light representing a non-line image, it sets the emission output value to the output value corresponding to the second emission output value. In such a head-mounted display, the emission output value can be changed appropriately.
  • According to still another aspect of the present disclosure, the head-mounted display can use a battery as its driving power source and includes a detection unit that detects the remaining battery level. When the remaining battery level detected by the detection unit is below a predetermined value, the image presentation control unit controls emission of the image light using the output values corresponding to the first emission output value and the second emission output value respectively; when the detected remaining battery level is above the predetermined value, it controls emission of the image light representing the line image and of the image light representing the non-line image using an output value corresponding to a predetermined constant emission output value. In such a head-mounted display, power consumption can be reduced when the remaining battery level is low, enabling long operating times.
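  • The battery-dependent switching described above might be sketched as follows; the threshold and output levels are assumptions for illustration, not values from the disclosure.

        # Sketch: below an assumed battery threshold, the two-tier (first/second)
        # emission output values are used; above it, one predetermined constant
        # output value is used for both line and non-line images.
        LOW_BATTERY_THRESHOLD = 0.2   # assumed fraction of full charge
        PREDETERMINED_OUTPUT = 0.8    # single constant output used when charge is ample
        FIRST_EMISSION_OUTPUT = 1.0   # for line images in power-saving mode
        SECOND_EMISSION_OUTPUT = 0.4  # for non-line images in power-saving mode

        def select_outputs(battery_level: float) -> tuple:
            """Return (line_image_output, non_line_image_output) for the battery level."""
            if battery_level < LOW_BATTERY_THRESHOLD:
                # remaining charge is low: use the two-tier power-saving outputs
                return FIRST_EMISSION_OUTPUT, SECOND_EMISSION_OUTPUT
            # remaining charge is ample: use one predetermined constant output for both
            return PREDETERMINED_OUTPUT, PREDETERMINED_OUTPUT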
  • In the following description, a head-mounted display (hereinafter, "HMD") in which the HMD body and a control device that executes various processes are integrated is described as an example; these devices can also be configured as separate units.
  • the HMD 100 includes a frame 110 that is worn on the user's head and an image presentation box 130 that presents a content image to be visually recognized by the user.
  • a frame 110 shown in FIG. 1 is formed in a shape similar to a frame of eyeglasses.
  • the frame 110 is not limited to this shape, and may have a helmet shape or the like, and may have another structure that can be mounted on the user's head.
  • An image presentation box 130 including an image presentation device 20 described later with reference to FIG. 2 is disposed in front of the user's left eye.
  • the image presentation box 130 is attached to a predetermined position of the frame 110.
  • the HMD 100 shown in FIG. 1 has a configuration in which one image presentation box 130 is disposed, but may be configured in a manner in which another image presentation box 130 is disposed in front of the user's right eye.
  • the image presentation device 20 in the image presentation box 130 emits image light.
  • the emitted image light is reflected in the direction of the user's left eye by a half mirror which is not drawn in FIG. 1, and is directly applied to the user's eyeball. Thereby, the user visually recognizes the content image.
  • the HMD 100 is a see-through HMD that allows a user to visually recognize the outside world through a half mirror.
  • The HMD 100 is connected to the power controller 400 by a power supply cable 300, and is driven by power supplied, via the power controller 400, from a battery (rechargeable battery) 500 that is formed integrally with the power controller 400.
  • the integrally configured power supply controller 400 and battery 500 are used by being attached to the user's waist or the like, for example.
  • the HMD 100 includes a control unit 12 that controls the device itself, and a storage unit 14 that stores content data 142.
  • the HMD 100 includes an operation unit 16 that is operated by a user and receives an instruction from the user, and an input / output I / F (Interface) 18.
  • the HMD 100 includes an image presentation device 20 that emits image light. Note that each of these components included in the HMD 100 is provided inside the image presentation box 130 (inside the casing constituting the image presentation box 130).
  • control unit 12 includes, for example, a CPU that executes various arithmetic processes, a ROM that stores various programs, and a RAM as a work area. Further, the control unit 12 includes, for example, a GPU (Graphics Processing Unit) that executes rendering processing and the like based on a command from the CPU.
  • The storage unit 14 is composed of, for example, non-volatile memory.
  • The content data 142 stored in the storage unit 14 includes, for example, a landscape image of a work place and its explanatory text (characters), as shown in FIG. 6 described later.
  • the operation unit 16 is configured by, for example, a key, and receives, for example, instructions to start and end playback of the content data 142.
  • the input / output I / F 18 receives supply of power, and transmits and receives various signals to and from the power controller 400 that supplies and shuts off the power (power) to the HMD 100.
  • the image presentation device 20 can be configured using a retinal scanning display.
  • the image presentation device 20 using the retinal scanning display scans the image light based on the content image data obtained by the rendering process performed on the content data 142 in a two-dimensional direction, and the scanned image light is scanned. It leads to the user's left eye and forms a content image on the user's retina.
  • the image presentation device 20 may be configured to use an organic EL (Electro-Luminescence) display, an LED display, or other light emitting devices in addition to a retinal scanning display. These image presentation devices 20 emit image light emitted in units of pixels forming a content image.
  • The content image includes, in addition to the above-described landscape image (in other words, a non-line image), line images indicated by lines, such as characters, tables, or diagrams explaining the landscape image.
  • the image presentation apparatus 20 using a retinal scanning display will be described as an example.
  • the content image data generated by rendering the content data 142 is also referred to as “scanned image data”, and the content image represented by the content image data is also referred to as “scanned image”.
  • The control unit 12 executes on the RAM a program, stored in the ROM that constitutes the control unit 12, for reproducing (rendering) the content data 142, and generates scanned image data representing a scanned image formed by a plurality of pixels.
  • the control unit 12 determines whether the partial image included in the scanned image is a line image or a non-line image by executing a program stored in the ROM on the RAM.
  • the image presentation apparatus 20 is controlled so that emission of the scanning image light based on the determination result is executed. Therefore, the control unit 12 uses various data such as the content data 142 and executes various programs stored in the ROM on the RAM, thereby configuring various functional units (for example, determination unit, image presentation control unit).
  • the image presentation device 20 includes a scanning image light generation unit 21, a collimating optical system 22, a horizontal scanning unit 23, a vertical scanning unit 24, a relay optical system 25, and a relay optical system 26.
  • the scanned image light generation unit 21 is a device that reads the image signal output from the control unit 12 for each dot clock and modulates the intensity according to the read image signal to generate the scanned image light.
  • the scanned image light generation unit 21 includes a signal processing circuit 211, a light source unit 212, and a light combining unit 213.
  • The signal processing circuit 211 is connected to the control unit 12. Based on the "image signal" input from the control unit 12, the signal processing circuit 211 generates the B (blue), G (green), and R (red) image signals 214a to 214c, which are the elements for generating the scanning image light, and outputs them to the light source unit 212.
  • the signal processing circuit 211 is connected to a horizontal scanning control circuit 23b of the horizontal scanning unit 23 described later.
  • the signal processing circuit 211 generates a horizontal drive signal 215 based on the “image signal” input from the control unit 12, and outputs the horizontal drive signal 215 to the horizontal scanning control circuit 23b. Further, the signal processing circuit 211 is connected to a vertical scanning control circuit 24b described later.
  • the signal processing circuit 211 generates a vertical drive signal 216 based on the “image signal” input from the control unit 12, and outputs the vertical drive signal 216 to the vertical scanning control circuit 24b.
  • the light source unit 212 includes a B laser driver 212a, a G laser driver 212b, an R laser driver 212c, a B laser 212d, a G laser 212e, and an R laser 212f.
  • the B laser driver 212a drives the B laser 212d based on the B (blue) image signal 214a output from the signal processing circuit 211 for each dot clock.
  • the B laser 212d emits intensity-modulated blue laser light based on the B (blue) image signal 214a.
  • the G laser 212e and the R laser 212f emit green laser light and red laser light, which are respectively intensity-modulated.
  • The lasers 212d to 212f may be, for example, semiconductor lasers or solid-state lasers with a harmonic generation function.
  • When a semiconductor laser is used, the drive current is directly modulated to modulate the intensity of the laser light. A laser light-emitting element capable of direct modulation is suited to power saving because the drive current can be reduced to about the laser oscillation threshold while no light is emitted.
  • When a solid-state laser with a harmonic generation function is used, each of the lasers 212d to 212f is provided with an external modulator to modulate the intensity of the laser light.
  • However, because the efficiency of harmonic generation is not high, the laser output is normally controlled to be constant, and the loss in the external modulator adds further to the consumption, the power consumption of a solid-state laser with a harmonic generation function is large. Even if the laser light output is reduced by the external modulator, the solid-state laser itself continues to oscillate at a constant output and is therefore not suited to power saving, so at least one of the lasers 212d to 212f must be a directly modulatable laser, for example a semiconductor laser.
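  • The power argument above can be illustrated numerically; the drive-current figures and image statistics below are invented solely for illustration.

        # Sketch: a directly modulated semiconductor laser lets the drive current fall
        # to roughly the oscillation threshold for dark pixels, while a solid-state
        # laser behind an external modulator keeps oscillating at constant output.
        threshold_current = 0.1                      # assumed relative current near threshold
        pixel_levels = [0.0] * 70 + [1.0] * 30       # a frame that is 70 % dark (assumed)

        direct = sum(max(threshold_current, level) for level in pixel_levels)
        external = 1.0 * len(pixel_levels)           # constant full output regardless of content

        print(direct)    # 37.0 -> consumption follows the image content
        print(external)  # 100.0 -> consumption stays high even for a dark image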
  • the light combining unit 213 includes collimating optical systems 213a to 213c, dichroic mirrors 213d to 213f, and a coupling optical system 213g.
  • the collimating optical systems 213a to 213c are disposed in front of the lasers 212d to 212f, respectively.
  • the collimating optical systems 213a to 213c collimate the laser beams emitted from the lasers 212d to 212f.
  • the dichroic mirrors 213d to 213f are disposed in front of the collimating optical systems 213a to 213c, respectively.
  • the dichroic mirrors 213d to 213f selectively reflect or transmit only the laser light having a wavelength in a predetermined range from the respective laser beams collimated by the collimating optical systems 213a to 213c.
  • the coupling optical system 213g is disposed in front of the dichroic mirror 213d. Blue laser light transmitted through the dichroic mirror 213d and green laser light and red laser light reflected from the dichroic mirrors 213e and 213f are incident on the coupling optical system 213g.
  • The coupling optical system 213g combines the laser beams of the three primary colors and causes them to enter the optical fiber 27. Note that white can be expressed by equalizing the intensities of the blue, green, and red laser beams.
  • the horizontal scanning unit 23 and the vertical scanning unit 24 generate scanning image light by scanning the laser light in the horizontal direction and the vertical direction in order to irradiate the laser light incident on the optical fiber 27 as an image.
  • the horizontal scanning unit 23 includes a resonant deflection element 23a, a horizontal scanning control circuit 23b, and a horizontal scanning angle detection circuit 23c.
  • the laser light incident on the optical fiber 27 is collimated by the collimating optical system 22 and is incident on the resonant deflection element 23a.
  • the resonant deflection element 23a has a reflection surface 23d that is swung by the horizontal scanning control circuit 23b.
  • the resonant deflection element 23a reflects the incident laser beam on the oscillating reflecting surface 23d and scans in the horizontal direction.
  • the horizontal scanning control circuit 23b generates a driving signal for swinging the reflecting surface 23d of the resonance type deflection element 23a based on the horizontal driving signal 215 output from the signal processing circuit 211.
  • The horizontal scanning angle detection circuit 23c detects the swing state, such as the swing range and swing frequency, of the reflecting surface 23d of the resonant deflection element 23a based on the displacement signal output from the resonant deflection element 23a, and outputs a signal indicating the swing state to the control unit 12.
  • the vertical scanning unit 24 includes a deflection element 24a, a vertical scanning control circuit 24b, and a vertical scanning angle detection circuit 24c.
  • the deflection element 24a has a reflection surface 24d that is swung by the vertical scanning control circuit 24b.
  • the deflecting element 24a reflects the incident laser beam on the oscillating reflecting surface 24d, scans it in the vertical direction, and outputs it to the relay optical system 26 as two-dimensionally scanned image light.
  • Based on the vertical drive signal 216 output from the signal processing circuit 211, the vertical scanning control circuit 24b generates a drive signal for swinging the reflecting surface 24d of the deflection element 24a.
  • Based on the displacement signal output from the deflection element 24a, the vertical scanning angle detection circuit 24c detects the swing state, such as the swing range and swing frequency, of the reflecting surface 24d of the deflection element 24a, and outputs a signal indicating the swing state to the control unit 12.
  • the relay optical system 25 is disposed between the resonant deflection element 23a and the deflection element 24a.
  • the relay optical system 25 converges the laser beam scanned in the horizontal direction on the reflection surface 23d of the resonance type deflection element 23a and makes it incident on the reflection surface 24d of the deflection element 24a.
  • The signal processing circuit 211 outputs the horizontal drive signal 215 and the vertical drive signal 216 to the horizontal scanning control circuit 23b and the vertical scanning control circuit 24b, respectively, based on the "image signal" input from the control unit 12, thereby controlling the scanning angles of the reflecting surfaces 23d and 24d.
  • The scanning angles of the reflecting surfaces 23d and 24d changed in this way are detected by the horizontal scanning angle detection circuit 23c and the vertical scanning angle detection circuit 24c, the detection signals are input to the control unit 12, and they are fed back to the horizontal drive signal 215 and the vertical drive signal 216.
  • the relay optical system 26 includes lens systems 26a and 26b having a positive refractive power.
  • the image light emitted from the deflection element 24a is converted into convergent image light by the lens system 26a so that the respective image lights have their center lines substantially parallel to each other.
  • the converged image light is converted into substantially parallel scanned image light by the lens system 26b, and is condensed so that the center line of the scanned image light converges on the pupil Ea of the user's eye E.
  • the laser light incident from the optical fiber 27 is scanned in the horizontal direction by the horizontal scanning unit 23 and then scanned in the vertical direction by the vertical scanning unit 24.
  • the arrangement of the vertical scanning unit 24 may be changed, and after the vertical scanning unit 24 scans in the vertical direction, the horizontal scanning unit 23 scans in the horizontal direction.
  • the main process shown in FIG. 4 is performed by the control unit 12 executing a program stored in the ROM.
  • The control unit 12 that has started this process first detects the remaining battery level of the battery 500 via the power controller 400 and acquires the remaining battery level value (S100). Subsequently, the control unit 12 determines whether or not the acquired remaining battery level value is equal to or less than a predetermined constant value (S102). When the remaining battery level is not equal to or less than the predetermined value (S102: No), the control unit 12 proceeds to S104; when the remaining battery level is equal to or less than the predetermined value (S102: Yes), the process proceeds to S108.
  • In S104, the control unit 12 sets the laser output value (emission output value) of the scanning image light (specifically, laser light) emitted from the image presentation device 20 to a predetermined default value P (a predetermined constant value). This emission output value is set lower than the maximum laser output value.
  • the control unit 12 renders the content data 142 and generates scanned image data (S106).
  • the scanned image data generated in S106 is data representing a scanned image having a brightness corresponding to the default value P of the laser output set in S104. As described above, the scanned image is formed by a plurality of pixels.
  • Specifically, the control unit 12 generates scanned image data in which each pixel is set to a brightness (gradation) obtained by multiplying the brightness value of each color by the value (coefficient) corresponding to the default value P of the laser output. For example, when pixels are represented by R (red), G (green), and B (blue), each pixel is set to an RGB value corresponding to the default value P of the laser output.
  • The brightness of a pixel (its RGB value) and the laser output value are generally correlated; for example, when each of the R (red), G (green), and B (blue) values is low (with 256 gradations, the maximum value of each color is 255), the laser output value is also low.
  • A table is stored in the storage unit 14 in which the coefficient is 1 when the laser output is at its maximum value, 0 when the laser output is at its minimum value (no output), and a predetermined value is registered for each laser output value in between. The value (coefficient) corresponding to the default value P is selected from this table based on the default value P.
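  • A minimal sketch of this coefficient lookup and pixel scaling follows; the table entries, the nearest-entry lookup, and the example values are assumptions, since the disclosure only states that a coefficient between 0 and 1 is registered for each laser output value.

        # Sketch: coefficient 1 at maximum laser output, 0 at minimum (no output),
        # with registered values in between; pixel gradations are scaled by it.
        COEFFICIENT_TABLE = {0: 0.0, 25: 0.25, 50: 0.5, 75: 0.75, 100: 1.0}  # output % -> coefficient

        def coefficient_for(laser_output: int) -> float:
            """Look up the coefficient for a laser output value (nearest registered entry)."""
            nearest = min(COEFFICIENT_TABLE, key=lambda k: abs(k - laser_output))
            return COEFFICIENT_TABLE[nearest]

        def scale_pixel(rgb, laser_output: int):
            """Scale a pixel's RGB gradations (0-255 per color) by the coefficient."""
            k = coefficient_for(laser_output)
            return tuple(round(c * k) for c in rgb)

        # e.g. scale_pixel((200, 180, 160), 50) -> (100, 90, 80): the pixel brightness is
        # set for the default value P, which in turn lowers the laser output when emitted.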
  • description will be made by taking RGB values as an example.
  • In S108, the control unit 12 sets the laser output value to a predetermined value and stores it in a predetermined area of the RAM. Subsequently, the power-saving image data generation process is executed (S110).
  • the laser output value set in S108 is used in the power saving image data generation process (see FIG. 5) in S110.
  • the laser output value set in S108 is a value lower than the default value P. Details of the power saving image data generation processing executed in S110 will be described later.
  • The control unit 12 then controls the image presentation device 20 so that it emits scanning image light representing the scanned image represented by the scanned image data generated in S106, or representing the scanned image represented by the scanned image data generated in S110.
  • the laser output value of the emitted scanning image light is directly modulated with the scanning image data. Therefore, for a dark image, for example, a pixel with a low RGB value, the laser output value of the scanned image light is reduced.
  • Such a configuration, in which the laser output value is varied through the scanned image data when the scanning image light is emitted, is preferable. It is also possible to adopt a configuration in which the line drawing portion or the non-line drawing portion is identified (determined) (see S200 of the power-saving image data generation process described below with reference to FIG. 5) and the laser output value itself is changed by control accordingly.
  • The control unit 12 that has started this process first divides the content image indicated by the content data 142 into a line drawing portion containing a line image (for example, characters or a line drawing (lines)) and a non-line drawing portion containing a non-line image (for example, a landscape picture) (S200).
  • The control unit 12 may determine a character or line drawing (line drawing portion) based on the file type of the content data 142; for example, when the content data 142 is a text file, the control unit 12 determines that it is a line drawing portion. Further, if the RGB value combinations of the pixels forming the content image are widely distributed over many different combinations, the region can be determined to be a non-line drawing portion because halftones are present. On the other hand, when the RGB value combinations divide into two groups with greatly different values and one of the groups does not continue over long runs, the region can be determined to be characters or a line drawing (line drawing portion).
  • Even where the colors differ, the region can be determined to be characters or a line drawing as long as neither the one group nor the other continues over a long run.
  • Whether a run is "long" may be determined by whether it is longer or shorter than a normal line width (several pixels).
  • the line drawing unit specified by the control unit 12 may be either a region matching the outline of the line image or a region including the line image (for example, a rectangular region). The same applies to the non-line drawing part.
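  • The determination heuristics described above (file type, spread of the RGB value combinations, and run length of a uniform color) might be sketched as follows; the thresholds and the simplified run-length test are assumptions for illustration only.

        # Sketch: a region with many distinct RGB combinations (halftones) is treated as a
        # non-line drawing portion; a region whose values split into two groups with only
        # short runs of the rarer one is treated as characters or a line drawing.
        from collections import Counter

        TYPICAL_LINE_WIDTH_PX = 3     # "normal line width (several pixels)"
        MANY_DISTINCT_COLORS = 64     # assumed threshold for "widely distributed" values

        def longest_run(row, color):
            """Length of the longest horizontal run of `color` in one pixel row."""
            best = run = 0
            for px in row:
                run = run + 1 if px == color else 0
                best = max(best, run)
            return best

        def is_line_drawing_region(rows):
            """True if a region (list of RGB pixel rows) looks like characters or a line drawing."""
            histogram = Counter(px for row in rows for px in row)
            if len(histogram) > MANY_DISTINCT_COLORS:
                return False                       # halftones present -> non-line drawing portion
            common = histogram.most_common(2)
            if len(common) < 2:
                return True                        # a single flat color: trivially line-like
            line_color = common[1][0]              # the rarer of the two dominant colors
            longest = max(longest_run(row, line_color) for row in rows)
            return longest <= TYPICAL_LINE_WIDTH_PX  # only thin runs -> line drawing portion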
  • the control unit 12 determines whether or not there is a line drawing part, specifically, whether or not the line drawing part is divided in S200 (S202). As a result of the determination in S202, when there is no line drawing part (S202: No), the control unit 12 moves the process to S212. On the other hand, if there is a line drawing part (S202: Yes), it is determined whether or not the line drawing part is binary data (S204). When the line drawing unit is binary data (S204: Yes), the control unit 12 moves the process to S208.
  • When the line drawing portion is not binary data (S204: No), binarization processing is executed on the line drawing portion (S206), and the process then proceeds to S208.
  • the binarization processing is processing for converting color data into gray data and converting the data into binary data using a predetermined threshold.
  • In S208, the control unit 12 performs the black-and-white inversion process on the line drawing portion for which S204 was affirmative or on the line drawing portion to which the binarization process of S206 was applied.
  • the black-and-white reversal process is, for example, a process of converting the color of a portion indicated by a line such as a character to white while converting the color of other portions to black.
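  • The binarization (S206) and black-and-white inversion (S208) steps can be illustrated with a short sketch; the luminance weights and the threshold of 128 are common conventions assumed here, not values specified by the disclosure.

        # Sketch of S206 and S208 for a line drawing portion: color pixels are converted
        # to gray, thresholded to binary, and then inverted so the characters/lines end up
        # white on a black background (black pixels need no scanning image light).
        def binarize(rgb_pixels, threshold: int = 128):
            """S206: color -> gray -> binary (1 = white, 0 = black)."""
            grays = [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in rgb_pixels]
            return [1 if g >= threshold else 0 for g in grays]

        def invert_black_white(binary_pixels):
            """S208: swap black and white so lines/characters are emitted and the background is not."""
            return [1 - p for p in binary_pixels]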
  • the control unit 12 renders the line drawing unit in which S208 is executed, generates line drawing unit processed image data (S210), and shifts the processing to S212.
  • the control unit 12 generates line drawing part processed image data having an RGB value corresponding to a laser output value (first emission output value) equal to or greater than the above-described laser output default value P (see S104 in FIG. 4).
  • the generation of the line drawing section processed image data is performed by the same method as S106 in FIG. 4 described above based on the laser output value equal to or greater than the default value P.
  • a value (coefficient) corresponding to the laser output value equal to or greater than the default value P is selected from the above-described table based on the laser output value equal to or greater than the default value P.
  • In S212, the control unit 12 determines whether or not a non-line drawing portion was separated in S200. If there is no non-line drawing portion (S212: No), the control unit 12 moves the process to S218. On the other hand, when there is a non-line drawing portion (S212: Yes), the control unit 12 reads the laser output value stored in the RAM in S108 of the main process shown in FIG. 4 (S214). The control unit 12 then renders the non-line drawing portion and generates non-line drawing portion processed image data corresponding to the acquired laser output value (S216).
  • After executing S216, the control unit 12 combines the line drawing portion processed image data generated in S210 with the non-line drawing portion processed image data generated in S216 to generate a single set of scanned image data (S218).
  • When the determination in S202 is negative (S202: No), the non-line drawing portion processed image data generated in S216 is used as the scanned image data as it is; likewise, when the determination in S212 is negative (S212: No), the line drawing portion processed image data generated in S210 is used as the scanned image data as it is.
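  • A sketch of S218, combining the two processed portions into one set of scanned image data, is shown below; representing the portions as dictionaries keyed by pixel coordinates is an assumption made for illustration.

        # Sketch of S218: merge the line drawing portion processed image data and the
        # non-line drawing portion processed image data into one scanned image. Pixels
        # missing from both portions default to black (no emission).
        def combine_scanned_image(line_part: dict, non_line_part: dict, width: int, height: int):
            """Return scanned image data covering the whole frame from both portions."""
            scanned = {}
            for y in range(height):
                for x in range(width):
                    if (x, y) in line_part:
                        scanned[(x, y)] = line_part[(x, y)]       # rendered at >= default value P
                    elif (x, y) in non_line_part:
                        scanned[(x, y)] = non_line_part[(x, y)]   # rendered at the lower value
                    else:
                        scanned[(x, y)] = (0, 0, 0)               # no data: no emission
            return scanned

        # If S202 or S212 was negative, the corresponding dictionary is simply empty and
        # the other portion's data becomes the scanned image data as it is.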
  • In this embodiment, the laser output value stored in the RAM in S108 of FIG. 4 and acquired in S214 of FIG. 5 is lower than the default value P, but a different configuration can be adopted. The laser output value stored in S108 of FIG. 4 and acquired in S214 of FIG. 5 may be higher than the default value P as long as it is set lower than the laser output value used in S210 of FIG. 5; power saving is still realized in that case.
  • A comparison will now be described between FIG. 6A, which shows the image before conversion by the power-saving image data generation process shown in FIG. 5, and FIG. 6B, which shows the scanned image represented by the scanned image data generated by that process.
  • In the region containing the characters "work place scenery" as a line image, in other words, the region excluding the landscape image (non-line image), black and white have been inverted by the process of FIG. 5. Specifically, the black characters "work place scenery" have been converted to white, while the background has been converted from white to black.
  • the scanning image light is not emitted when the pixel is black, and therefore the laser output value is zero.
  • the landscape image shown in FIG. 6B is converted into a dark image as a whole as compared with the landscape image shown in FIG.
  • This is because, in S216 of the power-saving image data generation process shown in FIG. 5, processing was executed based on the laser output value that was stored in the RAM in S108 of the main process shown in FIG. 4 and acquired in S214 of FIG. 5, which is lower than the default value P (see FIG. 4).
  • The upper part of FIG. 7 shows the change in the laser output value when scanning along the line drawn in the scanned image shown in the lower part of FIG. 7 (a horizontal line crossing near the vertical center when viewed from the front).
  • The comparison method has substantially the same laser output value over the entire scanning range, whereas in the present method the laser output value is lower than in the comparison method over the range of the landscape image (non-line image) and rises over the range of the characters "work place" (line image). That is, a user of the HMD 100 presented with the image by the present method sees a darker landscape image and brighter characters than with the comparison method.
  • When scanning image light for a line image such as characters is emitted, the laser output value increases, and when scanning image light for a non-line image such as the landscape image (that is, scanning image light for the pixels forming the non-line image) is emitted, the laser output value decreases.
  • The area obtained by integrating the solid line, which represents the present method, is smaller than the area obtained by integrating the dotted line in the upper part of FIG. 7; in other words, the total emission output is lower. Furthermore, since the peak output in the character portion is higher for the solid line, the characters are brighter than in the comparison method, and the visibility of the characters, which are the more important information, is actually improved.
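  • As a back-of-the-envelope illustration of this comparison, summing an assumed output profile over one scan line shows the trade-off; the pixel counts and output levels below are invented for illustration only.

        # Sketch: 100-pixel scan line, 20 pixels of characters (line image) and 80 pixels
        # of landscape (non-line image); output levels are made-up examples.
        comparison_method = [0.8] * 100             # constant output over the whole line
        present_method = [1.0] * 20 + [0.4] * 80    # brighter characters, dimmer landscape

        print(sum(comparison_method))   # 80.0 (area under the dotted line)
        print(sum(present_method))      # 52.0 (area under the solid line: lower total output)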
  • As described above, the HMD 100 of this embodiment adopts a configuration in which the laser output value stored in the RAM in S108 of the main process shown in FIG. 4 and acquired in S214 of the power-saving image data generation process shown in FIG. 5 is set lower than the laser output value used in S210 of FIG. 5. Therefore, compared with the case where all the pixels forming the scanned image represented by the scanned image data have RGB values corresponding to the laser output value used in S210, power saving can be realized without impairing visibility.
  • Moreover, when the laser output value used in S210 of FIG. 5 is set to the default value P or less, the laser output value stored in the RAM in S108 of FIG. 4 and acquired in S214 is lower still, so power saving can also be realized compared with the case where the pixels forming the scanned image represented by the scanned image data have RGB values corresponding to the default value P.

Abstract

Disclosed is a head mounted display through which a user can view a content image represented by content data and which comprises an image presentation means, a judging means, and an image presentation control means.  The image presentation means emits light for each unit of pixels which form the content image to output image light which represents a partial image contained in the content image.  The judging means judges whether the partial image is a line-drawn image which is represented by a line drawing or is a non line-drawn image which is represented by a drawing other than the line drawing.  If the judging means judges that the partial image is a line-drawn image, the image presentation control means controls the image presentation means to emit image light which represents the line-drawn image at an output value corresponding to a first emission output value.  If the judging means judges that the partial image is a non line-drawn image, the image presentation control means controls the image presentation means to emit image light which represents the non-line drawn image at an output value corresponding to a second emission output value lower than the first emission output value.

Description

Head mounted display
 The present disclosure relates to a head-mounted display that presents a content image indicated by content data to a user's eye so that the user can visually recognize the content image.
 A technology has been proposed for a head-mounted display that presents an image indicated by content data to a user's eye so that the user can view and recognize the image. For example, in a video display device having a head-mounted display (Head Mount Display) body, which is a head-worn video display device used in the vicinity of the user's eyeball, and a control unit, when an image determination unit determines that the image displayed on the display unit is a line image and a display color determination unit determines that the display color of the background of the line image is white, a display control unit controls the operation of a black-and-white inversion unit so that the display color of the background becomes black, the display color of the characters and lines becomes white, and their brightness is thus inverted (see, for example, Patent Document 1).
Patent Document 1: JP 2008-116704 A
 Amid the demand for power saving in various electrical products, power consumption also needs to be reduced in head-mounted displays. For example, a head-mounted display may be used while the user is moving, in which case it is driven by a battery. Head-mounted displays driven by various power sources, including batteries, are expected to provide long display times, which requires power saving. In reducing power consumption, however, the visibility of the content image indicated by the content data must not be impaired; presenting a content image with adequate visibility and reducing power consumption must be achieved at the same time.
 The present disclosure aims to provide a new head-mounted display that can reduce power consumption while ensuring the visibility of the content image presented to the user's eye.
 The head-mounted display of the present disclosure determines whether a partial image included in a content image is a line image or a non-line image, and emits the image light for a non-line image, generated in units of the pixels forming the content image, at an output value corresponding to an emission output value lower than the emission output value used when emitting image light for a line image. Note that the head-mounted display of the present disclosure targets light-emitting image elements (for example, organic EL (Organic Electroluminescence) elements or LED elements in which each pixel is an individual LED) or a scanning image-forming method; it does not cover configurations that use a non-light-emitting image element such as a liquid crystal panel together with a light source that illuminates it.
 According to the present disclosure, a new head-mounted display can be obtained that reduces power consumption while ensuring the visibility of the content image presented to the user's eye.
 According to one aspect of the present disclosure, there is provided a head-mounted display that allows a user to visually recognize a content image indicated by content data, comprising: an image presentation unit that emits light in units of the pixels forming the content image and outputs image light representing a partial image included in the content image; a determination unit that determines whether the partial image is a line image, represented by a line drawing, or a non-line image, represented by something other than a line drawing; and an image presentation control unit that, when the determination unit determines that the partial image is a line image, controls the image presentation unit to emit the image light representing the line image at an output value corresponding to a first emission output value, and, when the determination unit determines that the partial image is a non-line image, controls the image presentation unit to emit the image light representing the non-line image at an output value corresponding to a second emission output value lower than the first emission output value.
 In such a head-mounted display, the emission output value of the image light representing non-line images can be reduced. For example, compared with emitting the entire content image at an output value corresponding to the first emission output value, the total emission output can be reduced. At the same time, line images, such as character information, which carry greater importance within the content image, are emitted at a relatively high output and remain bright, so their visibility is also ensured.
 According to another aspect of the present disclosure, while the image presentation control unit is emitting image light representing a non-line image at the output value corresponding to the second emission output value and then emits image light representing a line image, it sets the emission output value to the output value corresponding to the first emission output value; conversely, while it is emitting image light representing a line image at the output value corresponding to the first emission output value and then emits image light representing a non-line image, it sets the emission output value to the output value corresponding to the second emission output value. In such a head-mounted display, the emission output value can be changed appropriately.
 According to still another aspect of the present disclosure, the head-mounted display can use a battery as its driving power source and includes a detection unit that detects the remaining battery level. When the remaining battery level detected by the detection unit is below a predetermined value, the image presentation control unit controls emission of the image light using the output values corresponding to the first emission output value and the second emission output value respectively; when the detected remaining battery level is above the predetermined value, it controls emission of the image light representing the line image and of the image light representing the non-line image using an output value corresponding to a predetermined constant emission output value. In such a head-mounted display, power consumption can be reduced when the remaining battery level is low, enabling long operating times.
FIG. 1 is a diagram showing the head-mounted display as worn. FIG. 2 is a diagram showing the functional blocks of the head-mounted display. FIG. 3 is a diagram showing the image presentation device. FIG. 4 is a diagram showing the flow of the main process. FIG. 5 is a diagram showing the flow of the power-saving image data generation process. FIGS. 6A and 6B are diagrams explaining scanned images. FIG. 7 is a diagram explaining the laser output value.
 Embodiments reflecting the present disclosure will be described in detail below with reference to the drawings. The present disclosure is not limited to the configurations described below, and various configurations can be adopted within the same technical idea. For example, in the following description, a head-mounted display (hereinafter, "HMD") in which the HMD body and a control device that executes various processes are integrated will be described as an example, but these devices can also be configured as separate units.
 (Overview of the HMD)
 As shown in FIG. 1, the HMD 100 includes a frame 110 that is worn on the user's head and an image presentation box 130 that presents a content image for the user to view. The frame 110 shown in FIG. 1 is formed in a shape similar to an eyeglass frame. The frame 110 is not limited to this shape and may have a helmet shape or any other structure that can be worn on the user's head. An image presentation box 130 containing an image presentation device 20, described later with reference to FIG. 2, is disposed in front of the user's left eye. The image presentation box 130 is attached to a predetermined position on the frame 110. The HMD 100 shown in FIG. 1 has a single image presentation box 130, but another image presentation box 130 may also be disposed in front of the user's right eye.
 Although details will be described later, the image presentation device 20 in the image presentation box 130 emits image light. The emitted image light is reflected toward the user's left eye by a half mirror, not drawn in FIG. 1, and strikes the user's eyeball directly; the user thereby visually recognizes the content image. The HMD 100 is a see-through HMD that allows the user to view the outside world through the half mirror.
 The HMD 100 is connected to the power controller 400 by a power supply cable 300, and is driven by power supplied, via the power controller 400, from a battery (rechargeable battery) 500 that is formed integrally with the power controller 400. The integrated power controller 400 and battery 500 are used attached to, for example, the user's waist.
 (Functional blocks of the HMD)
 As shown in FIG. 2, the HMD 100 includes a control unit 12 that controls the device itself and a storage unit 14 that stores content data 142. The HMD 100 also includes an operation unit 16 that is operated by the user and receives instructions from the user, and an input/output I/F (Interface) 18. Furthermore, the HMD 100 includes an image presentation device 20 that emits image light. Each of these components of the HMD 100 is provided inside the image presentation box 130 (inside the casing constituting the image presentation box 130).
 Here, the control unit 12 includes, for example, a CPU that executes various arithmetic processes, a ROM that stores various programs, and a RAM used as a work area. The control unit 12 also includes, for example, a GPU (Graphics Processing Unit) that executes rendering processing and the like based on commands from the CPU. The storage unit 14 is composed of, for example, non-volatile memory. The content data 142 stored in the storage unit 14 includes, for example, a landscape image of a work place and its explanatory text (characters), as shown in FIG. 6 described later. The operation unit 16 is composed of, for example, keys, and receives, for example, instructions to start and end playback of the content data 142. The input/output I/F 18 receives the supply of power and exchanges various signals with the power controller 400, which supplies power to and cuts off power from the HMD 100.
 The image presentation device 20 can be configured using a retinal scanning display. An image presentation device 20 using a retinal scanning display scans image light based on the content image data obtained by rendering the content data 142 in two dimensions, guides the scanned image light to the user's left eye, and forms the content image on the user's retina. The image presentation device 20 may also be configured using an organic EL (Electro-Luminescence) display, an LED display, or another light-emitting device instead of a retinal scanning display. These image presentation devices 20 emit image light generated in units of the pixels forming the content image. The content image includes, in addition to the above-described landscape image (in other words, a non-line image), line images indicated by lines, such as characters, tables, or diagrams explaining the landscape image. In the following, an image presentation device 20 using a retinal scanning display is described as an example. The content image data generated by rendering the content data 142 is also referred to as "scanned image data", and the content image represented by that data is also referred to as the "scanned image".
 The control unit 12 executes on the RAM a program, stored in the ROM that constitutes the control unit 12, for reproducing (rendering) the content data 142, and generates scanned image data representing a scanned image formed by a plurality of pixels. The control unit 12 also determines, by executing a program stored in the ROM on the RAM, whether a partial image included in the scanned image is a line image or a non-line image, and controls the image presentation device 20 so that the scanning image light is emitted based on the determination result. By using various data such as the content data 142 and executing the various programs stored in the ROM on the RAM, the control unit 12 thus constitutes various functional units (for example, the determination unit and the image presentation control unit).
 (Configuration of the image presentation device)
 As shown in FIG. 3, the image presentation device 20 includes a scanned image light generation unit 21, a collimating optical system 22, a horizontal scanning unit 23, a vertical scanning unit 24, a relay optical system 25, and a relay optical system 26.
 The scanned image light generation unit 21 is a device that reads the image signal output by the control unit 12 at each dot clock and generates scanned image light by modulating the intensity according to the read image signal. The scanned image light generation unit 21 includes a signal processing circuit 211, a light source unit 212, and a light combining unit 213.
 The signal processing circuit 211 is connected to the control unit 12. Based on the "image signal" input from the control unit 12, the signal processing circuit 211 generates the B (blue), G (green), and R (red) image signals 214a to 214c that serve as the elements for generating the scanned image light, and outputs them to the light source unit 212. The signal processing circuit 211 is also connected to a horizontal scanning control circuit 23b of the horizontal scanning unit 23 described later. The signal processing circuit 211 generates a horizontal drive signal 215 based on the "image signal" input from the control unit 12, and outputs this horizontal drive signal 215 to the horizontal scanning control circuit 23b. Further, the signal processing circuit 211 is connected to a vertical scanning control circuit 24b described later. The signal processing circuit 211 generates a vertical drive signal 216 based on the "image signal" input from the control unit 12, and outputs this vertical drive signal 216 to the vertical scanning control circuit 24b.
 The light source unit 212 includes a B laser driver 212a, a G laser driver 212b, an R laser driver 212c, a B laser 212d, a G laser 212e, and an R laser 212f. The B laser driver 212a drives the B laser 212d based on the B (blue) image signal 214a output from the signal processing circuit 211 at each dot clock. The B laser 212d emits intensity-modulated blue laser light based on the B (blue) image signal 214a. Similarly, the G laser 212e and the R laser 212f emit intensity-modulated green laser light and red laser light, respectively.
 Each of the lasers 212d to 212f may be a semiconductor laser or a solid-state laser with a harmonic generation function. When a semiconductor laser is used, the drive current is directly modulated to modulate the intensity of the laser light. That is, a laser-emitting element capable of direct modulation is suited to power saving, because the current can be reduced to around the laser oscillation threshold while no light is emitted. When a solid-state laser with a harmonic generation function is used, each of the lasers 212d to 212f is provided with an external modulator to modulate the intensity of the laser light. However, the efficiency of harmonic generation is not high, the laser output is normally controlled to be constant, and losses in the external modulator are added, so the power consumption of a solid-state laser with a harmonic generation function becomes large. Even if its laser light output is reduced by the external modulator, the solid-state laser itself continues to oscillate at a constant output and is therefore not suited to power saving; accordingly, a directly modulatable laser, for example a semiconductor laser, needs to be used for at least one of the lasers 212d to 212f.
 The light combining unit 213 includes collimating optical systems 213a to 213c, dichroic mirrors 213d to 213f, and a coupling optical system 213g. The collimating optical systems 213a to 213c are disposed in front of the lasers 212d to 212f, respectively, and collimate the laser light emitted from the lasers 212d to 212f. The dichroic mirrors 213d to 213f are disposed in front of the collimating optical systems 213a to 213c, respectively, and selectively reflect or transmit only laser light in a predetermined wavelength range from the laser light collimated by the collimating optical systems 213a to 213c.
 The coupling optical system 213g is disposed in front of the dichroic mirror 213d. The blue laser light transmitted through the dichroic mirror 213d and the green and red laser light reflected by the dichroic mirrors 213e and 213f, respectively, are incident on the coupling optical system 213g. The coupling optical system 213g condenses the laser light of the three primary colors and causes it to enter the optical fiber 27. Note that white can be expressed by making the intensities of the blue, green, and red laser light equal.
 In order to project the laser light incident on the optical fiber 27 as an image, the horizontal scanning unit 23 and the vertical scanning unit 24 generate scanned image light by scanning the laser light in the horizontal direction and the vertical direction.
 The horizontal scanning unit 23 includes a resonant deflection element 23a, a horizontal scanning control circuit 23b, and a horizontal scanning angle detection circuit 23c. The laser light incident on the optical fiber 27 is collimated by the collimating optical system 22 and is incident on the resonant deflection element 23a. The resonant deflection element 23a has a reflecting surface 23d that is oscillated by the horizontal scanning control circuit 23b. The resonant deflection element 23a reflects the incident laser light with the oscillating reflecting surface 23d and scans it in the horizontal direction. The horizontal scanning control circuit 23b generates a drive signal for oscillating the reflecting surface 23d of the resonant deflection element 23a based on the horizontal drive signal 215 output from the signal processing circuit 211. The horizontal scanning angle detection circuit 23c detects the oscillation state, such as the oscillation range and oscillation frequency, of the reflecting surface 23d of the resonant deflection element 23a based on a displacement signal output from the resonant deflection element 23a, and outputs a signal indicating the oscillation state to the control unit 12.
 The vertical scanning unit 24 includes a deflection element 24a, a vertical scanning control circuit 24b, and a vertical scanning angle detection circuit 24c. The deflection element 24a has a reflecting surface 24d that is oscillated by the vertical scanning control circuit 24b. The deflection element 24a reflects the incident laser light with the oscillating reflecting surface 24d, scans it in the vertical direction, and emits it to the relay optical system 26 as two-dimensionally scanned image light. The vertical scanning control circuit 24b generates a drive signal for oscillating the reflecting surface 24d of the deflection element 24a based on the vertical drive signal 216 output from the signal processing circuit 211. The vertical scanning angle detection circuit 24c detects the oscillation state, such as the oscillation range and oscillation frequency, of the reflecting surface 24d of the deflection element 24a based on a displacement signal output from the deflection element 24a, and outputs a signal indicating the oscillation state to the control unit 12.
 The relay optical system 25 is disposed between the resonant deflection element 23a and the deflection element 24a. The relay optical system 25 converges the laser light scanned in the horizontal direction by the reflecting surface 23d of the resonant deflection element 23a and causes it to be incident on the reflecting surface 24d of the deflection element 24a.
 The signal processing circuit 211 outputs the horizontal drive signal 215 and the vertical drive signal 216 to the horizontal scanning control circuit 23b and the vertical scanning control circuit 24b, respectively, based on the "image signal" input from the control unit 12, thereby controlling the scanning angles of the reflecting surfaces 23d and 24d.
 The scanning angles of the reflecting surfaces 23d and 24d changed in this way are detected as detection signals by the horizontal scanning angle detection circuit 23c and the vertical scanning angle detection circuit 24c, and the detection signals are input to the control unit 12 and fed back to the horizontal drive signal 215 and the vertical drive signal 216.
 The relay optical system 26 has lens systems 26a and 26b with positive refractive power. The image light emitted from the deflection element 24a is converted by the lens system 26a into convergent image light whose center lines are made substantially parallel to one another. The convergent image light is then converted by the lens system 26b into substantially parallel scanned image light and condensed so that the center lines of the scanned image light converge on the pupil Ea of the user's eye E.
 In the present embodiment, the laser light incident from the optical fiber 27 is scanned in the horizontal direction by the horizontal scanning unit 23 and then scanned in the vertical direction by the vertical scanning unit 24; however, the arrangement of the horizontal scanning unit 23 and the vertical scanning unit 24 may be swapped so that the light is scanned in the vertical direction by the vertical scanning unit 24 and then scanned in the horizontal direction by the horizontal scanning unit 23.
 (Main process)
 The main process shown in FIG. 4 is performed by the control unit 12 executing a program stored in the ROM. The control unit 12 that has started this process first detects the remaining battery level of the battery 500 via the power controller 400 and acquires a battery remaining value (S100). The control unit 12 then determines whether the acquired battery remaining value is equal to or less than a specified value, which is a predetermined constant value (S102). If the battery remaining value is not equal to or less than the specified value (S102: No), the control unit 12 proceeds to S104; if the battery remaining value is equal to or less than the specified value (S102: Yes), the process proceeds to S108.
 In S104, the control unit 12 sets the laser output value (emission output value) of the scanned image light (specifically, laser light) emitted by the image presentation device 20 to a predetermined default value P (an emission output value that is a predetermined constant value, lower than the maximum laser output value). The control unit 12 then renders the content data 142 and generates scanned image data (S106). The scanned image data generated in S106 represents a scanned image whose brightness corresponds to the laser output default value P set in S104. As described above, the scanned image is formed by a plurality of pixels. Accordingly, in S106 the control unit 12 generates scanned image data in which each pixel is set to a brightness (gradation) obtained by multiplying the brightness value of each color by a value (coefficient) corresponding to the laser output default value P. For example, when the image is expressed in R (red), G (green), and B (blue), each pixel is set to RGB values corresponding to the default value P of the laser output. Here, the brightness of a pixel (its RGB values) and the laser output value are roughly correlated; for a dark image (one whose R, G, and B values are low; with 256 gradations, the maximum value of each color is 255), the laser output value is also low. In the HMD 100 of the present embodiment, a table is stored in, for example, the storage unit 14, in which the coefficient is 1 when the laser output is at its maximum value, 0 when it is at its minimum value (no output), and predetermined intermediate values are associated with the laser output values in between. The value (coefficient) corresponding to the default value P is selected from this table based on the default value P. In the following, RGB values are used as an example.
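 The relationship between a target laser output value, the coefficient table, and the per-pixel RGB scaling described above can be pictured with a short sketch. It is only an illustrative model, assuming a hypothetical piecewise table with five entries and 8-bit RGB content; the actual table contents, table resolution, and rendering pipeline are not specified here.

    # Minimal sketch: scale 8-bit RGB pixels by a coefficient looked up
    # from a laser-output-to-coefficient table (hypothetical values).
    from bisect import bisect_left

    # (laser_output_value, coefficient); 0 -> no output, maximum -> 1.0
    COEFF_TABLE = [(0, 0.0), (64, 0.25), (128, 0.5), (192, 0.75), (255, 1.0)]

    def coefficient_for(laser_output):
        """Pick the table entry for the given laser output value
        (here: nearest entry at or below the requested output)."""
        keys = [k for k, _ in COEFF_TABLE]
        i = max(0, bisect_left(keys, laser_output + 1) - 1)
        return COEFF_TABLE[i][1]

    def render_with_output(pixels, laser_output):
        """Return scanned image data whose brightness corresponds to
        the given laser output value (S106-style scaling)."""
        c = coefficient_for(laser_output)
        return [tuple(min(255, int(round(v * c))) for v in rgb) for rgb in pixels]

    # Example: a bright pixel rendered at an assumed default output P = 128
    print(render_with_output([(255, 200, 100)], 128))  # -> [(128, 100, 50)]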
 In S108, the control unit 12 sets the laser output value to a predetermined value and stores it in a predetermined area of the RAM, and then executes the power-saving image data generation process (S110). The laser output value set in S108 is used in the power-saving image data generation process of S110 (see FIG. 5); for example, the laser output value set in S108 is lower than the default value P. The power-saving image data generation process executed in S110 is described in detail later.
 In S112, the control unit 12 controls the image presentation device 20 to emit scanned image light representing the scanned image expressed by the scanned image data generated in S106, or scanned image light representing the scanned image expressed by the scanned image data generated in S110. Here, the laser output value of the emitted scanned image light is directly modulated by the scanned image data; for a dark image, for example a pixel with low RGB values, the laser output value of the scanned image light is therefore reduced. This configuration is preferable because the laser output value is varied under control when the scanned image light is emitted. Alternatively, when emitting the scanned image light, a configuration may be adopted in which the line drawing part or the non-line drawing part (see S200 of the power-saving image data generation process described next, FIG. 5) is identified (determined) and the laser output value is controlled and varied accordingly.
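 As a rough outline, the main process branches on the remaining battery level and then emits light based on whichever scanned image data was generated. The following sketch only restates that control flow; the function parameters, the threshold, and the numeric output values are placeholders chosen for the sketch, not values from the specification.

    # Sketch of the FIG. 4 main process (stubbed helpers, assumed values).
    DEFAULT_P = 128       # assumed default laser output value
    SPECIFIED_LEVEL = 20  # assumed battery threshold (percent)
    SAVING_OUTPUT = 64    # assumed lower output stored in S108 (below P)

    def main_process(content_data, remaining_battery,
                     render, generate_power_saving, emit):
        """remaining_battery: S100 result; render, generate_power_saving and
        emit stand in for S106, S110 and S112 of the embodiment."""
        if remaining_battery > SPECIFIED_LEVEL:            # S102: No
            scan_data = render(content_data, DEFAULT_P)    # S104 + S106
        else:                                              # S102: Yes
            scan_data = generate_power_saving(content_data, SAVING_OUTPUT)  # S108 + S110
        emit(scan_data)                                    # S112

    # Example wiring with trivial stand-ins:
    main_process([(255, 255, 255)], remaining_battery=10,
                 render=lambda data, p: data,
                 generate_power_saving=lambda data, out: data,
                 emit=print)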
 (Power-saving image data generation process)
 Next, the power-saving image data generation process executed in S110 of the main process described above will be described with reference to FIG. 5. The control unit 12 that has started this process first identifies, within the content image indicated by the content data 142, a line drawing part containing a line image (for example, characters or a line drawing (lines)) and a non-line drawing part containing a non-line image (for example, a landscape picture), and divides the content data 142 into the line drawing part and the non-line drawing part (S200).
 In S200, the control unit 12 may determine characters and line drawings (the line drawing part) from the file type of the content data 142; for example, if the content data 142 is a text file, the control unit 12 determines that the content data 142 is a line drawing part. Also, if the sets of RGB values of the pixels forming the content image are widely distributed over many different values, halftones are present and the region can be judged to be a non-line drawing part. In contrast, if the sets of RGB values fall into two groups with greatly different values and one of the groups does not run on for long stretches, the region can be judged to be characters or a line drawing (line drawing part). Even if the values fall into more than two groups, the region can be judged to be characters or line drawings of different colors as long as, apart from one group, no other group runs on for long stretches. If two or more groups run on for long stretches, the region can be judged to be a non-line image containing areas filled with those colors. Whether a run is long can be judged by whether it is longer or shorter than a normal line width (several pixels). The line drawing part identified by the control unit 12 may be either a region matching the outline of the line image or a region containing the line image (for example, a rectangular region); the same applies to the non-line drawing part.
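 One possible reading of this run-length heuristic is sketched below. It is an assumption-laden illustration: the way RGB values are grouped, the distinct-value cutoff, the LINE_WIDTH_PX threshold, and the helper name classify_region are chosen here for clarity and are not specified by the embodiment.

    # Sketch: classify a region as line drawing part vs. non-line drawing part
    # based on how its RGB values group and how long same-value runs are.
    LINE_WIDTH_PX = 4  # assumed "normal line width" threshold, in pixels

    def classify_region(rows):
        """rows: list of scanlines, each a list of (R, G, B) tuples."""
        values = {rgb for row in rows for rgb in row}
        if len(values) > 16:                      # many distinct values -> halftones
            return "non-line"
        # collect the value groups whose runs exceed the normal line width
        long_run_groups = set()
        for row in rows:
            run_val, run_len = None, 0
            for rgb in row + [None]:              # None acts as an end-of-row sentinel
                if rgb == run_val:
                    run_len += 1
                else:
                    if run_val is not None and run_len > LINE_WIDTH_PX:
                        long_run_groups.add(run_val)
                    run_val, run_len = rgb, 1
        # at most one group (e.g. the background) may run long for a line drawing
        return "line" if len(long_run_groups) <= 1 else "non-line"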
 Next, the control unit 12 determines whether there is a line drawing part, more specifically whether a line drawing part was separated in S200 (S202). If there is no line drawing part (S202: No), the control unit 12 proceeds to S212. If there is a line drawing part (S202: Yes), the control unit 12 determines whether the line drawing part is binary data (S204). If the line drawing part is binary data (S204: Yes), the control unit 12 proceeds to S208. If the line drawing part is not binary data (S204: No), in other words if it is color data, the control unit 12 executes a binarization process on the line drawing part (S206) and then proceeds to S208. Here, the binarization process converts the color data into gray data and then converts it into binary data using a predetermined threshold.
 In S208, the control unit 12 executes a black-and-white inversion process on the region containing the line drawing part for which S204 was affirmed or on which the binarization process of S206 was executed. Here, the black-and-white inversion process is, for example, a process that converts the color of the parts drawn with lines, such as characters, to white while converting the color of the other parts to black.
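 The binarization of S206 and the black-and-white inversion of S208 can be pictured together as below. The grayscale weights, the threshold of 128, and the convention that dark input pixels are treated as the line strokes are assumptions made for the sketch, not values taken from the embodiment.

    # Sketch: S206 (color -> gray -> binary) followed by S208 (invert so that
    # line strokes become white and the background becomes black).
    def binarize(pixels, threshold=128):
        out = []
        for r, g, b in pixels:
            gray = 0.299 * r + 0.587 * g + 0.114 * b   # assumed luminance weights
            out.append(1 if gray >= threshold else 0)   # 1 = white, 0 = black
        return out

    def invert_black_white(binary_pixels):
        # dark strokes (0) become white; the light background becomes black
        return [(255, 255, 255) if p == 0 else (0, 0, 0) for p in binary_pixels]

    line_part = [(0, 0, 0), (250, 250, 250), (10, 10, 10)]   # dark text on light bg
    print(invert_black_white(binarize(line_part)))
    # -> [(255, 255, 255), (0, 0, 0), (255, 255, 255)]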
 Next, the control unit 12 renders the line drawing part on which S208 was executed, generates line drawing part processed image data (S210), and proceeds to S212. In S210, the control unit 12 generates, for example, line drawing part processed image data with RGB values corresponding to a laser output value (first emission output value) equal to or greater than the above-described laser output default value P (see S104 in FIG. 4). The line drawing part processed image data is generated in the same manner as in S106 of FIG. 4 described above, based on the laser output value equal to or greater than the default value P. The value (coefficient) corresponding to that laser output value is selected from the above-described table based on the laser output value equal to or greater than the default value P.
 In S212, the control unit 12 determines whether a non-line drawing part was separated in S200. If there is no non-line drawing part (S212: No), the control unit 12 proceeds to S218. If there is a non-line drawing part (S212: Yes), the control unit 12 acquires (reads) the laser output value stored in the RAM in S108 of the main process shown in FIG. 4 (S214). The control unit 12 then renders the non-line drawing part and generates non-line drawing part processed image data (S216). In doing so, the control unit 12 generates non-line drawing part processed image data with RGB values corresponding to the laser output value (second emission output value) acquired in S214. The process of S216 is executed in the same manner as S106 of the main process of FIG. 4 described above, and the value (coefficient) corresponding to the laser output value acquired in S214 is selected from the above-described table based on that laser output value.
 After executing S216, the control unit 12 combines the line drawing part processed image data generated in S210 and the non-line drawing part processed image data generated in S216 to generate one set of scanned image data (S218). If the determination in S202 is negative (S202: No), the non-line drawing part processed image data generated in S216 becomes the scanned image data as it is. If the determination in S212 is negative (S212: No), the line drawing part processed image data generated in S210 becomes the scanned image data as it is.
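 Putting S200 through S218 together, the power-saving image data generation could be sketched as follows, reusing the binarize, invert_black_white, and render_with_output helpers from the earlier sketches. The function signature, the way the two parts are handed in, and the returned list of rendered pieces are placeholders for this sketch; the embodiment only requires that the line drawing part use a first emission output value at or above the default P and the non-line drawing part use a lower second emission output value, with the results composited into one set of scanned image data.

    # Sketch of FIG. 5: render the line drawing part at a first (higher) output
    # value and the non-line drawing part at a second (lower) output value,
    # then collect the pieces that S218 would composite.
    def generate_power_saving_data(line_part, non_line_part, saving_output,
                                   default_p=128):
        """line_part / non_line_part: pixel lists from S200, or None if absent."""
        pieces = []
        if line_part is not None:                                        # S202: Yes
            inverted = invert_black_white(binarize(line_part))           # S204-S208
            pieces.append(render_with_output(inverted, default_p))       # S210 (>= P)
        if non_line_part is not None:                                    # S212: Yes
            pieces.append(render_with_output(non_line_part, saving_output))  # S214-S216
        return pieces                                                    # S218 (composite)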
 In the above description, the case where the laser output value stored in the RAM in S108 of FIG. 4 and acquired in S214 of FIG. 5 is lower than the default value P was taken as an example. However, a different configuration may also be adopted. For example, the laser output value stored in S108 of FIG. 4 and acquired in S214 of FIG. 5 may be higher than the default value P, as long as it is set lower than the laser output value used in S210 of FIG. 5. Compared with the case where the entire scanned image is emitted at the laser output value used in S210 of FIG. 5, power saving can still be achieved.
 (Scanned image and laser output)
 A comparison will now be described between FIG. 6(a), which shows the content image indicated by the content data 142, in other words the image before conversion by the power-saving image data generation process shown in FIG. 5, and FIG. 6(b), which shows the scanned image represented by the scanned image data generated by the power-saving image data generation process shown in FIG. 5.
 As is clear from comparing FIGS. 6(a) and 6(b), in the region containing the characters "work place scenery", which form a line image, in other words the region excluding the landscape picture (non-line image), black and white have been inverted as a result of executing S208 of the power-saving image data generation process shown in FIG. 5. Specifically, the characters "work place scenery", which were black, are converted to white, while their background is converted from white to black. Regarding the laser output described later, no scanned image light is emitted when a pixel is black, so the laser output value for such a pixel is 0.
 In addition, the landscape picture shown in FIG. 6(b) has been converted into an image that is darker overall than the landscape picture shown in FIG. 6(a). This is because, in S216 of the power-saving image data generation process shown in FIG. 5, processing was executed according to the laser output value stored in the RAM in S108 of the main process shown in FIG. 4 and acquired in S214 of FIG. 5 (a laser output lower than the default value P; see S104 in FIG. 4).
 The change in the laser output value when scanned image light representing the scanned image expressed by the scanned image data generated by the main process shown in FIG. 4 is emitted will be described with reference to FIG. 7. In the graph of laser output values in the upper part of FIG. 7, the solid line (labeled "present method" in FIG. 7) shows the laser output value when scanned image light based on the scanned image data generated in S110 of FIG. 4 (see the power-saving image data generation process shown in FIG. 5 for details) is emitted. The broken line (labeled "comparison method" in FIG. 7) shows the laser output value when scanned image light based on the scanned image data generated in S106 of FIG. 4 is emitted. The laser output values shown in the upper part of FIG. 7 represent the change in laser output value while scanning along the line drawn in the scanned image shown in the lower part of FIG. 7 (a horizontal line crossing from left to right near the vertical center when FIG. 7 is viewed from the front).
 As is clear from the laser output values shown in FIG. 7, the comparison method has a substantially uniform laser output value over the entire scanning range, whereas the laser output value of the present method is lower than that of the comparison method in the range of the landscape picture (non-line image) and higher in the range of the characters "work place" (line image). That is, compared with the comparison method, a user of the HMD 100 presented with an image by the present method sees a darker landscape picture and brighter characters.
 With the configuration of the present method of this embodiment, when scanned image light for a line image such as characters (scanned image light for the pixels forming the line image) is emitted while scanned image light for a non-line image such as a landscape picture (scanned image light for the pixels forming the non-line image) is being emitted, the laser output value rises; conversely, when the emission changes from scanned image light for the line image to scanned image light for the non-line image, the laser output value falls. When a semiconductor laser is directly modulated, a larger laser output value requires a larger drive current and therefore consumes more power. In the upper part of FIG. 7, the area under the solid line of the present method (roughly proportional to power consumption) is smaller than the area under the broken line of the comparison method, which shows that power is saved. Furthermore, the peak output in the character portion is higher for the solid line, and the characters are brighter than with the comparison method, so the visibility of the characters, which are the more important information, is actually improved.
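 As a rough numerical illustration of this comparison, the consumed power can be approximated by summing the per-pixel laser output values along a scan line. The profile values below are invented solely to show that raising the output only over the narrow character region while lowering it over the wide landscape region reduces the total, even though the character peak is higher.

    # Sketch: integrate (sum) the laser output along one scan line for both
    # methods. Profiles are made-up: 100 pixels, characters occupy pixels 40-49.
    comparison = [128] * 100                       # uniform default output P
    present = [64] * 100                           # darker landscape output
    for i in range(40, 50):
        present[i] = 200                           # brighter character strokes

    print(sum(comparison))  # 12800  (proportional to comparison-method power)
    print(sum(present))     #  7760  (lower total despite the higher character peak)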
 (Advantageous effects based on the configuration of the present embodiment)
 The HMD 100 of the present embodiment adopts a configuration in which the laser output value stored in the RAM in S108 of the main process shown in FIG. 4 and acquired in S214 of the power-saving image data generation process shown in FIG. 5 is set lower than the laser output value used in S210 of FIG. 5. Therefore, compared with the case where the pixels forming the scanned image represented by the scanned image data have RGB values corresponding to the laser output value used in S210, power saving can be achieved without impairing visibility.
 Furthermore, for example, if the laser output value used in S210 of FIG. 5 is set to the default value P or less, the laser output value stored in the RAM in S108 of FIG. 4 and acquired in S214 is lower than the laser output value of S210, so greater power saving can be achieved than when the pixels forming the scanned image represented by the scanned image data have RGB values corresponding to the default value P.

Claims (3)

  1.  A head mounted display that allows a user to visually recognize a content image indicated by content data, comprising:
     image presentation means that emits light in units of the pixels forming the content image and emits image light representing a partial image included in the content image;
     determination means that determines whether the partial image is a line image represented by line drawing or a non-line image represented by non-line drawing; and
     image presentation control means that, when the determination means determines that the partial image is a line image, controls the image presentation means to emit the image light representing the line image at an output value corresponding to a first emission output value, and, when the determination means determines that the partial image is a non-line image, controls the image presentation means to emit the image light representing the non-line image at an output value corresponding to a second emission output value lower than the first emission output value.
  2.  The head mounted display according to claim 1, wherein, when emitting the image light representing the line image while the image light representing the non-line image is being emitted at the output value corresponding to the second emission output value, the image presentation control means sets the emission output value to the output value corresponding to the first emission output value, and, when emitting the image light representing the non-line image while the image light representing the line image is being emitted at the output value corresponding to the first emission output value, the image presentation control means sets the emission output value to the output value corresponding to the second emission output value.
  3.  The head mounted display according to claim 1 or 2, wherein the head mounted display can use a battery as a driving power source and comprises detection means for detecting a remaining battery level, and
     the image presentation control means controls emission of the image light at the output values corresponding to the first emission output value and the second emission output value, respectively, when the remaining battery value detected by the detection means is less than a predetermined remaining value, and controls emission of the image light representing the line image and emission of the image light representing the non-line image at an output value corresponding to an emission output value that is a predetermined constant value when the remaining battery value detected by the detection means is greater than the predetermined remaining value.
PCT/JP2010/000181 2009-02-20 2010-01-14 Head mounted display WO2010095347A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009037292A JP2010191303A (en) 2009-02-20 2009-02-20 Head mounted display
JP2009-037292 2009-02-20

Publications (1)

Publication Number Publication Date
WO2010095347A1 true WO2010095347A1 (en) 2010-08-26

Family

ID=42633640

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/000181 WO2010095347A1 (en) 2009-02-20 2010-01-14 Head mounted display

Country Status (2)

Country Link
JP (1) JP2010191303A (en)
WO (1) WO2010095347A1 (en)

Cited By (1)

Publication number Priority date Publication date Assignee Title
JP2017068269A (en) * 2016-10-28 2017-04-06 セイコーエプソン株式会社 Virtual image display device

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
JP5870778B2 (en) * 2012-03-12 2016-03-01 ブラザー工業株式会社 Head mounted display, display method, and display program
JP6083193B2 (en) * 2012-11-02 2017-02-22 ソニー株式会社 Image output device, operation method of image output device, electronic circuit, electronic device, and program
JP2021021879A (en) * 2019-07-30 2021-02-18 セイコーエプソン株式会社 Optical element and image display device

Citations (4)

Publication number Priority date Publication date Assignee Title
WO2003081567A1 (en) * 2002-03-27 2003-10-02 Sanyo Electric Co., Ltd. Display device, mobile terminal, and luminance control method in mobile terminal
JP2005345678A (en) * 2004-06-02 2005-12-15 Mitsubishi Electric Corp Portable display unit
JP2008096749A (en) * 2006-10-12 2008-04-24 Canon Inc Image display device, control method thereof, and computer program
JP2008104210A (en) * 1997-10-07 2008-05-01 Masanobu Kujirada Multi-channel display system connected with a plurality of interlocking display apparatuses

Also Published As

Publication number Publication date
JP2010191303A (en) 2010-09-02

Similar Documents

Publication Publication Date Title
JP5975285B2 (en) Laser scanning display device
US8164621B2 (en) Image display device
US9462244B2 (en) Image display apparatus and optical component
JP5195942B2 (en) Scanning image display device
US20110199582A1 (en) Light Source Unit, Optical Scanning Display, and Retinal Scanning Display
WO2010029788A1 (en) Head mount display
JP5316346B2 (en) Retina scanning image display device
EP3179717B1 (en) Projecting device
US20120188623A1 (en) Scanning image display device and method of controlling the same
JP2010139901A (en) Head mount display
JP4840175B2 (en) Image display device
JP2014086426A (en) Laser output control device and laser scanning type display device
WO2010095347A1 (en) Head mounted display
JP2009086366A (en) Optical scanning device, optical scanning type image display device, and retinal scanning type image display device
JP5076427B2 (en) Image display device and image size changing method
JP2018156062A (en) Display device, object device, and display method
JP2014130256A (en) Image display device, image display method, and program
JP2010161152A (en) Image display device
JP5055760B2 (en) projector
JP2012141362A (en) Image display device
JP2012078532A (en) Image display device
JP2010145669A (en) Head mounted display
JP2017079132A (en) Light output control unit and light projection device
WO2010103974A1 (en) Image display device
JP2011075952A (en) Image display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10743480

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10743480

Country of ref document: EP

Kind code of ref document: A1