WO2016121049A1 - Information display terminal and information display method - Google Patents

Information display terminal and information display method

Info

Publication number
WO2016121049A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
image
information display
user
display
Application number
PCT/JP2015/052486
Other languages
French (fr)
Japanese (ja)
Inventor
大内 敏
瀬尾 欣穂
川村 友人
俊輝 中村
佑哉 大木
将史 山本
Original Assignee
Hitachi Maxell, Ltd. (日立マクセル株式会社)
Application filed by Hitachi Maxell, Ltd. (日立マクセル株式会社)
Priority to PCT/JP2015/052486
Publication of WO2016121049A1

Classifications

    • G — PHYSICS
    • G09 — EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G — ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 — Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 — Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators, characterised by the way in which colour is displayed
    • G09G5/10 — Intensity circuits

Definitions

  • the present invention relates to an information display terminal and an information display method.
  • Wearable information display terminals such as glasses are known.
  • Various methods for controlling a wearable information display terminal have been proposed.
  • a glasses-type information display terminal displays superimposed display information (for example, exhibition contents at a public facility) related to an object imaged by an imaging unit (for example, a camera).
  • Japanese Patent Laid-Open No. 2011-28763 (Patent Document 1) describes a technique in which a wearable information display terminal combines and displays an icon corresponding to given data on image data of the current field of view captured by a CCD video camera or the like.
  • The display unit of an information display terminal that displays superimposed display information is composed of a translucent member. For this reason, with the conventional technology, amber light may pass through the display unit, for example on a clear evening, and the visibility of the information displayed on the display unit may be reduced.
  • An object of the present invention is to provide a technique that can improve the visibility of information displayed on a display unit.
  • An information display terminal according to one embodiment of the present invention is an information display terminal that can be worn on a user's head, and has a display unit that is arranged in front of the user's eyes while the terminal is worn on the user's head.
  • the display unit displays information by changing at least one of luminance and color according to the detected situation around the information display terminal.
  • An information display terminal according to another embodiment of the present invention is an information display terminal that can be worn on a user's head, and has a display unit arranged in front of the user's eyes while the terminal is worn on the user's head.
  • The terminal also has a front imaging unit that images an imaging region in the user's line-of-sight direction while the terminal is worn on the user's head, and a rear imaging unit that images an imaging region in the direction opposite to the line-of-sight direction.
  • the display unit displays a captured image captured by at least one of the front imaging unit and the rear imaging unit.
  • An information display terminal according to yet another embodiment of the present invention is an information display terminal that can be worn on a user's head, and has a display unit arranged in front of the user's eyes while the terminal is worn on the user's head. It also has an imaging unit that images an imaging region in the user's line-of-sight direction while the terminal is worn on the user's head, and an audio/video processing unit that extracts a core region image, included in both the captured image before shaking and the captured image after shaking, from those two captured images.
  • It further has a communication unit that transmits core region image data for displaying the core region image extracted by the audio/video processing unit to an external terminal.
  • An information display method according to one embodiment of the present invention displays information on a display unit that can be worn on the user's head and is arranged in front of the user's eyes while being worn, changing at least one of luminance and color according to the detected surroundings.
  • the visibility of information displayed on the display unit is improved.
  • FIG. 1 is a diagram showing an outline of a configuration example of an information display system including the information display terminal according to Embodiment 1.
  • FIG. 2 is a diagram showing an outline of a hardware configuration example of the information display terminal according to Embodiment 1.
  • FIG. 3(A) is a perspective view showing a state in which the information display terminal according to Embodiment 1 is worn, and FIG. 3(B) is a perspective view showing a state in which it has been taken off.
  • FIG. 4 is a diagram showing an outline of a configuration example of a luminance table stored in the memory of the information display terminal according to Embodiment 1.
  • FIG. 5 is a diagram showing an outline of a configuration example of a color parameter table stored in the memory of the information display terminal according to Embodiment 1.
  • FIG. 6 is a diagram showing an outline of the overall processing of the information display terminal according to Embodiment 1.
  • FIG. 7 is a diagram showing an outline of a hardware configuration example of the information display terminal according to Embodiment 2.
  • FIG. 8 is a perspective view showing a state in which the information display terminal according to Embodiment 2 is worn.
  • FIG. 9 is a diagram showing an outline of the overall processing of the information display terminal according to Embodiment 2.
  • FIG. 10 is a diagram showing an outline of the overall processing of the information display terminal according to Embodiment 3.
  • FIG. 11 is a diagram for explaining processing in which the information display terminal according to Embodiment 3 generates a corrected image.
  • FIG. 1 is a diagram showing an outline of a configuration example of an information display system having the information display terminal 10 according to the first embodiment.
  • The information display system includes the information display terminal 10, and a portable terminal 300, a facility terminal 400, and a server 500 that are each connected to the information display terminal 10 via a network 200.
  • Another information display terminal 140 is also connected to the information display terminal 10 via the network 200, for example over a mobile phone network.
  • the information display terminals 10 and 140, the portable terminal 300, the facility terminal 400, and the server 500 are implemented by predetermined hardware and software.
  • For example, each of the information display terminals 10 and 140, the portable terminal 300, the facility terminal 400, and the server 500 includes a processor and a memory, and a program executed by the processor causes the computer of each device to function as described below.
  • the information display terminal 10 is a wearable glasses-type terminal.
  • the information display terminal 10 is worn on the head of the user 1000.
  • the information display terminal 10 includes an imaging unit (for example, a camera) 109 that captures an object that exists in the line-of-sight direction of the user 1000.
  • the display unit 110 of the information display terminal 10 displays a background image.
  • The background image is, for example, a single-color image (for example, solid white) or an image captured by the imaging unit 109 (hereinafter sometimes referred to as a captured image).
  • the display unit 110 displays the superimposed display information related to the object imaged by the imaging unit 109 in a superimposed manner on the background image.
  • The superimposed display information is information useful to the user 1000, such as store sale information, discount coupons, exhibition contents at public facilities, admission coupons, and points that can be used as money at a store.
  • the information display terminal 10 acquires superimposed display information related to an object imaged from the mobile terminal 300, the facility terminal 400, the server 500, and the like via the network 200.
  • the storage 510 of the server 500 stores captured image data for displaying a captured image of an object captured by the imaging unit 109 and superimposed display information related to the object in association with each other. Then, the server 500 transmits the superimposed display information stored in the storage 510 to the information display terminal 10 via the network 200. The display unit 110 of the information display terminal 10 displays the superimposed display information transmitted from the server 500.
  • the facility terminal 400 is installed in a store or public facility.
  • the facility terminal 400 transmits the superimposed display information to the mobile terminal 300.
  • the facility terminal 400 accepts an operation by a person in charge such as a store or a public facility.
  • the superimposed display information is not limited to the information stored in the storage 510 of the server 500.
  • the superimposed display information may be stored by an unspecified number of devices connected to the network 200 and disclosed to an external device.
  • the information display terminal 10 is not necessarily compatible with all communication environments where the user 1000 is placed.
  • For example, when the information display terminal 10 includes a short-range communication unit but does not include a unit for communicating over a mobile phone network, it cannot acquire the superimposed display information from the server 500 via the network 200 on its own.
  • the user 1000 has a mobile terminal 300 (for example, a mobile phone or a tablet terminal).
  • In this case, the information display terminal 10 can connect to the mobile terminal 300 via short-range communication, and the mobile terminal 300 can connect to the network 200, acting as a router.
  • In some cases, the facility terminal 400 that transmits the superimposed display information is not connected to the network 200. In that case, the information display terminal 10 connects to the portable terminal 300 by short-range communication, and the portable terminal 300, acting as a wireless LAN unit, connects to the facility terminal 400 via the wireless LAN.
  • FIG. 2 is a diagram illustrating an outline of a hardware configuration example of the information display terminal 10 according to the first embodiment.
  • The information display terminal 10 has a head mounted display 100, a holder 130, and a cable 121 made of a flexible member, one end of which is connected to the head mounted display 100 and the other end of which is connected to the holder 130. That is, the holder 130 and the head mounted display 100 are connected via the cable 121.
  • The head mounted display 100 includes a main control unit 101, a communication unit 103, a voice input/output unit 104, a position detection unit 106, a power management unit 107, an audio/video processing unit 108, an imaging unit 109, a display unit 110, a sensor unit 111, a voice processing unit 114, and a time measuring unit 115.
  • The head mounted display 100 detects the situation around the information display terminal 10 from changes in the color detected by the color sensor 119 and the illuminance level detected by the illuminance sensor 120. The display unit 110 then displays a display image by changing at least one of luminance and color according to the detected surrounding situation.
  • the main control unit 101 expands the basic operation program stored in the memory 102 in the RAM. Then, the main control unit 101 executes the basic operation program expanded in the RAM. Accordingly, the main control unit 101 controls the head mounted display 100 and performs various determinations and arithmetic processing.
  • For the memory 102, a semiconductor memory such as a flash ROM, an SSD (Solid State Drive), or a memory built into a CPU (Central Processing Unit) is used.
  • A device such as an HDD (Hard Disk Drive) magnetic disk drive may also be used.
  • A RAM (Random Access Memory) included in the memory 102 serves as a work area for executing the basic operation program and other operation programs.
  • the RAM may be configured separately from the memory 102.
  • the RAM may be configured integrally with the main control unit 101.
  • Time measuring unit 115 measures the current time.
  • The time measuring unit 115 inputs the measured current time to the main control unit 101.
  • the sensor unit 111 includes an attitude sensor 105 (for example, a gyro sensor), a color sensor 119, and an illuminance sensor 120.
  • the attitude sensor 105 detects the tilt of the head mounted display 100.
  • the attitude sensor 105 inputs the detected tilt of the head mounted display 100 to the main control unit 101.
  • the color sensor 119 detects the color of the periphery (periphery where the user 1000 exists). The color sensor 119 inputs the detected color to the main control unit 101.
  • The illuminance sensor 120 detects the ambient illuminance level.
  • The illuminance sensor 120 inputs the detected illuminance level to the main control unit 101.
  • the illuminance sensor 120 may detect surrounding colors.
  • The imaging unit 109 (for example, a camera) has a size and weight (for example, 100 g or less) acceptable for a device included in the wearable head mounted display 100.
  • the imaging unit 109 is a small camera unit that is attached to the head mounted display 100.
  • In the imaging unit 109, an optical system is arranged so as to capture the line-of-sight direction of the user 1000 wearing the head mounted display 100. The imaging unit 109 thus images the imaging region in the user's line-of-sight direction while the head mounted display is worn on the user's head.
  • the captured image data of the captured image captured by the imaging unit 109 is input to the main control unit 101.
  • The captured image data of the captured image captured by the imaging unit 109 is stored in the memory 102 by the main control unit 101.
  • The imaging unit 109 includes an imaging element 113 that can receive, with high sensitivity, far infrared, near infrared, ultraviolet, X-rays, terahertz waves, muons, and the like in addition to normal visible light.
  • the display unit 110 has a display formed of a translucent member or a reflecting member such as a prism or a mirror.
  • The audio/video processing unit 108 outputs the captured image and the superimposed display information to the display of the display unit 110 based on the captured image data and superimposed display information input from the main control unit 101. The display unit 110 thereby displays the captured image and the superimposed display information on the display.
  • The main control unit 101 detects the situation around the information display terminal 10 based on the color and illuminance level detected by the sensor unit 111 (the color sensor 119 and the illuminance sensor 120).
  • the main control unit 101 detects the situation around the information display terminal based on the current time measured by the time measuring unit 115.
  • The main control unit 101 also detects the surrounding illuminance level and color by analyzing a captured image captured by the imaging unit 109 using a known technique, and detects the surrounding situation based on the detected illuminance level and color. Specifically, the main control unit 101 detects that the surrounding situation has changed when the illuminance level corresponding to the luminance of the display image currently displayed on the display unit 110 differs from the illuminance level input from the illuminance sensor 120. In addition, the main control unit 101 detects that the surrounding situation has changed when the illuminance level corresponding to the luminance of the currently displayed image differs from the illuminance level specified by analyzing, with a known technique, the captured image data of the captured image captured by the imaging unit 109.
  • the memory 102 stores image analysis data for matching illuminance levels and colors.
  • When detecting that the surrounding situation has changed based on the input illuminance level, the main control unit 101 acquires the luminance corresponding to the illuminance level from the luminance table (FIG. 4, described later) stored in the memory 102.
  • FIG. 4 is a diagram showing an outline of the luminance table stored in the memory 102 according to Embodiment 1 of the present invention.
  • the luminance table has data items such as [illuminance level], [luminance], and [time].
  • [Illuminance level] indicates the value of ambient illuminance.
  • [Luminance] indicates the luminance of the display image displayed on the display unit.
  • [Time] indicates a time measured by the time measuring unit 115 included in the information display terminal 10.
  • the main control unit 101 acquires the luminance corresponding to the illuminance level from the memory 102, and sets the luminance of the display image displayed by the display unit 110 based on the acquired luminance.
  • the main control unit 101 acquires the luminance corresponding to the time input from the time measuring unit 115 from the memory 102, and sets the luminance of the display image displayed on the display unit 110 based on the acquired luminance.
  • the audio / video processing unit 108 changes the luminance of the display image displayed on the display unit 110 to the luminance acquired by the main control unit 101.
  • For example, the audio/video processing unit 108 decreases the luminance of the display image displayed on the display unit 110 if the surroundings are dark, and increases it if the surroundings are bright. Specifically, the audio/video processing unit 108 changes the luminance of the image displayed on the display unit 110 by controlling the current value of the LED or laser light source of the optical module based on the luminance set by the main control unit 101.
  • the user 1000 can visually recognize the display image displayed on the display unit 110 with appropriate luminance even when the head mounted display 100 is used at night or in the evening.
  • For example, in a dark environment, the display unit 110 sets the luminance of the display image to 1 cd/m² or less.
  • When the surrounding illuminance level is about 100,000 lux in fine daytime weather, the iris closes and the pupil is small. The display image is therefore difficult to see unless its luminance is increased, so the display unit 110 sets the luminance of the display image to 5000 cd/m².
  • The main control unit 101 detects that the surrounding situation has changed when the color parameters corresponding to the color of the display image currently displayed on the display unit 110 differ from the color parameters corresponding to the color input from the color sensor 119. In addition, the main control unit 101 detects that the surrounding situation has changed when the color parameters of the currently displayed image differ from the color parameters specified by analyzing, with a known technique, the captured image data of the captured image captured by the imaging unit 109.
  • In that case, the main control unit 101 calculates the changed color parameters based on a known image processing algorithm that performs a reverse calculation on the input color parameters. The audio/video processing unit 108 then causes the display unit 110 to display a display image whose color has been changed based on the calculated color parameters.
  • the audio / video processing unit 108 changes the color of the display image displayed on the display unit 110 according to the surrounding colors by controlling the LEDs provided on the display unit 110.
  • For example, when amber light from the surroundings passes through the display unit 110, the main control unit 101 controls the LED provided in the display unit 110 to increase the green and blue values of the color parameters.
  • As a result, the amber light transmitted through the display unit 110 appears white, and visibility improves.
  • Likewise, when blue light from the surroundings passes through the display unit 110, the main control unit 101 controls the LED provided in the display unit 110 to increase the red and green values of the color parameters.
  • As a result, the blue light transmitted through the display unit 110 appears white, and visibility improves.
  • Thereby, the display unit 110 can display a display image with adjusted colors even in a special environment such as a room, a tunnel, or a photographic darkroom.
  • When detecting that the surrounding situation has changed based on the color input from the color sensor 119, the main control unit 101 acquires the color parameters corresponding to that color from the color parameter table (FIG. 5, described later) stored in the memory 102.
  • FIG. 5 is a diagram showing an outline of the color parameter table stored in the memory 102 according to Embodiment 1 of the present invention.
  • the color parameter table has data items such as [color], [color parameter], and [time].
  • [Color] indicates surrounding colors.
  • [Color parameter] indicates a parameter for setting the color of the display image displayed by the display unit.
  • [Color Parameter] indicates values of three primary colors of red, green, and blue.
  • [Time] indicates a time measured by the time measuring unit 115 included in the information display terminal 10.
  • The main control unit 101 acquires the color parameters corresponding to the color from the memory 102, and sets the color parameters of the display image displayed by the display unit 110 based on the acquired values. The main control unit 101 then changes the color of the display image by controlling the LED provided in the display unit 110.
  • The display unit 110 switches the target whose luminance or color is changed (only the background image, or both the background image and the superimposed display image) depending on whether the background image is a captured image captured by the imaging unit 109 or an image other than a captured image (for example, a solid white image).
  • When the background image is a captured image, the main control unit 101 causes the display unit 110 to change at least one of the luminance and color of only the background image.
  • When the background image is other than a captured image, the main control unit 101 causes the display unit 110 to change at least one of the luminance and color of both the background image and the superimposed display image.
  • the main control unit 101 acquires the color parameter corresponding to the time input from the time measuring unit 115 from the memory 102, and sets the color parameter of the display image displayed by the display unit 110 based on the acquired color parameter.
  • the display unit 110 may change and display at least one of luminance and color according to an operation from the user 1000.
  • The display unit 110 may also learn past surrounding situations and the luminance and color parameters used at those times, and thereafter automatically change at least one of luminance and color according to the surrounding situation.
  • The voice input/output unit 104 includes a microphone that picks up the voice of the user 1000, which is converted into voice data via the voice processing unit 114, and an earphone that outputs voice based on the voice data input from the audio/video processing unit 108.
  • The communication unit 103 exchanges display images and other data with the mobile terminal 300, the facility terminal 400, the server 500, and the like using various communication environments such as a mobile phone network, a wireless LAN, and short-range communication.
  • the communication unit 103 includes a wireless LAN unit, a mobile phone unit, a short-range communication unit, and the like.
  • As the short-range communication unit, WiFi (registered trademark), Bluetooth (registered trademark), and the like are applicable.
  • a position detector 106 which is a GPS (Global Positioning System) unit, receives radio waves from a plurality of positioning satellites orbiting around the earth, and detects the current position coordinates of the information display terminal 10 on the earth.
  • the power management unit 107 manages the battery that drives the head mounted display 100, monitors the state of the battery, and periodically detects the remaining amount.
  • The main control unit 101 performs pattern matching between the captured image data of the image captured by the imaging unit 109 and the image data stored in the memory 102. The main control unit 101 then determines whether image data that matches or approximates the captured image data is stored in the memory 102. In this way, the main control unit 101 detects that a specific product is included in the captured image.
  • When the captured image includes a product, the main control unit 101 requests from the server 500 a superimposed display image corresponding to the product, and causes the display unit 110 to display the superimposed display image acquired from the server 500 together with the captured image.
  • The power management unit 132 manages the battery that drives the holder 130 and monitors the state of the battery.
  • The power management unit 132 periodically detects the remaining amount.
  • the communication unit 131 transmits information read by the reader (not shown) 133 to the head mounted display 100.
  • The communication unit 131 supports standards such as WiFi (registered trademark), Bluetooth (registered trademark), and LTE (Long Term Evolution).
  • The imaging unit 109 may have a camera imaging element that functions as a night vision camera or a heat-sensing camera, such as a near-infrared camera. Even in this case, the imaging unit 109 can also capture a normal visible-light image.
  • the imaging unit 109 includes at least two imaging elements 113 and can capture a normal visible light captured image and a captured image by night vision imaging.
  • The audio/video processing unit 108 processes the image data input from the imaging unit 109 into a night vision image (an image in which the surroundings of the user 1000 can be confirmed even when they are dark). The audio/video processing unit 108 then causes the display unit 110 to display the processed night vision image. Specifically, the display unit 110 displays an image of a person included in an image captured by the night vision camera while changing at least one of luminance and color.
  • The audio/video processing unit 108 synthesizes a normal visible-light captured image and a captured image obtained by night vision imaging using a known image processing algorithm, and causes the display unit 110 to display the synthesized image. Accordingly, the display unit 110 can display a night vision image with improved visibility in a night-time or dark environment. Moreover, because the normal visible-light image and the night vision image are combined, the display unit 110 can more clearly display information that needs to be recognized, such as the colors of signs and signal lights.
  • The imaging unit 109 functioning as a night vision camera may be a dedicated night vision camera, and may have an imaging element 113 that senses light and waves of other wavelengths, for example, far infrared, ultraviolet, X-rays, terahertz waves, muons, yellow light, or 1500 nm infrared.
  • In this case, for example, a head mounted display that can identify an object approaching at an intersection can be realized.
  • a head-mounted display that can identify blood vessels, lesions, cell mutations, and the like can be realized in the medical field.
  • When the imaging unit 109 detects a specific wavelength, a head mounted display that can identify moisture, bases, and the like by inspection can be realized.
  • When the imaging unit 109 detects a specific wavelength, a head mounted display that can identify defects due to aging on the outer or inner wall of social infrastructure such as a tunnel or a pipe can be realized. Likewise, a head mounted display that can identify compounds, foreign substances, oxides, contaminants, organic substances, organic compounds, and the like, or one that can identify brain waves, electromagnetic waves, cerebral blood flow, and the like, can be realized. The imaging unit 109 may have these functions.
  • The imaging unit 109 may also be capable of imaging subjects from close range down to minute areas by focus control such as a zoom function, an enlargement function, and close-up imaging.
  • FIG. 3A is a perspective view showing a state in which the information display terminal 10 is mounted.
  • FIG. 3B is a perspective view showing a state where the information display terminal 10 is detached.
  • the head mounted display 100 of the information display terminal 10 can be mounted on the head of the user 1000.
  • The head mounted display 100 has a rim portion 123 that is positioned in front of the user 1000 while the head mounted display 100 is worn on the head of the user 1000, hooked over the ears of the user 1000.
  • the temple portion 112 has one end connected to the rim portion 123 and the other end connected to the cable 121.
  • The rim portion 123 is provided with the imaging unit 109 and the display unit 110. Note that the imaging unit 109 and the display unit 110 may instead be attached to the temple portion 112.
  • the imaging unit 109 images the imaging region 202 in the line-of-sight direction of the user 1000 in a state where the head mounted display 100 is mounted on the user 1000's head.
  • the display unit 110 is disposed in front of the user 1000 with the head mounted display 100 mounted on the head of the user 1000.
  • the display unit 110 is configured by a transparent body, a half mirror, or a total reflection mirror.
  • The cable 121 is formed of, for example, a flexible member. Specifically, the cable 121 is a flexible wire or a shape-memory tube. The shape of the cable 121 changes when an external force is applied, and the cable 121 maintains the changed shape.
  • When the user 1000 is not using the head mounted display 100, as shown in FIG. 3(B), the user 1000 carries the information display terminal 10 with the cable 121 wound around the neck.
  • A pair of hook-and-loop fasteners or magnets may be provided at the two ends of the cable 121. In this case, the cable 121 is worn around the neck with its two ends joined to each other.
  • the cable 121 includes an optical fiber, an electric wire, a hard cover, and the like.
  • the imaging unit 109 images the imaging area 202.
  • the display unit 110 is provided in the range of the virtual image display area 203 with respect to the imaging area 202.
  • the display unit 110 displays a captured image of the captured imaging region 202.
  • the display unit 110 may enlarge and display an image of an area in the viewpoint direction of the user in the imaging area 202.
  • the display unit 110 may be provided in a range of about 1/10 to 1/2 of the area of the imaging region.
  • the focal position of the virtual image may be varied, and this may be adjusted either automatically or manually.
  • For adjustment, a material having a refractive index different from that of air, such as glass or plastic, may be inserted into the optical axis to adjust the lens position or optical path length of the optical module. It is also possible to change the size and angle of view of the virtual image by changing the lens or by changing the focal length with a zoom function.
  • The head mounted display 100 may be a goggle type, as long as it includes the display unit 110 arranged in front of the user 1000 and the imaging unit 109 that images the user's line-of-sight direction while worn on the head of the user 1000.
  • As the method for displaying a display image on the display unit 110, a method using a half mirror, a see-through method that splits light in only one direction using a mirror or a prism, or a projection method that draws a virtual image directly on the retina of the user 1000 may be applied.
  • the holder 130 includes a communication unit 131, a power management unit 132, and a reader 133.
  • the holder 130 detachably holds a terminal (for example, a portable terminal) held by the user.
  • a key mobile port may be provided in the holder 130.
  • the key mobile stores an authentication key, an access code, and a security code.
  • A predetermined key mobile is inserted into the holder 130 to authenticate the user. When the user is authenticated, the holder 130 and the head mounted display 100 can be used.
  • the holder 130 may be provided with fingerprint authentication or vein authentication. In this case, when the holder 130 has succeeded in fingerprint authentication or vein authentication, the holder 130 and the head mounted display 100 can be used.
  • The holder 130 itself may function as a key mobile, for example as an SPC (Security Personal Computer).
  • the holder 130 may have a reader.
  • the reader is a barcode reader, an RFID reader, an imaging device that reads a QR code, or the like.
  • the reader reads information from an IC chip mounted on an ID card, for example, in a non-contact manner or in a contact manner using a communication coil, a transmission coil, an induction current coil, or the like.
  • the holder 130 can perform individual authentication, ID authentication, settlement, and the like.
  • a battery that supplies power to the head mounted display 100 via the cable 121 may be detachably attached to the other end of the cable 121.
  • an input device such as a keyboard, a mouse, or a touch pad may be detachably attached to the other end of the cable 121.
  • the input device transmits the input instruction received from the user 1000 to the head mounted display 100 via the cable 121.
  • FIG. 6 is a diagram showing an overview of the overall processing of the information display terminal 10 according to the first embodiment.
  • the overall processing according to Embodiment 1 starts when the display unit 110 starts displaying a display image, for example.
  • The description below assumes that the imaging unit 109 images the imaging region 202 and that the display unit 110 displays the captured image captured by the imaging unit 109.
  • In S601, the display unit 110 displays a display image based on the set color parameters and luminance.
  • In S602, the color sensor 119 inputs the detected color to the main control unit 101, and the illuminance sensor 120 inputs the detected illuminance level to the main control unit 101.
  • In S603, the main control unit 101 determines whether the surrounding situation has changed based on the illuminance level and the color input in S602. When the main control unit 101 determines that the surrounding situation has not changed (S603-No), the process returns to S601. When the main control unit 101 determines that the surrounding situation has changed (S603-Yes), the process proceeds to S604.
  • In S604, the main control unit 101 acquires the luminance corresponding to the illuminance level input from the illuminance sensor 120 from the memory 102, and sets the acquired luminance as the luminance of the display image displayed on the display unit 110.
  • the main control unit 101 acquires color parameters corresponding to the colors input from the color sensor 119 from the memory 102, and sets the color parameters of the display image displayed on the display unit 110 based on the acquired color parameters.
  • the display unit 110 displays information by changing at least one of luminance and color in accordance with the detected situation around the information display terminal 10.
  • the visibility of information displayed on the display unit 110 can be improved.
  • For example, when the surroundings of the user 1000 become dark, the display unit 110 reduces the luminance of the display image. The display unit 110 can therefore display an image matched to the surrounding darkness, and the user is prevented from finding the displayed image dazzling. Further, in the evening, at dawn, or against a blue background, the display unit 110 changes colors by inverting the color tone of the displayed image. The visibility of the image displayed on the display unit 110 is thereby improved.
  • Further, because the display unit 110 can display information by changing at least one of luminance and color based on the color parameters corresponding to the current time measured by the time measuring unit 115, the visibility of the display image can be improved according to the surrounding situation even without an illuminance sensor or the like.
  • Since the display unit 110 displays the image of a person included in a captured image captured by the night vision camera while changing at least one of luminance and color, the visibility of that person's image can be improved even when the surroundings are dark.
  • When the background image is a captured image captured by the imaging unit 109, the display unit 110 displays the background image while changing at least one of its luminance and color. The visibility of the display image displayed on the display unit 110 can thereby be improved.
  • When the background image is other than an image captured by the imaging unit 109, the background image and/or the superimposed display information are displayed with at least one of luminance and color changed. The visibility of the display image displayed on the display unit 110 can thereby be improved in this case as well. (Embodiment 2)
  • The second embodiment differs from the first embodiment in that the terminal has a front imaging unit that images an imaging region in the user's line-of-sight direction while worn on the head, and a rear imaging unit that images an imaging region in the direction opposite to the line-of-sight direction.
  • The differences between the second embodiment and the first embodiment are described below, mainly with reference to FIGS. 7 to 9. <Configuration of information display terminal>
  • FIG. 7 is a diagram illustrating an outline of a hardware configuration example of the information display terminal 10 according to the second embodiment.
  • FIG. 8 is a perspective view showing a state where the information display terminal 10 according to Embodiment 2 is mounted.
  • the information display terminal 10 includes a front imaging unit 710 and a rear imaging unit 720.
  • the front imaging unit 710 images the imaging region 202 in the user's line-of-sight direction while being attached to the head.
  • the image of the imaging region 202 in the user's line-of-sight direction captured by the front imaging unit 710 may be referred to as a front captured image.
  • The front imaging unit 710 inputs front captured image data for displaying the captured front image to the main control unit 101. The front captured image data input by the front imaging unit 710 is then stored in the memory 102 by the main control unit 101.
  • the rear imaging unit 720 images the imaging area 204 in the direction opposite to the viewing direction.
  • the image of the imaging region 204 in the direction opposite to the line-of-sight direction captured by the rear imaging unit 720 may be referred to as a rear captured image.
  • the rear imaging unit 720 inputs rear captured image data for displaying the captured rear captured image to the main control unit 101. Then, the rear captured image data input by the rear imaging unit 720 is stored in the memory 102 by the main control unit 101.
  • the main control unit 101 causes the display unit 110 to display a front captured image based on the input front captured image data. Further, the main control unit 101 causes the display unit 110 to display a rear captured image based on the input rear captured image data.
  • the display unit 110 displays at least one of a front captured image based on the front captured image data input to the main control unit 101 and a rear captured image based on the rear captured image data.
  • For example, each time the input device receives an input, the display unit 110 switches the displayed image from the front captured image to the rear captured image, or from the rear captured image to the front captured image.
  • the display unit 110 may divide the display area into two, display the front captured image in one display area, and display the rear captured image in the other display area.
  • Alternatively, an imaging unit may be attached so that it can image both the imaging region 202 in the line-of-sight direction and the imaging region 204 in the direction opposite to the line-of-sight direction.
  • the front imaging unit 710 and the rear imaging unit 720 may have a camera imaging element that functions as a night vision camera or a heat sensing camera, such as a near infrared camera.
  • the front imaging unit 710 and the rear imaging unit 720 may be dedicated night vision cameras.
  • The front imaging unit 710 and the rear imaging unit 720 may also have an imaging element 113 that detects light and waves of other wavelengths, for example, far infrared, ultraviolet, X-rays, terahertz waves, muons, yellow light, or 1500 nm infrared.
  • The information display terminal 10 may further include an imaging unit that images the right direction of the user and an imaging unit that images the left direction of the user.
  • The audio/video processing unit 108 may generate an omnidirectional image, in which all directions can be confirmed, by synthesizing the images captured by the front imaging unit 710, the rear imaging unit 720, the imaging unit that images the user's right direction, and the imaging unit that images the user's left direction.
  • the display unit 110 displays an omnidirectional image.
  • Thereby, the user 1000 can check the image behind even while riding a bicycle, a motorcycle, a car, or the like.
  • The main control unit 101 may determine whether danger is imminent for the user 1000 wearing the information display terminal 10 by analyzing, with a known technique, the front captured image data and the rear captured image data input from the front imaging unit 710 and the rear imaging unit 720. For example, when detecting that an object such as a motorcycle, a car, or a bicycle is approaching the user 1000 wearing the information display terminal 10, the main control unit 101 determines that danger is imminent for the user 1000.
  • the information display terminal 10 may have sonar. In this case, the information display terminal 10 emits a sound wave and detects that the object is approaching based on the sound wave reflected from the object. The information display terminal 10 may detect the approach of an object using infrared rays. Further, the information display terminal 10 may include a vibration sensor that detects air vibration. In this case, the information display terminal 10 detects that an object is approaching based on the vibration of air detected by the vibration sensor.
  • Having determined that danger is imminent for the user 1000, the main control unit 101 notifies the user 1000 by controlling the display unit 110.
  • For example, the main control unit 101 causes the display unit 110 to display a solid red background image.
  • The main control unit 101 may also cause the display unit 110 to display characters or marks indicating that danger is imminent as the superimposed display information, and may cause the voice input/output unit 104 to output a voice indicating that danger is imminent.
  • the rear imaging unit 720 may be attached to a helmet that is worn when the user 1000 rides on a motorcycle. In this case, the rear imaging unit 720 attached to the helmet transmits rear captured image data to the information display terminal 10 via wireless communication.
  • the rear imaging unit 720 attached to the helmet and the head mounted display 100 of the information display terminal 10 may be connected to each other via a cable. In this case, the rear imaging unit 720 transmits rear captured image data to the information display terminal 10 via a cable.
  • FIG. 9 is a diagram showing an overview of the overall processing of the information display terminal 10 according to the second embodiment.
  • the overall processing according to Embodiment 2 starts when the display unit 110 starts displaying a display image, for example.
  • the following description is based on the assumption that the front imaging unit 710 images the imaging area 202 and the rear imaging unit 720 images the imaging area 204.
  • In S901, the front imaging unit 710 inputs the front captured image data of the captured front image to the main control unit 101, and the rear imaging unit 720 inputs the rear captured image data of the captured rear image to the main control unit 101.
  • In S902, the main control unit 101 analyzes at least one of the front captured image data and the rear captured image data input in S901 to determine whether danger is imminent for the user 1000.
  • When the main control unit 101 determines that the user 1000 is not in danger (S902-No), the process proceeds to S904. When the main control unit 101 determines that danger is imminent for the user 1000 (S902-Yes), the process proceeds to S903.
  • In S903, the main control unit 101 controls the display unit 110 to notify the user 1000 that danger is imminent. The display unit 110 also displays at least one of a front captured image based on the front captured image data input in S901 and a rear captured image based on the rear captured image data. After S903, the process returns to S901.
  • In S904, the display unit 110 displays at least one of a front captured image based on the front captured image data input in S901 and a rear captured image based on the rear captured image data. After S904, the process returns to S901.
  • As described above, because the display unit 110 displays a captured image captured by at least one of the front imaging unit and the rear imaging unit, the user can check the situation in front of or behind them and avoid danger. (Embodiment 3)
  • When the user 1000 wearing the information display terminal 10 moves, the head of the user 1000 may shake.
  • In that case, the imaging unit 109 included in the information display terminal 10 also shakes.
  • When the captured image captured by the imaging unit 109 is distributed via wireless communication to the information display terminal 140, the portable terminal 300, the facility terminal 400, and the like (hereinafter, external terminals), the captured image displayed on the external terminal may shake, and the person viewing it may feel motion sickness.
  • The purpose of the third embodiment is to provide a technique that enables an image that can be viewed comfortably to be displayed on the information display terminal 140, the portable terminal 300, the facility terminal 400, and the like even when the imaging unit 109 of the information display terminal 10 shakes together with the head of the user 1000.
  • FIG. 10 is a diagram showing an overview of the overall processing of the information display terminal 10 according to the third embodiment.
  • the overall processing according to Embodiment 3 starts when the imaging unit 109 starts imaging the imaging region 202, for example.
  • the following description is based on the assumption that the inclination of the head mounted display 100 detected by the attitude sensor 105 is periodically input to the main control unit 101. Further, the description will be made on the assumption that captured image data of a captured image captured by the imaging unit 109 is periodically input to the main control unit 101.
  • In S1001, the main control unit 101 determines, by a known technique, whether the head mounted display 100 is shaking, based on the tilt of the head mounted display 100 input from the attitude sensor 105.
  • When the main control unit 101 determines that the head mounted display 100 is not shaking (S1001-No), the process returns to S1001. When the main control unit 101 determines that the head mounted display 100 is shaking (S1001-Yes), the process proceeds to S1002.
  • In S1002, the main control unit 101 determines the degree of shaking based on the captured image data of the captured image before shaking and the captured image data of the captured image input from the imaging unit 109 after shaking. Specifically, the pixel movement vector and movement speed of the captured image data after shaking, relative to the captured image data before shaking, are analyzed by a known technique.
  • In S1003, the main control unit 101 causes the audio/video processing unit 108 to extract a core region image from the captured image before shaking and from the captured image after shaking. For example, the main control unit 101 extracts each core region image based on the degree of shaking analyzed in S1002. Specifically, as shown in FIG. 11, the main control unit 101 causes the audio/video processing unit 108 to extract the core region images 602 and 604, which remain within the frame even while the head mounted display 100 is shaking. That is, the audio/video processing unit 108 extracts the core region image, which is included in both the captured image before shaking and the captured image after shaking, from each of the two captured images.
  • That is, the main control unit 101 causes the audio/video processing unit 108 to extract the core region image 602 from the captured image 601 before shaking, and the core region image 604 from the captured image 603 after shaking.
  • In S1004, the main control unit 101 causes the audio/video processing unit 108 to delete the images around the core region images 602 and 604 extracted in S1003, thereby generating corrected image data for displaying corrected images 605 and 606.
  • the main control unit 101 inputs the generated corrected image data to the communication unit 103.
  • In S1005, the communication unit 103 transmits the corrected image data generated in S1004 to the external terminal. Note that the communication unit 103 transmits each piece of corrected image data only while it detects that the communication channel to the external terminal is open; once the communication with the external terminal has ended, the communication unit 103 naturally does not transmit the corrected image data.
  • In S1006, the external terminal receives the corrected image data transmitted in S1005 and displays the corrected images 605 and 606 based on the received data. After S1006, the process returns to S1001.
  • As described above, by transmitting to the external terminal the core region image data for displaying the core region images 602 and 604 extracted by the audio/video processing unit 108, corrected image data that can be viewed comfortably can be displayed on the external terminal even when the imaging unit 109 shakes.
  • In this case, the external terminal cannot display the entire captured image, but it can display a corrected image that can be viewed comfortably. The external terminal can therefore share the travel and experiences of the user 1000 wearing the information display terminal 10, sports watching, book browsing, information searches, conferences, and the like, without the viewer feeling motion sickness.
  • DESCRIPTION OF SYMBOLS: 10, 140 ... information display terminal; 100 ... head mounted display; 101 ... main control unit; 102 ... memory; 103 ... communication unit; 104 ... voice input/output unit; 105 ... attitude sensor; 106 ... position detection unit; 107 ... power management unit; 108 ... audio/video processing unit; 109 ... imaging unit; 110 ... display unit; 111 ... sensor unit; 112 ... temple portion; 113 ... imaging element; 114 ... voice processing unit; 115 ... time measuring unit; 119 ... color sensor; 120 ... illuminance sensor; 121 ... cable; 130 ... holder; 131 ... communication unit; 132 ... power management unit; 133 ... reader; 200 ... network; 300 ... portable terminal; 400 ... facility terminal; 500 ... server; 710 ... front imaging unit; 720 ... rear imaging unit; 1000 ... user.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Provided is an information display terminal that can be mounted to a user's head, wherein the information display terminal has a display unit that is disposed in front of the user's eyes when the terminal is mounted to the user's head, and the display unit displays information by changing at least either the luminance or the hue of the information, in accordance with the detected circumstances of the environment surrounding the information display terminal, to thereby enhance the visibility of the information displayed by the display unit.

Description

Information display terminal and information display method
 The present invention relates to an information display terminal and an information display method.
 Wearable information display terminals, such as glasses-type terminals, are known. Various methods for controlling such wearable information display terminals have been proposed.
 It is known that a glasses-type information display terminal displays superimposed display information (for example, exhibition contents at a public facility) related to an object imaged by an imaging unit (for example, a camera).
 Japanese Patent Laid-Open No. 2011-28763 (Patent Document 1) describes a technique in which a wearable information display terminal combines and displays an icon corresponding to given data on image data of the current field of view captured by a CCD video camera or the like.
 Patent Document 1: JP 2011-28763 A
 重畳表示情報を表示する情報表示端末の表示部は、半透明部材で構成されている。そのため、従来の技術では、例えば夕方の晴天時に茜色の光が表示部を透過し、表示部に表示される情報の視認性が低下することがあった。 The display unit of the information display terminal that displays the superimposed display information is composed of a translucent member. For this reason, in the conventional technology, for example, in the evening in fine weather, amber light is transmitted through the display unit, and the visibility of information displayed on the display unit may be reduced.
An object of the present invention is to provide a technique capable of improving the visibility of information displayed on a display unit.
Among the inventions disclosed in the present application, the outline of a representative one is briefly described as follows.
An information display terminal according to an embodiment of the present invention is an information display terminal that can be worn on a user's head and has a display unit disposed in front of the user's eyes when the terminal is worn on the user's head. The display unit displays information while changing at least one of luminance and color in accordance with the detected situation around the information display terminal.
An information display terminal according to another embodiment of the present invention is an information display terminal that can be worn on a user's head and has a display unit disposed in front of the user's eyes when the terminal is worn on the user's head. The terminal further has a front imaging unit that, while the terminal is worn on the user's head, captures an imaging region in the user's line-of-sight direction, and a rear imaging unit that captures an imaging region in the direction opposite to the line-of-sight direction. The display unit displays a captured image captured by at least one of the front imaging unit and the rear imaging unit.
An information display terminal according to another embodiment of the present invention is an information display terminal that can be worn on a user's head and has a display unit disposed in front of the user's eyes when the terminal is worn on the user's head. The terminal further has an imaging unit that, while the terminal is worn on the user's head, captures an imaging region in the user's line-of-sight direction; an image/audio processing unit that extracts, from the images captured by the imaging unit before and after shaking, a core region image that is commonly included in both images; and a communication unit that transmits core region image data for displaying the extracted core region image to an external terminal.
An information display method according to an embodiment of the present invention is an information display method for an information display terminal that can be worn on a user's head and has a display unit disposed in front of the user's eyes when worn, in which the display unit displays information while changing at least one of luminance and color in accordance with the detected situation around the information display terminal.
Among the inventions disclosed in the present application, the effects obtained by representative ones are briefly described as follows.
According to an embodiment of the present invention, the visibility of information displayed on the display unit is improved.
FIG. 1 is a diagram showing an outline of a configuration example of an information display system having the information display terminal according to Embodiment 1.
FIG. 2 is a diagram showing an outline of a hardware configuration example of the information display terminal according to Embodiment 1.
FIG. 3(a) is a perspective view showing a state in which the information display terminal according to Embodiment 1 is worn; FIG. 3(b) is a perspective view showing a state in which it is removed.
FIG. 4 is a diagram showing an outline of a configuration example of the luminance table stored in the memory of the information display terminal according to Embodiment 1.
FIG. 5 is a diagram showing an outline of a configuration example of the color parameter table stored in the memory of the information display terminal according to Embodiment 1.
FIG. 6 is a diagram showing an outline of the overall processing of the information display terminal according to Embodiment 1.
FIG. 7 is a diagram showing an outline of a hardware configuration example of the information display terminal according to Embodiment 2.
FIG. 8 is a perspective view showing a state in which the information display terminal according to Embodiment 2 is worn.
FIG. 9 is a diagram showing an outline of the overall processing of the information display terminal according to Embodiment 2.
FIG. 10 is a diagram showing an outline of the overall processing of the information display terminal according to Embodiment 3.
FIG. 11 is a diagram for explaining processing in which the information display terminal according to Embodiment 3 generates a corrected image.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. Note that, throughout the drawings for describing the embodiments, the same parts are in principle denoted by the same reference symbols, and repeated description thereof is omitted.
(Embodiment 1)
<System configuration>
FIG. 1 is a diagram showing an outline of a configuration example of an information display system having the information display terminal 10 according to Embodiment 1. As shown in FIG. 1, the information display system has the information display terminal 10; a mobile terminal 300 connected to the information display terminal 10 via a network 200; a facility terminal (for example, a personal computer) 400 connected to the information display terminal 10 via the network 200; a server 500 connected to the information display terminal 10 via the network 200; another information display terminal 140 connected to the information display terminal 10 via the network 200; and a mobile phone network 11 connected to the information display terminal 10 via the network 200.
The information display terminals 10 and 140, the mobile terminal 300, the facility terminal 400, and the server 500 are implemented by predetermined hardware and software. For example, each of them includes a processor and a memory, and the execution of a program on the memory by the processor causes the computers of the information display terminals 10 and 140, the mobile terminal 300, the facility terminal 400, and the server 500 to function as an information display system.
The information display terminal 10 is a wearable glasses-type terminal. The information display terminal 10 is worn on the head of a user 1000. The information display terminal 10 also has an imaging unit (for example, a camera) 109 that images an object existing in the line-of-sight direction of the user 1000.
The display unit 110 of the information display terminal 10 displays a background image. The background image is, for example, a single-color image (for example, a white image) or an image captured by the imaging unit 109 (hereinafter sometimes referred to as a captured image).
The display unit 110 also displays superimposed display information related to the object imaged by the imaging unit 109, superimposed on the background image. The superimposed display information is information useful to the user 1000, such as sale information at a store, discount coupons, exhibition contents at a public facility, admission coupons, and point values that can be used as money at a store. The information display terminal 10 acquires the superimposed display information related to the imaged object from the mobile terminal 300, the facility terminal 400, the server 500, or the like via the network 200.
The storage 510 of the server 500 stores captured image data for displaying a captured image of an object imaged by the imaging unit 109 and superimposed display information related to the object in association with each other. The server 500 transmits the superimposed display information stored in the storage 510 to the information display terminal 10 via the network 200. The display unit 110 of the information display terminal 10 displays the superimposed display information transmitted from the server 500.
The facility terminal 400 is installed in a store, a public facility, or the like. The facility terminal 400 transmits the superimposed display information to the mobile terminal 300. The facility terminal 400 also accepts operations by a person in charge at the store, public facility, or the like. Note that the superimposed display information is not limited to information stored in the storage 510 of the server 500. For example, the superimposed display information may be stored by an unspecified number of devices connected to the network 200 and made available to external devices.
Here, the information display terminal 10 cannot necessarily handle every communication environment in which the user 1000 is placed. For example, if the information display terminal 10 has a short-range communication unit but does not have a unit enabling communication over the mobile phone network, it cannot acquire the superimposed display information from the server 500 via the network 200. On the other hand, it is assumed that the user 1000 carries a mobile terminal 300 (for example, a mobile phone or a tablet terminal). In this case, the information display terminal 10 can connect to the mobile terminal 300 via short-range communication and connect to the network 200 using the mobile terminal 300 as a router.
It is also assumed that the facility terminal 400 that transmits the superimposed display information is not connected to the network 200. In this case, the information display terminal 10 connects to the mobile terminal 300 by short-range communication and connects to the facility terminal 400 via a wireless LAN, using the mobile terminal 300 as a wireless LAN unit.
<Configuration of information display terminal>
Hereinafter, the configuration of the information display terminal 10 will be described with reference to FIG. 2 and FIGS. 3(a) and 3(b). FIG. 2 is a diagram showing an outline of a hardware configuration example of the information display terminal 10 according to Embodiment 1. As shown in FIG. 2 and FIGS. 3(a) and 3(b), the information display terminal 10 has a head mounted display 100, a holder 130, and a cable 121 made of a flexible member, one end of which is connected to the head mounted display 100 and the other end of which is connected to the holder 130. That is, the holder 130 and the head mounted display 100 are connected via the cable 121.
As shown in FIG. 2, the head mounted display 100 has a main control unit 101, a communication unit 103, a voice input/output unit 104, a position detection unit 106, a power management unit 107, an image/audio processing unit 108, an imaging unit 109, a display unit 110, a sensor unit 111, an audio processing unit 114, and a timing unit 115.
The head mounted display 100 detects changes in the situation around the information display terminal 10 from the color detected by a color sensor 119 and the illuminance level detected by an illuminance sensor 120. The display unit 110 then displays the display image while changing at least one of luminance and color in accordance with the identified surrounding situation.
The main control unit 101 loads the basic operation program stored in a memory 102 into a RAM and executes it. The main control unit 101 thereby supervises the head mounted display 100 and performs various determinations and arithmetic processing.
As the memory 102 of the main control unit 101, a semiconductor memory such as a flash ROM, an SSD (Solid State Drive), or a memory built into a CPU (Central Processing Unit) is used. A device such as a magnetic disk drive, for example an HDD (Hard Disk Drive), may also be used.
The RAM (Random Access Memory) of the memory 102 serves as a work area when the basic operation program and other operation programs are executed. Note that the RAM may be configured separately from the memory 102, or may be configured integrally with the main control unit 101.
The timing unit 115 measures the current time and inputs the measured time to the main control unit 101.
The sensor unit 111 includes an attitude sensor 105 (for example, a gyro sensor), the color sensor 119, and the illuminance sensor 120.
The attitude sensor 105 detects the tilt of the head mounted display 100 and inputs the detected tilt to the main control unit 101.
The color sensor 119 detects the color of the surroundings (the surroundings in which the user 1000 is present) and inputs the detected color to the main control unit 101.
The illuminance sensor 120 detects the surrounding illuminance level and inputs the detected level to the main control unit 101. Note that the illuminance sensor 120 may also detect surrounding colors.
The imaging unit 109 (for example, a camera) has a size and weight (for example, 100 g or less) acceptable for a device included in the wearable head mounted display 100. The imaging unit 109 is a small camera unit attached to the head mounted display 100.
The optical system of the imaging unit 109 is arranged so as to capture the line-of-sight direction of the user 1000 wearing the head mounted display 100. While worn on the user's head, the imaging unit 109 captures an imaging region in the user's line-of-sight direction.
The captured image data of the image captured by the imaging unit 109 is input to the main control unit 101 and stored in the memory 102 of the main control unit 101. Note that the imaging unit 109 has an imaging element 113 that can receive, in addition to normal visible light, highly sensitive far-infrared, near-infrared, ultraviolet, X-ray, terahertz waves, muon waves, and the like.
The display unit 110 has a display formed of a translucent member or a reflecting member such as a prism or a mirror.
The image/audio processing unit 108 outputs the captured image and the superimposed display information to the display of the display unit 110 based on the captured image data and the superimposed display information input from the main control unit 101. The display unit 110 thereby displays the captured image and the superimposed display information on the display.
The main control unit 101 detects the situation around the information display terminal 10 from the color and illuminance level detected by the sensor unit 111 (the color sensor 119 and the illuminance sensor 120).
The main control unit 101 also detects the situation around the information display terminal from the current time measured by the timing unit 115.
The main control unit 101 also detects the surrounding illuminance level and color by analyzing the image captured by the imaging unit 109 with a known technique, and detects the surrounding situation based on the detected illuminance level and color. Specifically, the main control unit 101 detects that the surrounding situation has changed when the illuminance level corresponding to the luminance of the display image currently displayed on the display unit 110 differs from the illuminance level input from the illuminance sensor 120. The main control unit 101 also detects that the surrounding situation has changed when the illuminance level corresponding to the luminance of the display image currently displayed on the display unit 110 differs from the illuminance level identified by analyzing, with a known technique, the captured image data of the image captured by the imaging unit 109.
Here, the memory 102 stores image analysis data for matching illuminance levels and colors.
When detecting that the surrounding situation has changed based on the input illuminance level, the main control unit 101 acquires the luminance corresponding to that illuminance level from a luminance table (described later with reference to FIG. 4) stored in the memory 102.
Hereinafter, the luminance table stored in the memory 102 will be described with reference to FIG. 4. FIG. 4 is a diagram showing an outline of the luminance table stored in the memory 102 according to Embodiment 1. As shown in FIG. 4, the luminance table has data items such as [illuminance level], [luminance], and [time]. [Illuminance level] indicates the value of the surrounding illuminance. [Luminance] indicates the luminance of the display image displayed by the display unit. [Time] indicates the time measured by the timing unit 115 of the information display terminal 10.
Referring again to FIG. 2, the main control unit 101 acquires the luminance corresponding to the illuminance level from the memory 102 and sets the luminance of the display image displayed by the display unit 110 based on the acquired luminance.
The main control unit 101 also acquires, from the memory 102, the luminance corresponding to the time input from the timing unit 115, and sets the luminance of the display image displayed by the display unit 110 based on the acquired luminance.
The image/audio processing unit 108 then changes the luminance of the display image displayed by the display unit 110 to the luminance acquired by the main control unit 101.
That is, the image/audio processing unit 108 lowers the luminance of the display image displayed on the display unit 110 when the surroundings are dark, and raises it when the surroundings are bright. Specifically, based on the luminance set by the main control unit 101, the image/audio processing unit 108 changes the luminance of the image displayed on the display unit 110 by controlling the current value of the LED or laser light source of the optical module.
As a result, the user 1000 can view the display image on the display unit 110 at an appropriate luminance even when using the head mounted display 100 at night or in the evening. For example, at night, the display unit 110 sets the luminance of the display image to 1 cd/m² or less. On the other hand, under a clear daytime sky with a surrounding illuminance level of 100,000 lux, the iris is constricted, so the display image is difficult to see unless its luminance is raised; the display unit 110 therefore sets the luminance of the display image to 5,000 cd/m².
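As a rough illustration, this table-driven luminance control can be sketched in Python as follows. This is a minimal sketch under assumed values: apart from the 1 cd/m² night figure and the 5,000 cd/m² clear-daytime figure mentioned above, the table entries are hypothetical placeholders.

```python
# Minimal sketch of the FIG. 4 luminance lookup. Entries are
# (illuminance in lux, display luminance in cd/m^2), sorted by
# illuminance; the intermediate rows are hypothetical.
LUMINANCE_TABLE = [
    (0, 1),            # night: 1 cd/m^2 or less
    (1_000, 300),      # hypothetical indoor level
    (10_000, 1_500),   # hypothetical overcast level
    (100_000, 5_000),  # clear daytime sky
]

def luminance_for(illuminance_lux: float) -> float:
    """Return the display luminance of the highest table row whose
    illuminance does not exceed the measured value."""
    selected = LUMINANCE_TABLE[0][1]
    for level, luminance in LUMINANCE_TABLE:
        if illuminance_lux >= level:
            selected = luminance
        else:
            break
    return selected

assert luminance_for(10) == 1           # dark surroundings: dim display
assert luminance_for(100_000) == 5_000  # bright surroundings: raise it
```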
Further, the main control unit 101 detects that the surrounding situation has changed when the color parameter corresponding to the color of the display image currently displayed on the display unit 110 differs from the color parameter corresponding to the color input from the color sensor 119. The main control unit 101 also detects that the surrounding situation has changed when the color parameter corresponding to the color of the display image currently displayed on the display unit 110 differs from the color parameter identified by analyzing, with a known technique, the captured image data of the image captured by the imaging unit 109.
When detecting that the surrounding situation has changed based on the input color parameter, the main control unit 101 calculates the changed color parameter based on a known image processing algorithm that inversely calculates the input color parameter. The image/audio processing unit 108 then displays on the display unit 110 a display image whose color has been changed based on the calculated color parameter.
That is, the image/audio processing unit 108 changes the color of the display image displayed on the display unit 110 according to the surrounding colors by controlling the LEDs provided in the display unit 110. Specifically, when the color sensor 119 detects amber as the surrounding color on a clear evening, the main control unit 101 controls the LEDs provided in the display unit 110 so as to increase the green and blue values of the color parameter. As a result, the amber light passing through the display unit 110 appears white, and the visibility is improved. When the color sensor 119 detects blue as the surrounding color in clear daytime weather, the main control unit 101 controls the LEDs provided in the display unit 110 so as to increase the red and green values of the color parameter. As a result, the blue light passing through the display unit 110 appears white, and the visibility is improved. In this way, the display unit 110 can display a color-adjusted display image even in special environments such as indoors, inside tunnels, and in photographic darkrooms.
When detecting that the surrounding situation has changed based on the color input from the color sensor 119, the main control unit 101 acquires the color parameter corresponding to the input color from a color parameter table (described later with reference to FIG. 5) stored in the memory.
Hereinafter, the color parameter table stored in the memory 102 will be described with reference to FIG. 5. FIG. 5 is a diagram showing an outline of the color parameter table stored in the memory 102 according to Embodiment 1. As shown in FIG. 5, the color parameter table has data items such as [color], [color parameter], and [time]. [Color] indicates the surrounding color. [Color parameter] indicates the parameters for setting the color of the display image displayed by the display unit; specifically, the values of the three primary colors red, green, and blue. [Time] indicates the time measured by the timing unit 115 of the information display terminal 10.
Referring again to FIG. 2, the main control unit 101 acquires the color parameter corresponding to the detected color from the memory 102 and sets the color parameter of the display image displayed by the display unit 110 based on the acquired parameter. The main control unit 101 then changes the color of the display image displayed by the display unit 110 by controlling the LEDs provided in the display unit 110.
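As a rough illustration, the table-driven color compensation can be sketched as follows, assuming 8-bit RGB color parameters; the table entries and boost amounts are hypothetical, chosen only to mirror the amber-evening and blue-daytime examples above.

```python
# Minimal sketch of the FIG. 5 color-parameter lookup. The boost
# values are hypothetical; only the direction of the adjustment
# (which channels to raise) comes from the description.
COLOR_PARAMETER_TABLE = {
    "amber": (0, 60, 60),  # clear evening: raise green and blue
    "blue":  (60, 60, 0),  # clear daytime: raise red and green
}

def compensated_color(base_rgb, ambient_color):
    """Shift the display color so that ambient light passing through
    the translucent display appears closer to white."""
    boost = COLOR_PARAMETER_TABLE.get(ambient_color, (0, 0, 0))
    return tuple(min(255, c + b) for c, b in zip(base_rgb, boost))

print(compensated_color((200, 150, 120), "amber"))  # (200, 210, 180)
```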
Note that the display unit 110 switches the target whose luminance or color is changed (the background image alone, or the background image and the superimposed display image) depending on whether the background image is a captured image from the imaging unit 109 or something other than a captured image (for example, a white image).
Specifically, when the background image is a captured image from the imaging unit 109, the main control unit 101 causes the display unit 110 to display the image while changing at least one of the luminance and the color of the background image only. When the background image is something other than a captured image from the imaging unit 109 (for example, a white image), the main control unit 101 changes at least one of the luminance and the color of both the background image and the superimposed display image.
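A minimal sketch of this switching follows; `adjust` stands in for whatever luminance or color change was selected, and all names are hypothetical.

```python
# Minimal sketch: choose which layers the adjustment applies to.
def apply_adjustment(background, overlay, background_is_captured, adjust):
    if background_is_captured:
        # Captured background: adjust the background image only.
        return adjust(background), overlay
    # Plain background (e.g. white): adjust both layers.
    return adjust(background), adjust(overlay)

dim = lambda rgb: tuple(c // 2 for c in rgb)
print(apply_adjustment((200, 200, 200), (255, 0, 0), True, dim))
# ((100, 100, 100), (255, 0, 0)) -- overlay left untouched
```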
The main control unit 101 also acquires, from the memory 102, the color parameter corresponding to the time input from the timing unit 115, and sets the color parameter of the display image displayed by the display unit 110 based on the acquired parameter.
Note that the display unit 110 may also change at least one of luminance and color in response to an operation by the user 1000.
The display unit 110 may also learn past surrounding situations together with the luminance and color parameters applied after each change, and thereafter automatically change at least one of luminance and color according to the surrounding situation.
The voice input/output unit 104 consists of a microphone that senses the voice of the user 1000, and an earphone that outputs sound based on audio data converted via the audio processing unit 114 and input/output through the image/audio processing unit 108.
The communication unit 103 exchanges display images over a network with the mobile terminal 300, the facility terminal 400, the server 500, and the like, using various communication environments such as a mobile phone network, a wireless LAN, and short-range communication.
The communication unit 103 includes a wireless LAN unit, a mobile phone unit, a short-range communication unit, and the like. Examples of the short-range communication unit include Wi-Fi (registered trademark) and Bluetooth (registered trademark).
The position detection unit 106, which is a GPS (Global Positioning System) unit, receives radio waves from a plurality of positioning satellites orbiting the earth and detects the current position coordinates of the information display terminal 10 on the earth.
The power management unit 107 manages the battery that drives the head mounted display 100, monitors the state of the battery, and periodically detects its remaining capacity.
The main control unit 101 performs pattern matching on the image captured by the imaging unit 109 based on the image data of the captured image and the image data stored in the memory 102. As a result of the pattern matching, the main control unit 101 determines whether image data that matches or approximates the image data of the captured image is stored in the memory 102. The main control unit 101 thereby detects that a specific product is included in the captured image.
When a product is included in the captured image, the main control unit 101 requests the superimposed display image corresponding to that product from the server 500, and causes the display unit 110 to display the superimposed display image acquired from the server 500 together with the captured image.
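The pattern matching itself is left to a known technique; as one possible sketch, OpenCV template matching could be used along the following lines. This is not necessarily the method of the actual terminal, and the threshold and file names are hypothetical.

```python
# Minimal sketch of product detection by template matching, one
# example of a "known technique" for pattern matching.
import cv2

THRESHOLD = 0.8  # similarity required to treat the product as present

def detect_product(captured, template):
    """Return True if the stored product image appears in the frame."""
    scores = cv2.matchTemplate(captured, template, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, _ = cv2.minMaxLoc(scores)
    return max_score >= THRESHOLD

frame = cv2.imread("captured_image.png")      # from imaging unit 109
product = cv2.imread("product_template.png")  # stored in memory 102
if frame is not None and product is not None and detect_product(frame, product):
    pass  # request the corresponding superimposed display image from server 500
```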
The main control unit 101 manages the battery that drives the holder 130 and monitors the state of the battery.
The power management unit 132 periodically detects the remaining battery capacity.
The communication unit 131 transmits information read by the reader 133 (not illustrated) to the head mounted display 100. The communication unit 131 supports standards such as Wi-Fi (registered trademark), Bluetooth (registered trademark), and LTE (Long Term Evolution).
Note that the imaging unit 109 may have a camera imaging element that functions as a night vision camera or a heat-sensing camera, like a near-infrared camera. Even in this case, the imaging unit 109 can capture normal visible-light images. The imaging unit 109 has at least two imaging elements 113 and can capture both a normal visible-light image and a night vision image. In this case, the image/audio processing unit 108 processes the image data input from the imaging unit 109 into a night vision image (an image in which the surroundings of the user 1000 can be confirmed even when they become dark) using a known image processing algorithm, and causes the display unit 110 to display the processed night vision image. Specifically, the display unit 110 displays an image of a person included in the image captured by the night vision camera while changing at least one of luminance and color.
The image/audio processing unit 108 also combines the normal visible-light image and the night vision image using a known image processing algorithm, and causes the display unit 110 to display the combined image. The display unit 110 can thereby display a night vision image with improved visibility at night or in dark environments. In addition, since the normal visible-light image and the night vision image are combined, the display unit 110 can more clearly display information that needs to be recognized, such as the colors of signs and signal lights.
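As a rough sketch, such a combination could be a simple weighted blend of the two frames; this is only one instance of a "known image processing algorithm", and the blend weight is a hypothetical choice.

```python
# Minimal sketch: blend a visible-light frame with a night-vision
# (e.g. near-infrared) frame so dark scenes stay visible while colors
# such as signs and signal lights are preserved from the visible frame.
import numpy as np

def fuse_frames(visible_rgb, nightvision_gray, weight=0.6):
    ir_rgb = np.repeat(nightvision_gray[..., None], 3, axis=2)
    fused = (weight * visible_rgb.astype(np.float32)
             + (1.0 - weight) * ir_rgb.astype(np.float32))
    return fused.clip(0, 255).astype(np.uint8)

visible = np.zeros((2, 2, 3), dtype=np.uint8)    # very dark scene
infrared = np.full((2, 2), 200, dtype=np.uint8)  # bright IR return
print(fuse_frames(visible, infrared)[0, 0])      # [80 80 80]
```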
Note that the imaging unit 109 functioning as a night vision camera may be a dedicated night vision camera, or may have an imaging element 113 that senses light and waves of various other wavelengths, such as far-infrared, ultraviolet, X-ray, terahertz, muon, yellow, and 1,500 nm infrared.
This makes it possible to recognize video data that could not be confirmed with the normal naked eye. Furthermore, by having the imaging unit 109 detect specific wavelengths, a head mounted display can be realized that can identify approaching objects at intersections; that, in the medical field, can identify blood vessels, lesions, cell mutations, and the like; that can identify moisture, bases, and the like in inspections; that can identify defects due to aging on the outer and inner walls of social infrastructure such as tunnels and piping; that can identify man-made compounds, foreign substances, oxides, contaminants, organic substances, organic compounds, and the like; and that can identify brain waves, electromagnetic waves, cerebral blood flow, and the like. The imaging unit 109 may have a plurality of these functions. Furthermore, the imaging unit 109 may be capable of imaging from close range down to microscopic spaces by focus control such as a zoom function, a magnification function, and close-up imaging. This allows the head mounted display 100 to be used for surgery, microfabrication, fine-scale experiments, and the like.
Next, the configuration of the information display terminal 10 will be described with reference to FIGS. 3(a) and 3(b). FIG. 3(a) is a perspective view showing a state in which the information display terminal 10 is worn. FIG. 3(b) is a perspective view showing a state in which the information display terminal 10 is removed.
As shown in FIG. 3(a), the head mounted display 100 of the information display terminal 10 can be worn on the head of the user 1000. The head mounted display 100 consists of a temple portion 112 that hooks over the ears of the user 1000 when worn on the user's head, and a rim portion 123 that is placed in front of the eyes of the user 1000 when worn on the user's head.
One end of the temple portion 112 is connected to the rim portion 123, and the other end is connected to the cable 121.
The rim portion 123 is provided with the imaging unit 109 and the display unit 110. Note that the imaging unit 109 and the display unit 110 may instead be attached to the rim portion 123.
The imaging unit 109 captures an imaging region 202 in the line-of-sight direction of the user 1000 while the head mounted display 100 is worn on the head of the user 1000.
The display unit 110 is disposed in front of the eyes of the user 1000 while the head mounted display 100 is worn on the head of the user 1000. Note that the display unit 110 is composed of a transparent body, a half mirror, or a total reflection mirror.
The cable 121 is formed of, for example, a flexible member; specifically, a flexible wire or a shape-memory tube. The shape of the cable 121 changes when an external force is applied, and the cable 121 maintains its changed shape. When not using the head mounted display 100, the user 1000 carries the information display terminal 10 with the cable 121 wound around the neck, as shown in FIG. 3(b). A pair of hook-and-loop fasteners or magnets may be provided on the two ends of the cable 121; in this case, the cable 121 is wound around the neck with its two ends in contact with each other. The cable 121 contains an optical fiber, electric wires, a hard cover, and the like.
The imaging unit 109 captures the imaging region 202. The display unit 110 is provided within the range of a virtual image display region 203 relative to the imaging region 202, and displays the captured image of the imaging region 202. Note that the display unit 110 may enlarge and display an image of the area of the imaging region 202 in the user's viewpoint direction. The display unit 110 may also be provided within a range of about 1/10 to 1/2 of the area of the imaging region. The focal position of the virtual image may also be variable, adjusted either automatically or manually. In this case, the adjustment may be made by inserting into the optical axis a material having a refractive index different from that of air, such as glass or plastic, to adjust the lens position or optical path length of the optical module. It is also possible to change the size and angle of view of the virtual image by changing the lens or by changing the focal length with a zoom function.
Note that the head mounted display 100 may be of a goggle type, as long as it has the display unit 110 disposed in front of the eyes of the user 1000 and the imaging unit 109 that captures the line-of-sight direction of the user 1000 while worn on the head of the user 1000.
As a method for displaying the display image on the display unit 110, a method using a half mirror, a method that realizes see-through display by splitting light in only one direction using a mirror or prism, or a method that projects a virtual image directly onto the retina of the user 1000 may be applied.
The holder 130 has the communication unit 131, the power management unit 132, and the reader 133. The holder 130 detachably holds a terminal held by the user (for example, a mobile terminal).
Note that, instead of the holder 130, a mobile terminal may be detachably attached to the other end of the cable 121.
A key mobile slot may also be provided in the holder 130. Here, the key mobile stores an authentication key, an access code, and a security code. When a predetermined key mobile is inserted into the holder 130, the user is authenticated, and the holder 130 and the head mounted display 100 become usable when the authentication succeeds. Furthermore, the holder 130 may be equipped with fingerprint authentication or vein authentication. In this case, the holder 130 and the head mounted display 100 become usable when the holder 130 succeeds in fingerprint or vein authentication.
The holder 130 itself may also function as a key mobile. In this case, by inserting the holder 130 into an SPC (Security Personal Computer), the user is authenticated, and the SPC becomes usable when the authentication succeeds.
The holder 130 may also have a reader. The reader is a barcode reader, an RFID reader, an imaging device that reads QR codes, or the like. The reader reads information from, for example, an IC chip mounted on an ID card, either contactlessly or by contact, using a communication coil, a transmission coil, an induced-current coil, or the like. This enables the holder 130 to perform individual authentication, ID authentication, payment, and the like.
A battery that supplies power to the head mounted display 100 via the cable 121 may also be detachably attached to the other end of the cable 121.
An input device such as a keyboard, a mouse, or a touchpad may also be detachably attached to the other end of the cable 121. In this case, the input device transmits input instructions received from the user 1000 to the head mounted display 100 via the cable 121.
<Overall processing>
FIG. 6 is a diagram showing an outline of the overall processing of the information display terminal 10 according to Embodiment 1. The overall processing according to Embodiment 1 starts, for example, when the display unit 110 starts displaying a display image. The following description assumes that the imaging unit 109 is capturing the imaging region 202 and that the display unit 110 is displaying the captured image.
First, in S601, the display unit 110 displays a display image based on the currently set color parameter and luminance.
Next, in S602, the color sensor 119 inputs the detected color to the main control unit 101, and the illuminance sensor 120 inputs the detected illuminance level to the main control unit 101.
Next, in S603, the main control unit 101 determines whether the surrounding situation has changed, based on the illuminance level and the color input in S602. If the main control unit 101 determines that the surrounding situation has not changed (S603: No), the process returns to S601. If the main control unit 101 determines that the surrounding situation has changed (S603: Yes), the process proceeds to S604.
Next, in S604, the main control unit 101 acquires from the memory 102 the luminance corresponding to the illuminance level input from the illuminance sensor 120, and sets the acquired luminance as the luminance of the display image displayed by the display unit 110. The main control unit 101 also acquires from the memory 102 the color parameter corresponding to the color input from the color sensor 119, and sets the color parameter of the display image displayed by the display unit 110 based on the acquired parameter. After S604, the process returns to S601.
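The S601-S604 loop can be summarized in code as follows; this is a minimal sketch driven by a short list of simulated sensor readings, and the table contents are hypothetical.

```python
# Minimal sketch of the S601-S604 loop with simulated sensor input.
LUMINANCE_TABLE = {"dark": 1, "bright": 5000}       # cd/m^2
COLOR_PARAMETER_TABLE = {"amber": (200, 255, 255),  # raise G and B
                         "neutral": (255, 255, 255)}

state = {"luminance": 5000, "color_parameter": (255, 255, 255)}
readings = [("neutral", "bright"), ("amber", "dark"), ("amber", "dark")]

for color, illuminance in readings:  # one pass per sensor sample
    # S601: display with the currently set color parameter and luminance.
    print("displaying with", state)
    # S602: the color sensor 119 and illuminance sensor 120 input readings.
    target = (COLOR_PARAMETER_TABLE[color], LUMINANCE_TABLE[illuminance])
    # S603: determine whether the surrounding situation has changed.
    if (state["color_parameter"], state["luminance"]) != target:
        # S604: set new parameters from the tables, then return to S601.
        state["color_parameter"], state["luminance"] = target
```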
<Effect of Embodiment 1>
According to Embodiment 1 described above, the display unit 110 displays information while changing at least one of luminance and color in accordance with the detected situation around the information display terminal 10, so the visibility of the information displayed by the display unit 110 can be improved.
More specifically, when the surroundings of the user 1000 become dark, the display unit 110 lowers the luminance of the display image. The display unit 110 can therefore display an image matched to the surrounding darkness and keep the displayed image from appearing dazzling to the user. In situations such as evening, sunrise, or a blue background, the display unit 110 changes the color by inverting the color tone of the displayed video, which improves the visibility of the video displayed on the display unit 110.
In addition, since the display unit 110 displays information while changing at least one of luminance and color based on the color parameter corresponding to the current time measured by the timing unit 115, the visibility of the display image can be improved according to the surrounding situation without providing an illuminance sensor or the like.
In addition, since the display unit 110 displays an image of a person included in the image captured by the night vision camera while changing at least one of luminance and color, the visibility of the person's image in a captured image taken in dark surroundings can be improved.
In addition, when the background image is an image captured by the imaging unit 109, the display unit 110 displays it while changing at least one of the luminance and the color of the background image, which improves the visibility of the display image in that case.
When the background image is other than an image captured by the imaging unit 109, the display unit 110 displays it while changing at least one of the luminance and the color of both the background image and the superimposed display information, which improves the visibility of the display image in that case as well.
(Embodiment 2)
Embodiment 2 differs from Embodiment 1 in that, while worn on the head, it has a front imaging unit that captures an imaging region in the user's line-of-sight direction and a rear imaging unit that captures an imaging region in the direction opposite to the line-of-sight direction. Hereinafter, the differences of Embodiment 2 from Embodiment 1 will be described mainly with reference to FIGS. 7 to 9.
<Configuration of information display terminal>
 以下、図7および図8を用いて、情報表示端末10の構成について説明する。図7は、実施の形態2に係る情報表示端末10のハードウェアの構成例の概要を示す図である。図8は、実施の形態2に係る情報表示端末10が装着された状態を示す斜視図である。 Hereinafter, the configuration of the information display terminal 10 will be described with reference to FIGS. 7 and 8. FIG. 7 is a diagram illustrating an outline of a hardware configuration example of the information display terminal 10 according to the second embodiment. FIG. 8 is a perspective view showing a state where the information display terminal 10 according to Embodiment 2 is mounted.
 As shown in FIG. 7, the information display terminal 10 includes a front imaging unit 710 and a rear imaging unit 720.
 The front imaging unit 710 images the imaging region 202 in the user's line-of-sight direction while the terminal is worn on the head. Hereinafter, the image of the imaging region 202 captured by the front imaging unit 710 may be called the front captured image. The front imaging unit 710 inputs front captured image data for displaying the captured front image to the main control unit 101, and the input front captured image data is stored in the memory 102 of the main control unit 101.
 The rear imaging unit 720 images the imaging region 204 in the direction opposite to the line-of-sight direction. Hereinafter, the image of the imaging region 204 captured by the rear imaging unit 720 may be called the rear captured image. The rear imaging unit 720 inputs rear captured image data for displaying the captured rear image to the main control unit 101, and the input rear captured image data is stored in the memory 102 by the main control unit 101.
 Based on the input front captured image data, the main control unit 101 causes the display unit 110 to display the front captured image. Likewise, based on the input rear captured image data, the main control unit 101 causes the display unit 110 to display the rear captured image.
 The display unit 110 then displays at least one of the front captured image based on the front captured image data input to the main control unit 101 and the rear captured image based on the rear captured image data.
 For example, when an input device such as a keyboard, a mouse, or a touchpad is attached to the other end of the cable 121 or to the holder 130 itself, the display unit 110 switches the displayed video from the front captured video to the rear captured video, or from the rear captured video to the front captured video, each time the input device receives an input.
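 One way to realize this toggle is sketched below in Python, assuming a simple event-callback interface; the class and method names are invented for illustration and do not come from the patent.

    class DisplaySource:
        FRONT, REAR = "front", "rear"

        def __init__(self):
            self.current = self.FRONT

        def on_input_event(self):
            # Each input event flips between the front and rear captured video.
            self.current = self.REAR if self.current == self.FRONT else self.FRONT
            return self.current

 For example, every key press or tap would call on_input_event(), and the display unit would then show the frame stream indicated by current.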
 Note that the display unit 110 may instead divide its display area in two, displaying the front captured image in one display area and the rear captured image in the other.
 Note also that the imaging region 202 in the line-of-sight direction and the imaging region 204 in the opposite direction may be made capturable by a single imaging unit fitted with one lens for imaging toward the front and another lens for imaging toward the rear.
 Further, the front imaging unit 710 and the rear imaging unit 720 may each have a camera image sensor that functions as a night vision camera or a heat sensing camera, such as a near-infrared camera.
 Furthermore, when functioning as a night vision camera or a heat sensing camera, the front imaging unit 710 and the rear imaging unit 720 may be dedicated night vision cameras. They may also have an image sensor 113 that senses light or waves of other wavelengths, for example far infrared, ultraviolet, X-rays, terahertz waves, muons, yellow light, or 1500 nm infrared.
 In addition to the front imaging unit 710 and the rear imaging unit 720, the information display terminal 10 may further include an imaging unit that images the user's right side and an imaging unit that images the user's left side. The image/audio processing unit 108 may then generate an omnidirectional image, in which all directions can be checked, by combining the images captured by the front imaging unit 710, the rear imaging unit 720, and the right-side and left-side imaging units. In this case, the display unit 110 displays the omnidirectional image.
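 A minimal sketch of such composition, assuming four same-height frames and ignoring lens geometry (a production implementation would warp each camera image onto a shared cylindrical or spherical surface and blend the seams):

    import numpy as np

    def compose_omnidirectional(front, right, rear, left):
        # Naive panorama: concatenate the four views side by side.
        # This only illustrates the data flow, not real image stitching.
        return np.hstack([left, front, right, rear])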
 By imaging the imaging region 202 in the line-of-sight direction and the imaging region 204 in the opposite direction and making each captured image displayable in this way, the user 1000 can check the view behind while driving a bicycle, a motorcycle, a car, or the like.
 The main control unit 101 may also determine whether danger is approaching the user 1000 (the user 1000 wearing the information display terminal 10) by analyzing, with a known technique, the front captured image data and the rear captured image data input by the front imaging unit 710 and the rear imaging unit 720. For example, when it detects that an object such as a motorcycle, a car, or a bicycle is approaching the user 1000 wearing the information display terminal 10, the main control unit 101 determines that danger is approaching the user 1000.
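 The patent leaves this analysis to "a known technique"; one common heuristic is that an approaching object's bounding box grows from frame to frame. A sketch under that assumption, with an invented growth threshold:

    def is_approaching(prev_box, curr_box, growth_threshold=1.15):
        # Boxes are (x, y, w, h) tuples from any object detector or tracker.
        # An object whose apparent area grows faster than the threshold
        # per frame is treated as approaching the wearer.
        prev_area = prev_box[2] * prev_box[3]
        curr_area = curr_box[2] * curr_box[3]
        return prev_area > 0 and curr_area / prev_area >= growth_threshold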
 Note that the information display terminal 10 may have a sonar. In this case, the information display terminal 10 emits a sound wave and detects that an object is approaching based on the sound wave reflected from the object. The information display terminal 10 may also detect the approach of an object using infrared light. Furthermore, the information display terminal 10 may include a vibration sensor that detects vibrations of the air; in this case, the information display terminal 10 detects that an object is approaching based on the air vibrations detected by the vibration sensor.
 Having determined that danger is approaching the user 1000, the main control unit 101 controls the display unit 110 to notify the user 1000 that danger is approaching. For example, the main control unit 101 causes the display unit 110 to display a solid red background image.
 Note that the main control unit 101 may cause the display unit 110 to display, as superimposed display information, characters or marks indicating that danger is approaching. Furthermore, the main control unit 101 may cause the audio input/output unit 104 to output a sound indicating that danger is approaching.
 The rear imaging unit 720 may also be attached to a helmet that the user 1000 wears when riding a motorcycle. In this case, the rear imaging unit 720 attached to the helmet transmits the rear captured image data to the information display terminal 10 via wireless communication. Alternatively, the rear imaging unit 720 attached to the helmet and the head mounted display 100 of the information display terminal 10 may be connected to each other via a cable; in this case, the rear imaging unit 720 transmits the rear captured image data to the information display terminal 10 via the cable.

<Overall processing>
 FIG. 9 is a diagram showing an overview of the overall processing of the information display terminal 10 according to Embodiment 2. The overall processing according to Embodiment 2 starts, for example, when the display unit 110 starts displaying a display image. The following description assumes that the front imaging unit 710 is imaging the imaging region 202 and the rear imaging unit 720 is imaging the imaging region 204.
 First, in S901, the front imaging unit 710 inputs the front captured image data of the captured front image to the main control unit 101. Likewise, the rear imaging unit 720 inputs the rear captured image data of the captured rear image to the main control unit 101.
 Next, in S902, the main control unit 101 determines whether danger is approaching the user 1000 by analyzing at least one of the front captured image data and the rear captured image data input in S901. When the main control unit 101 determines that no danger is approaching the user 1000 (S902-No), the process proceeds to S904. When the main control unit 101 determines that danger is approaching the user 1000 (S902-Yes), the process proceeds to S903.
 Next, in S903, the main control unit 101 controls the display unit 110 to notify the user 1000 that danger is approaching. The display unit 110 also displays at least one of the front captured image based on the front captured image data input in S901 and the rear captured image based on the rear captured image data. After S903, the process returns to S901.
 When the result in S902 is No, in S904 the display unit 110 displays at least one of the front captured image based on the front captured image data input in S901 and the rear captured image based on the rear captured image data. After S904, the process returns to S901.
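 Taken together, S901 to S904 amount to the per-frame loop sketched below; the camera and display objects and their method names are stand-ins for the units described above, not names from the patent.

    def danger_detected(front, rear):
        # Stand-in for the S902 analysis; see the approach heuristic sketched earlier.
        return False

    def embodiment2_loop(front_cam, rear_cam, display):
        while display.active:
            front = front_cam.capture()          # S901: front captured image data
            rear = rear_cam.capture()            # S901: rear captured image data
            if danger_detected(front, rear):     # S902: check for approaching danger
                display.alert_red_background()   # S903: notify the user of the danger
            display.show(front, rear)            # S903/S904: display the captured images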
<Effect of Embodiment 2>
 In Embodiment 2, the display unit 110 displays a captured image captured by at least one of the front imaging unit and the rear imaging unit, so that the user can check the images from the front imaging unit and the rear imaging unit and avoid danger.
(Embodiment 3)
 Conventionally, when the user 1000 wearing the information display terminal 10 captures images while moving (moving by walking, riding a motorcycle or a car, and so on), the head of the user 1000 may shake. As the head of the user 1000 shakes, the imaging unit 109 of the information display terminal 10 shakes with it. When the captured image captured by the imaging unit 109 is distributed via wireless communication to the information display terminal 140, the portable terminal 300, the facility terminal 400, and the like (hereinafter sometimes collectively called external terminals), the captured image displayed on those terminals also shakes, and viewers of the captured image can become motion sick. The purpose of Embodiment 3 is to provide a technique that makes it possible to display, on the information display terminal 140, the portable terminal 300, the facility terminal 400, and the like, an image that can be viewed comfortably even when the imaging unit 109 of the information display terminal 10 shakes along with the head of the user 1000.
 The points in which Embodiment 3 differs from Embodiment 1 are described below, mainly with reference to FIGS. 10 and 11.

<Overall processing>
 FIG. 10 is a diagram showing an overview of the overall processing of the information display terminal 10 according to Embodiment 3. The overall processing according to Embodiment 3 starts, for example, when the imaging unit 109 starts imaging the imaging region 202. The following description assumes that the tilt of the head mounted display 100 detected by the attitude sensor 105 is periodically input to the main control unit 101, and that captured image data of images captured by the imaging unit 109 is also periodically input to the main control unit 101.
 First, in S1001, the main control unit 101 determines, by a known technique, whether the head mounted display 100 is shaking, based on the tilt of the head mounted display 100 input from the attitude sensor 105. When the main control unit 101 determines that the head mounted display 100 is not shaking (S1001-No), the process returns to S1001. When the main control unit 101 determines that the head mounted display 100 is shaking (S1001-Yes), the process proceeds to S1002.
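 S1001 can be realized, for example, by thresholding the spread of recent tilt samples. A sketch with an invented window size and threshold (the patent only refers to a known technique):

    from collections import deque
    import statistics

    class ShakeDetector:
        def __init__(self, window=30, threshold_deg=2.0):
            self.tilts = deque(maxlen=window)   # recent tilt samples, in degrees
            self.threshold_deg = threshold_deg  # illustrative value only

        def update(self, tilt_deg):
            # Returns True when the tilt fluctuates more than the threshold.
            self.tilts.append(tilt_deg)
            if len(self.tilts) < 2:
                return False
            return statistics.pstdev(self.tilts) > self.threshold_deg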
 Next, in S1002, the main control unit 101 analyzes, by a known technique, the degree of shaking (the motion vector and moving speed of the pixels of the post-shake captured image data relative to the pre-shake captured image data), based on the pre-shake and post-shake captured image data input from the imaging unit 109.
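 The global pixel shift between the two frames can be estimated, for instance, by phase correlation. The OpenCV-based sketch below is one of many known techniques and not necessarily the one the patent intends; the sign convention of the returned shift should be verified against the OpenCV version in use.

    import cv2
    import numpy as np

    def estimate_shift(before_gray, after_gray):
        # Estimate the (dx, dy) pixel displacement between two single-channel frames.
        a = np.float32(before_gray)
        b = np.float32(after_gray)
        (dx, dy), _response = cv2.phaseCorrelate(a, b)
        return dx, dy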
 Next, in S1003, the main control unit 101 causes the image/audio processing unit 108 to extract a core region image from the pre-shake captured image, and likewise from the captured image data of the post-shake captured image. For example, the main control unit 101 extracts each core region image based on the degree of shaking analyzed in S1002. Specifically, as shown in FIG. 11, the main control unit 101 causes the image/audio processing unit 108 to extract the core region images 602 and 604 that remain within the frame even while the head mounted display 100 is shaking. That is, the image/audio processing unit 108 extracts, from the pre-shake captured image and from the post-shake captured image, the core region image that is common to both.
 Specifically, the main control unit 101 causes the image/audio processing unit 108 to extract the core region image 602 from the pre-shake captured image 601, and the core region image 604 from the post-shake captured image 603.
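 Given the estimated shift, the core region is simply the overlap of the two frames. A sketch of the crop, assuming a pure translation in which content at (x, y) in the pre-shake frame appears at (x + dx, y + dy) in the post-shake frame:

    def crop_core_regions(before, after, dx, dy):
        # before/after: H x W x C arrays; (dx, dy): estimated pixel shift.
        h, w = before.shape[:2]
        dx, dy = int(round(dx)), int(round(dy))
        x0, x1 = max(0, -dx), min(w, w - dx)    # horizontal overlap in the pre-shake frame
        y0, y1 = max(0, -dy), min(h, h - dy)    # vertical overlap in the pre-shake frame
        core_before = before[y0:y1, x0:x1]
        core_after = after[y0 + dy:y1 + dy, x0 + dx:x1 + dx]
        return core_before, core_after

 Both crops have the same shape, so the corrected images 605 and 606 built from them stay stable even while the full frames shake.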
 Next, in S1004, the main control unit 101 causes the image/audio processing unit 108 to generate corrected image data for displaying the corrected images 605 and 606 by deleting the image areas surrounding the core region images 602 and 604 extracted in S1003. The main control unit 101 inputs each piece of generated corrected image data to the communication unit 103.
 Next, in S1005, the communication unit 103 transmits each piece of corrected image data generated in S1004 to the external terminal. Note that the communication unit 103 transmits the corrected image data to the external terminal only when it detects that the communication state of the external terminal is open; naturally, if communication with the external terminal has ended, the communication unit 103 does not transmit the corrected image data.
 Next, in S1006, the external terminal receives the corrected image data transmitted in S1005 and displays the corrected images 605 and 606 based on the received corrected image data. After S1006, the process returns to S1001.
<Effect of Embodiment 3>
 According to Embodiment 3 described above, by transmitting to the external terminal the core region image data for displaying the core region images 602 and 604 extracted by the image/audio processing unit 108, corrected image data that can be viewed comfortably can be displayed on the external terminal even when the imaging unit 109 shakes.
 In this case, although the external terminal cannot display the entire captured image, it can display a corrected image that is comfortable to view. The external terminal can therefore share the travels and experiences of the user 1000 wearing the information display terminal 10, such as sports watching, book searches, information searches, and meetings, without making the viewer feel sick.
 Although the invention made by the present inventors has been specifically described above based on the embodiments, the present invention is not limited to these embodiments, and it goes without saying that various modifications are possible without departing from the gist of the invention.
10 ... information display terminal,
100, 140 ... head mounted display, 101 ... main control unit, 102 ... memory, 103 ... communication unit, 104 ... audio input/output unit, 105 ... attitude sensor, 106 ... position detection unit, 107 ... power management unit, 108 ... image/audio processing unit, 109 ... imaging unit, 110 ... display unit, 111 ... sensor unit, 112 ... temple unit, 113 ... image sensor, 114 ... audio processing unit, 115 ... time measuring unit, 119 ... color sensor, 120 ... illuminance sensor, 121 ... cable,
130 ... holder, 131 ... communication unit, 132 ... power management unit, 133 ... reader,
200 ... network,
300 ... portable terminal,
400 ... facility terminal,
500 ... server,
710 ... front imaging unit, 720 ... rear imaging unit,
1000 ... user.

Claims (9)

  1.  An information display terminal wearable on a user's head,
     wherein the information display terminal has a display unit that is disposed in front of the user's eyes when the terminal is worn on the user's head, and
     wherein the display unit displays information while varying at least one of luminance and color according to conditions around the information display terminal.
  2.  The information display terminal according to claim 1, further comprising:
     a time measuring unit that measures the current time; and
     a main control unit that detects conditions around the information display terminal from the current time measured by the time measuring unit,
     wherein the display unit displays information while varying at least one of luminance and color based on an illuminance level or a color parameter corresponding to the current time measured by the time measuring unit.
  3.  The information display terminal according to claim 1, further comprising:
     an imaging unit that, when the terminal is worn on the user's head, images an imaging region in the user's line-of-sight direction; and
     a main control unit that determines conditions around the information display terminal by analyzing a captured image captured by the imaging unit,
     wherein the display unit displays information while varying at least one of luminance and color according to the conditions around the information display terminal detected based on the captured image.
  4.  The information display terminal according to claim 3,
     wherein the imaging unit is a night vision camera, and
     wherein the display unit displays an image of a person included in an image captured by the night vision camera while varying at least one of luminance and color.
  5.  The information display terminal according to claim 3,
     wherein the information displayed on the display unit consists of a background image and superimposed display information displayed superimposed on the background image, and
     wherein, when the background image is an image captured by the imaging unit, the display unit displays the background image while varying at least one of its luminance and color.
  6.  The information display terminal according to claim 3,
     wherein the information displayed on the display unit consists of a background image and superimposed display information displayed superimposed on the background image, and
     wherein, when the background image is other than an image captured by the imaging unit, the display unit displays the background image and the superimposed display information while varying at least one of their luminance and color.
  7.  An information display terminal wearable on a user's head, comprising:
     a display unit that is disposed in front of the user's eyes when the terminal is worn on the user's head;
     a front imaging unit that, when the terminal is worn on the user's head, images an imaging region in the user's line-of-sight direction; and
     a rear imaging unit that images an imaging region in a direction opposite to the line-of-sight direction,
     wherein the display unit displays a captured image captured by at least one of the front imaging unit and the rear imaging unit.
  8.  An information display terminal wearable on a user's head, comprising:
     a display unit that is disposed in front of the user's eyes when the terminal is worn on the user's head;
     an imaging unit that, when the terminal is worn on the user's head, images an imaging region in the user's line-of-sight direction;
     an image/audio processing unit that extracts a core region image, commonly included in both a pre-shake captured image and a post-shake captured image, from the pre-shake and post-shake captured images captured by the imaging unit; and
     a communication unit that transmits core region image data for displaying the core region image extracted by the image/audio processing unit to an external terminal.
  9.  An information display method in an information display terminal that is wearable on a user's head and has a display unit disposed in front of the user's eyes when the terminal is worn on the user's head,
     wherein the display unit displays information while varying at least one of luminance and color according to detected conditions around the information display terminal.


PCT/JP2015/052486 2015-01-29 2015-01-29 Information display terminal and information display method WO2016121049A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/052486 WO2016121049A1 (en) 2015-01-29 2015-01-29 Information display terminal and information display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/052486 WO2016121049A1 (en) 2015-01-29 2015-01-29 Information display terminal and information display method

Publications (1)

Publication Number Publication Date
WO2016121049A1 true WO2016121049A1 (en) 2016-08-04

Family

ID=56542701

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/052486 WO2016121049A1 (en) 2015-01-29 2015-01-29 Information display terminal and information display method

Country Status (1)

Country Link
WO (1) WO2016121049A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005094615A (en) * 2003-09-19 2005-04-07 Sanyo Electric Co Ltd Camera-shake correcting apparatus, camera-shake correcting method and computer-readable recording medium with camera-shake correction program recorded thereon
JP2008096868A (en) * 2006-10-16 2008-04-24 Sony Corp Imaging display device, and imaging display method
JP2011071884A (en) * 2009-09-28 2011-04-07 Brother Industries Ltd Work supporting system

Similar Documents

Publication Publication Date Title
US11344196B2 (en) Portable eye tracking device
TWI597623B (en) Wearable behavior-based vision system
JP6030582B2 (en) Optical device for individuals with visual impairment
CN103091843B (en) See-through display brilliance control
CA2750287C (en) Gaze detection in a see-through, near-eye, mixed reality display
US20180227470A1 (en) Gaze assisted field of view control
CN104838326B (en) Wearable food nutrition feedback system
CN109814719B (en) Method and equipment for displaying information based on wearing glasses
JP2013521576A (en) Local advertising content on interactive head-mounted eyepieces
KR20160048801A (en) Methods and systems for augmented reality
KR20140059213A (en) Head mounted display with iris scan profiling
US11830494B2 (en) Wearable speech input-based vision to audio interpreter
US11783582B2 (en) Blindness assist glasses
US20210390882A1 (en) Blind assist eyewear with geometric hazard detection
CN110389447B (en) Transmission type head-mounted display device, auxiliary system, display control method, and medium
JP2020077271A (en) Display unit, learning device, and method for controlling display unit
JP2017146726A (en) Movement support device and movement support method
US20220365354A1 (en) Segmented illumination display
CN117321547A (en) Contextual vision and voice search from electronic eyewear devices
WO2016121049A1 (en) Information display terminal and information display method
KR20180116044A (en) Augmented reality device and method for outputting augmented reality therefor
US11803058B1 (en) Blind assist glasses with remote assistance
US11792371B2 (en) Projector with field lens

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15879946

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 15879946

Country of ref document: EP

Kind code of ref document: A1