WO2016121049A1 - Information display terminal and information display method - Google Patents

Information display terminal and information display method

Info

Publication number
WO2016121049A1
WO2016121049A1, PCT/JP2015/052486, JP2015052486W
Authority
WO
WIPO (PCT)
Prior art keywords
unit
image
information display
user
display
Prior art date
Application number
PCT/JP2015/052486
Other languages
English (en)
Japanese (ja)
Inventor
大内 敏
瀬尾 欣穂
川村 友人
俊輝 中村
佑哉 大木
将史 山本
Original Assignee
日立マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立マクセル株式会社
Priority to PCT/JP2015/052486
Publication of WO2016121049A1

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G09G5/10 Intensity circuits

Definitions

  • the present invention relates to an information display terminal and an information display method.
  • Wearable information display terminals such as glasses are known.
  • Various methods for controlling a wearable information display terminal have been proposed.
  • For example, a glasses-type information display terminal displays superimposed display information (for example, exhibition contents at a public facility) related to an object imaged by an imaging unit (for example, a camera).
  • Patent Document 1 describes a technique in which a wearable information display terminal combines an icon corresponding to given data with image data of the current field of view captured by a CCD video camera or the like, and displays the result.
  • The display unit of an information display terminal that displays such superimposed display information is composed of a translucent member. For this reason, in the conventional technology, for example on a clear evening, amber light is transmitted through the display unit, and the visibility of information displayed on the display unit may be reduced.
  • An object of the present invention is to provide a technique that can improve the visibility of information displayed on a display unit.
  • An information display terminal is an information display terminal that can be worn on a user's head, and has a display unit that is arranged in front of the user's eyes when the terminal is worn on the user's head.
  • The display unit displays information by changing at least one of luminance and color according to the detected situation around the information display terminal.
  • Another information display terminal is likewise an information display terminal that can be worn on a user's head, and has a display unit arranged in front of the user's eyes when the terminal is worn on the user's head.
  • It further has a front imaging unit that captures an imaging region in the user's line-of-sight direction while the terminal is worn on the user's head, and a rear imaging unit that captures an imaging region in the direction opposite to the line-of-sight direction.
  • The display unit displays a captured image captured by at least one of the front imaging unit and the rear imaging unit.
  • Yet another information display terminal can be worn on a user's head, and has a display unit arranged in front of the user's eyes when the terminal is worn on the user's head, an imaging unit that images an imaging region in the user's line-of-sight direction, and a communication unit that transmits, to an external terminal, core region image data for displaying the core region image extracted by the audio/video processing unit.
  • The information display method is an information display method for a terminal that can be worn on the user's head and that has a display unit arranged in front of the user's eyes while the terminal is worn on the user's head.
  • According to the present invention, the visibility of information displayed on the display unit is improved.
  • FIG. 1 is a diagram showing an outline of a configuration example of an information display system having the information display terminal according to Embodiment 1.
  • FIG. 2 is a diagram illustrating an outline of a hardware configuration example of the information display terminal according to Embodiment 1.
  • FIG. 3(A) is a perspective view showing a state in which the information display terminal according to Embodiment 1 is mounted, and FIG. 3(B) is a perspective view showing a state in which the information display terminal according to Embodiment 1 is removed.
  • FIG. 4 is a diagram showing an outline of a configuration example of a luminance table stored in a memory of the information display terminal according to Embodiment 1.
  • FIG. 5 is a diagram showing an outline of a configuration example of a color parameter table stored in a memory of the information display terminal according to Embodiment 1.
  • FIG. 6 is a diagram showing an outline of overall processing of the information display terminal according to Embodiment 1.
  • FIG. 7 is a diagram illustrating an outline of a hardware configuration example of an information display terminal according to Embodiment 2.
  • FIG. 8 is a perspective view showing a state in which the information display terminal according to Embodiment 2 is mounted.
  • FIG. 9 is a diagram showing an outline of overall processing of an information display terminal according to Embodiment 2.
  • FIG. 10 is a diagram showing an outline of overall processing of an information display terminal according to Embodiment 3.
  • FIG. 11 is a diagram for describing processing in which an information display terminal according to Embodiment 3 generates a corrected image.
  • FIG. 1 is a diagram showing an outline of a configuration example of an information display system having the information display terminal 10 according to the first embodiment.
  • The information display system includes the information display terminal 10 and the portable terminal 300, the facility terminal 400, and the server 500, each of which is connected to the information display terminal 10 via the network 200.
  • The information display terminal 10 is connected to the network 200 via, for example, a mobile phone network or a wireless LAN.
  • the information display terminals 10 and 140, the portable terminal 300, the facility terminal 400, and the server 500 are implemented by predetermined hardware and software.
  • For example, the information display terminals 10 and 140, the portable terminal 300, the facility terminal 400, and the server 500 each include a processor and a memory, and the processor executes a program expanded in the memory, thereby causing the computers of the information display terminals 10 and 140, the portable terminal 300, the facility terminal 400, and the server 500 to function.
  • the information display terminal 10 is a wearable glasses-type terminal.
  • the information display terminal 10 is worn on the head of the user 1000.
  • the information display terminal 10 includes an imaging unit (for example, a camera) 109 that captures an object that exists in the line-of-sight direction of the user 1000.
  • the display unit 110 of the information display terminal 10 displays a background image.
  • the background image is, for example, an image of one color (for example, white painting) or an image captured by the imaging unit 109 (hereinafter sometimes referred to as a captured image).
  • the display unit 110 displays the superimposed display information related to the object imaged by the imaging unit 109 in a superimposed manner on the background image.
  • The superimposed display information is information useful to the user 1000, such as store sales information, discount coupons, exhibition contents at public facilities, admission coupons, and points that can be used as money at a store.
  • The information display terminal 10 acquires superimposed display information related to an imaged object from the mobile terminal 300, the facility terminal 400, the server 500, and the like via the network 200.
  • the storage 510 of the server 500 stores captured image data for displaying a captured image of an object captured by the imaging unit 109 and superimposed display information related to the object in association with each other. Then, the server 500 transmits the superimposed display information stored in the storage 510 to the information display terminal 10 via the network 200. The display unit 110 of the information display terminal 10 displays the superimposed display information transmitted from the server 500.
  • the facility terminal 400 is installed in a store or public facility.
  • the facility terminal 400 transmits the superimposed display information to the mobile terminal 300.
  • the facility terminal 400 accepts an operation by a person in charge such as a store or a public facility.
  • the superimposed display information is not limited to the information stored in the storage 510 of the server 500.
  • the superimposed display information may be stored by an unspecified number of devices connected to the network 200 and disclosed to an external device.
  • However, the information display terminal 10 is not necessarily compatible with every communication environment in which the user 1000 is placed.
  • For example, when the information display terminal 10 includes a short-range communication unit but does not include a unit for communicating with a mobile phone network, it cannot acquire the superimposed display information from the server 500 via the network 200 on its own.
  • the user 1000 has a mobile terminal 300 (for example, a mobile phone or a tablet terminal).
  • In this case, the information display terminal 10 can connect to the mobile terminal 300 via short-range communication, and the mobile terminal 300, acting as a router, can connect to the network 200.
  • There are also cases where the facility terminal 400 that transmits the superimposed display information is not connected to the network 200.
  • In that case, the information display terminal 10 connects to the portable terminal 300 by short-range communication, and the portable terminal 300, acting as a wireless LAN unit, connects to the facility terminal 400 via the wireless LAN.
  • FIG. 2 is a diagram illustrating an outline of a hardware configuration example of the information display terminal 10 according to the first embodiment.
  • The information display terminal 10 has a head mounted display 100, a holder 130, and a cable 121 made of a flexible member, one end of which is connected to the head mounted display 100 and the other end of which is connected to the holder 130. That is, the holder 130 and the head mounted display 100 are connected via the cable 121.
  • The head mounted display 100 includes a main control unit 101, a communication unit 103, an audio input/output unit 104, a position detection unit 106, a power management unit 107, an image audio processing unit 108, an imaging unit 109, a display unit 110, a sensor unit 111, an audio processing unit 114, and a time measuring unit 115.
  • The head mounted display 100 detects changes in the situation around the information display terminal 10 based on the color detected by the color sensor 119 and the illuminance level detected by the illuminance sensor 120. Then, the display unit 110 displays a display image by changing at least one of luminance and color according to the detected surrounding situation.
  • the main control unit 101 expands the basic operation program stored in the memory 102 in the RAM. Then, the main control unit 101 executes the basic operation program expanded in the RAM. Accordingly, the main control unit 101 controls the head mounted display 100 and performs various determinations and arithmetic processing.
  • As the memory 102, a semiconductor element memory such as a flash ROM, an SSD (Solid State Drive), or a memory built into the CPU (Central Processing Unit) is used.
  • Alternatively, a magnetic disk drive such as an HDD (Hard Disk Drive) may be used.
  • A RAM (Random Access Memory) included in the memory 102 serves as a work area for executing the basic operation program and other operation programs.
  • the RAM may be configured separately from the memory 102.
  • the RAM may be configured integrally with the main control unit 101.
  • The time measuring unit 115 measures the current time and inputs the measured current time to the main control unit 101.
  • the sensor unit 111 includes an attitude sensor 105 (for example, a gyro sensor), a color sensor 119, and an illuminance sensor 120.
  • the attitude sensor 105 detects the tilt of the head mounted display 100.
  • the attitude sensor 105 inputs the detected tilt of the head mounted display 100 to the main control unit 101.
  • The color sensor 119 detects the surrounding color (the color of the surroundings where the user 1000 is present) and inputs the detected color to the main control unit 101.
  • The illuminance sensor 120 detects the ambient illuminance level and inputs the detected illuminance level to the main control unit 101.
  • the illuminance sensor 120 may detect surrounding colors.
  • the imaging unit 109 (for example, a camera) has a size and weight (for example, 100 g or less) that can be accepted as a device included in the wearable head mounted display 100.
  • the imaging unit 109 is a small camera unit that is attached to the head mounted display 100.
  • An optical system of the imaging unit 109 is arranged so as to capture the line-of-sight direction of the user 1000 wearing the head mounted display 100. That is, the imaging unit 109 images an imaging region in the user's line-of-sight direction while the head mounted display 100 is worn on the user's head.
  • the captured image data of the captured image captured by the imaging unit 109 is input to the main control unit 101.
  • The captured image data of the captured image captured by the imaging unit 109 is stored in the memory 102 by the main control unit 101.
  • The imaging unit 109 may include an imaging element 113 that can receive, with high sensitivity, far infrared, near infrared, ultraviolet, X-rays, terahertz waves, muons, and the like in addition to normal visible light.
  • the display unit 110 has a display formed of a translucent member or a reflecting member such as a prism or a mirror.
  • the audio / video processing unit 108 outputs the captured image and the superimposed display information to the display of the display unit 110 based on the captured image data input from the main control unit 101 and the superimposed display information. Thereby, the display unit 110 displays the captured image and the superimposed display information on the display.
  • The main control unit 101 detects the situation around the information display terminal 10 based on the color and illuminance level detected by the sensor unit 111 (the color sensor 119 and the illuminance sensor 120).
  • the main control unit 101 detects the situation around the information display terminal based on the current time measured by the time measuring unit 115.
  • The main control unit 101 may also detect the surrounding illuminance level and color by analyzing, with a known technique, a captured image captured by the imaging unit 109, and detect the surrounding situation based on the detected illuminance level and color. Specifically, the main control unit 101 detects that the surrounding situation has changed when the illuminance level corresponding to the luminance of the display image currently displayed on the display unit 110 differs from the illuminance level input from the illuminance sensor 120. In addition, the main control unit 101 detects that the surrounding situation has changed when the illuminance level corresponding to the luminance of the display image currently displayed on the display unit 110 differs from the illuminance level obtained by analyzing, with a known technique, the captured image data of the captured image captured by the imaging unit 109.
  • the memory 102 stores image analysis data for matching illuminance levels and colors.
  • When detecting that the surrounding situation has changed based on the input illuminance level, the main control unit 101 acquires the luminance corresponding to that illuminance level from the luminance table (FIG. 4, described later) stored in the memory 102.
  • FIG. 4 is a diagram showing an outline of the luminance table stored in the memory 102 according to Embodiment 1 of the present invention.
  • the luminance table has data items such as [illuminance level], [luminance], and [time].
  • [Illuminance level] indicates the value of ambient illuminance.
  • [Luminance] indicates the luminance of the display image displayed on the display unit.
  • [Time] indicates a time measured by the time measuring unit 115 included in the information display terminal 10.
  • the main control unit 101 acquires the luminance corresponding to the illuminance level from the memory 102, and sets the luminance of the display image displayed by the display unit 110 based on the acquired luminance.
  • the main control unit 101 acquires the luminance corresponding to the time input from the time measuring unit 115 from the memory 102, and sets the luminance of the display image displayed on the display unit 110 based on the acquired luminance.
  • the audio / video processing unit 108 changes the luminance of the display image displayed on the display unit 110 to the luminance acquired by the main control unit 101.
  • For example, the audio/video processing unit 108 decreases the luminance of the display image displayed on the display unit 110 if the surroundings are dark and increases it if the surroundings are bright. Specifically, the audio/video processing unit 108 controls the current value of the LED or laser light source of the optical module based on the luminance set by the main control unit 101, thereby changing the luminance of the image displayed on the display unit 110.
  • the user 1000 can visually recognize the display image displayed on the display unit 110 with appropriate luminance even when the head mounted display 100 is used at night or in the evening.
  • For example, when the surroundings are dark, as at night, the iris opens and the pupil is large, so the display unit 110 sets the luminance of the display image to 1 cd/m² or less.
  • Conversely, when the surrounding illuminance level is around 100,000 lux, as in sunny daytime, the iris closes and the pupil is small. The display image is then difficult to see unless its luminance is increased, so the display unit 110 sets the luminance of the display image to 5000 cd/m².
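  • The sketch below illustrates how such a luminance-table lookup could work. It is a minimal sketch: apart from the two values given above (night: 1 cd/m² or less; 100,000 lux daylight: 5000 cd/m²), the table entries, names, and thresholds are hypothetical, not taken from the patent.

```python
# Hypothetical FIG. 4-style luminance table:
# (ambient illuminance upper bound in lux, display luminance in cd/m^2).
LUMINANCE_TABLE = [
    (10, 1),          # night: pupil wide open, keep the image dim
    (1_000, 150),     # indoor lighting (assumed value)
    (10_000, 800),    # overcast outdoors (assumed value)
    (100_000, 5000),  # sunny daytime: pupil small, brighten the image
]

def luminance_for(illuminance_lux: float) -> float:
    """Return the display luminance for a measured ambient illuminance."""
    for upper_bound, luminance in LUMINANCE_TABLE:
        if illuminance_lux <= upper_bound:
            return luminance
    return LUMINANCE_TABLE[-1][1]  # clamp above the brightest entry

print(luminance_for(5))        # -> 1    (night)
print(luminance_for(100_000))  # -> 5000 (sunny daytime)
```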
  • Similarly, the main control unit 101 detects that the surrounding situation has changed when the color parameter corresponding to the color of the display image currently displayed on the display unit 110 differs from the color parameter corresponding to the color input from the color sensor 119. The main control unit 101 also detects that the surrounding situation has changed when the color parameter corresponding to the color of the display image currently displayed on the display unit 110 differs from the color parameter obtained by analyzing, with a known technique, the captured image data of the captured image captured by the imaging unit 109.
  • In that case, the main control unit 101 calculates the changed color parameter based on a known image processing algorithm that performs a reverse calculation on the input color parameter. Then, the audio/video processing unit 108 displays, on the display unit 110, a display image whose color has been changed based on the calculated color parameter.
  • the audio / video processing unit 108 changes the color of the display image displayed on the display unit 110 according to the surrounding colors by controlling the LEDs provided on the display unit 110.
  • For example, in amber evening light, the main control unit 101 controls the LEDs provided in the display unit 110 to increase the green value and the blue value of the color parameter.
  • Thereby, the amber light transmitted through the display unit 110 appears white, and visibility improves.
  • Likewise, in bluish surroundings, the main control unit 101 controls the LEDs provided in the display unit 110 to increase the red value and the green value of the color parameter.
  • Thereby, the blue light transmitted through the display unit 110 appears white, and visibility improves.
  • In this way, the display unit 110 can display a display image with adjusted colors even in a special environment such as a room, a tunnel, or a photographic darkroom.
  • When detecting that the surrounding situation has changed based on the color input from the color sensor 119, the main control unit 101 acquires the color parameter corresponding to that color from the color parameter table (FIG. 5, described later) stored in the memory 102.
  • FIG. 5 is a diagram showing an outline of the color parameter table stored in the memory 102 according to Embodiment 1 of the present invention.
  • the color parameter table has data items such as [color], [color parameter], and [time].
  • [Color] indicates surrounding colors.
  • [Color parameter] indicates a parameter for setting the color of the display image displayed by the display unit.
  • [Color Parameter] indicates values of three primary colors of red, green, and blue.
  • [Time] indicates a time measured by the time measuring unit 115 included in the information display terminal 10.
  • The main control unit 101 acquires the color parameter corresponding to the color from the memory 102 and sets the color parameter of the display image displayed by the display unit 110 based on the acquired color parameter. Then, the main control unit 101 changes the color of the display image displayed by the display unit 110 by controlling the LEDs provided in the display unit 110.
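  • A minimal sketch of such a complementary adjustment follows, assuming a simple rule: each display primary is boosted in inverse proportion to its strength in the ambient light so that light transmitted through the display unit appears white. The formula and all names are illustrative assumptions, not the patent's own "reverse calculation" algorithm.

```python
def display_gains(ambient_rgb, max_gain=4.0):
    """Per-primary gain factors that offset the ambient color cast."""
    peak = max(max(ambient_rgb), 1)
    return tuple(min(max_gain, peak / max(c, 1)) for c in ambient_rgb)

# Amber evening light: red dominates, so green and blue are raised,
# matching the example above.
print(display_gains((255, 180, 60)))   # ~ (1.0, 1.42, 4.0)

# Bluish surroundings: blue dominates, so red and green are raised.
print(display_gains((60, 120, 255)))   # ~ (4.0, 2.12, 1.0)
```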
  • The display unit 110 switches the target whose luminance or color is to be changed (the background image alone, or both the background image and the superimposed display image) depending on whether the background image is a captured image captured by the imaging unit 109 or something other than a captured image (for example, a white-painted image).
  • When the background image is a captured image, the main control unit 101 causes the display unit 110 to change and display at least one of the luminance and the color of only the background image.
  • When the background image is other than a captured image, the main control unit 101 causes the display unit 110 to change and display at least one of the luminance and the color of both the background image and the superimposed display image.
  • the main control unit 101 acquires the color parameter corresponding to the time input from the time measuring unit 115 from the memory 102, and sets the color parameter of the display image displayed by the display unit 110 based on the acquired color parameter.
  • the display unit 110 may change and display at least one of luminance and color according to an operation from the user 1000.
  • Further, the display unit 110 may learn past surrounding situations and the luminance and color parameters changed for them, and thereafter automatically change and display at least one of the luminance and the color according to the surrounding situation.
  • The voice input/output unit 104 includes a microphone that picks up the voice of the user 1000, which is converted into voice data via the voice processing unit 114, and an earphone that outputs voice based on the voice data input from the image/voice processing unit 108.
  • The communication unit 103 exchanges display images with the mobile terminal 300, the facility terminal 400, the server 500, and the like using various communication environments such as a mobile phone network, a wireless LAN, and short-range communication.
  • the communication unit 103 includes a wireless LAN unit, a mobile phone unit, a short-range communication unit, and the like.
  • As the short-range communication unit, WiFi (registered trademark), Bluetooth (registered trademark), and the like are applicable.
  • The position detection unit 106, which is a GPS (Global Positioning System) unit, receives radio waves from a plurality of positioning satellites orbiting the earth and detects the current position coordinates of the information display terminal 10 on the earth.
  • the power management unit 107 manages the battery that drives the head mounted display 100, monitors the state of the battery, and periodically detects the remaining amount.
  • The main control unit 101 performs pattern matching on the captured image captured by the imaging unit 109 based on the image data of the captured image and the image data stored in the memory 102. The main control unit 101 then determines, as a result of the pattern matching, whether image data that matches or approximates the image data of the image captured by the imaging unit 109 is stored in the memory 102. In this way, the main control unit 101 detects that a specific product is included in the captured image.
  • When the captured image includes a product, the main control unit 101 requests the server 500 for a superimposed display image corresponding to that product. Then, the main control unit 101 causes the display unit 110 to display the superimposed display image acquired from the server 500 together with the captured image.
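  • The sketch below illustrates this flow. OpenCV template matching stands in for the unspecified "pattern matching", and the server endpoint and all names are hypothetical assumptions.

```python
import cv2        # template matching as a stand-in for pattern matching
import requests

def detect_product(captured_gray, templates, threshold=0.8):
    """Return the id of the first stored template found in the frame.

    `captured_gray` and the templates are same-depth grayscale images;
    `templates` maps a product id to its reference image.
    """
    for product_id, template in templates.items():
        scores = cv2.matchTemplate(captured_gray, template,
                                   cv2.TM_CCOEFF_NORMED)
        if scores.max() >= threshold:
            return product_id
    return None

def fetch_overlay(product_id):
    """Request the superimposed display image for a product from the
    server (the endpoint URL is hypothetical)."""
    resp = requests.get(f"https://server.example/overlays/{product_id}")
    resp.raise_for_status()
    return resp.content
```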
  • The power management unit 132 manages the battery that drives the holder 130, monitors the state of the battery, and periodically detects the remaining amount.
  • The communication unit 131 transmits information read by the reader 133 (not shown) to the head mounted display 100.
  • the communication unit 131 corresponds to standards such as WiFi (registered trademark), BlueTooth (registered trademark), and LTE (Long Term Evolution).
  • The imaging unit 109 may have a camera imaging element that functions as a night vision camera or a heat sensing camera, such as a near-infrared camera. Even in this case, the imaging unit 109 can capture a normal visible light image.
  • the imaging unit 109 includes at least two imaging elements 113 and can capture a normal visible light captured image and a captured image by night vision imaging.
  • In this case, the audio/video processing unit 108 converts the image data input from the imaging unit 109 into a night vision image (an image in which the surroundings of the user 1000 can be confirmed even when they become dark). Then, the audio/video processing unit 108 causes the display unit 110 to display the processed night vision image. Specifically, the display unit 110 displays an image of a person included in an image captured by the night vision camera while changing at least one of luminance and color.
  • Further, the audio/video processing unit 108 synthesizes a normal visible light captured image and a captured image obtained by night vision imaging using a known image processing algorithm, and causes the display unit 110 to display the synthesized image. Accordingly, the display unit 110 can display a night vision image with improved visibility at night or in a dark environment. Moreover, since the normal visible light captured image and the night vision captured image are combined, the display unit 110 can more clearly display information that needs to be recognized, such as the colors of signs and signal lights.
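  • A minimal sketch of one way to perform such a synthesis follows: keep the visible-light frame's colors where they carry signal (for example signal lights and signs) and fall back to infrared luminance in dark regions. The blend rule and names are assumptions; the text only says a known algorithm is used.

```python
import numpy as np

def fuse(visible_bgr: np.ndarray, ir_gray: np.ndarray) -> np.ndarray:
    """Blend per pixel: bright visible pixels win, dark ones take IR."""
    vis_luma = visible_bgr.mean(axis=2, keepdims=True) / 255.0
    ir = np.repeat(ir_gray[..., None], 3, axis=2).astype(np.float32)
    weight = np.clip(vis_luma * 2.0, 0.0, 1.0)   # visibility weight
    fused = weight * visible_bgr + (1.0 - weight) * ir
    return fused.astype(np.uint8)
```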
  • The imaging unit 109 functioning as a night vision camera may be a dedicated night vision camera, or may have an imaging element 113 that senses light and waves of other wavelengths, for example, far infrared, ultraviolet, X-rays, terahertz waves, muons, yellow light, or 1500 nm infrared.
  • When the imaging unit 109 detects a specific wavelength, a head mounted display that can identify an approaching object at an intersection can be realized.
  • Likewise, a head mounted display that can identify blood vessels, lesions, cell mutations, and the like can be realized in the medical field.
  • When the imaging unit 109 detects a specific wavelength, it is also possible to realize a head mounted display that can identify moisture, bases, and the like by inspection.
  • When the imaging unit 109 detects a specific wavelength, it is possible to realize a head mounted display that can identify defects due to aging on an outer or inner wall of social infrastructure such as a tunnel or a pipe. Further, it is possible to realize a head mounted display that can identify compounds in the human body, foreign substances, oxides, contaminants, organic substances, organic compounds, and the like, or one that can identify brain waves, electromagnetic waves, cerebral blood flow, and the like. The imaging unit 109 may have these functions.
  • The imaging unit 109 may also be capable of imaging subjects from close range down to fine spaces by focus control such as a zoom function, an enlargement function, and close-up imaging.
  • FIG. 3A is a perspective view showing a state in which the information display terminal 10 is mounted.
  • FIG. 3B is a perspective view showing a state where the information display terminal 10 is detached.
  • the head mounted display 100 of the information display terminal 10 can be mounted on the head of the user 1000.
  • The head mounted display 100 has a temple portion 112 that rests on the ears of the user 1000 and a rim portion 123 that is arranged in front of the user 1000 when the head mounted display 100 is worn on the user's head.
  • One end of the temple portion 112 is connected to the rim portion 123, and the other end is connected to the cable 121.
  • The rim portion 123 is provided with the imaging unit 109 and the display unit 110.
  • the imaging unit 109 images the imaging region 202 in the line-of-sight direction of the user 1000 in a state where the head mounted display 100 is mounted on the user 1000's head.
  • the display unit 110 is disposed in front of the user 1000 with the head mounted display 100 mounted on the head of the user 1000.
  • the display unit 110 is configured by a transparent body, a half mirror, or a total reflection mirror.
  • The cable 121 is formed of a flexible member; specifically, the cable 121 is a flexible wire or a shape memory tube. The shape of the cable 121 can be changed by applying an external force, and the cable 121 maintains its shape after the change.
  • When the user 1000 is not using the head mounted display 100, as shown in FIG. 3(B), the user 1000 carries the information display terminal 10 with the cable 121 wound around the neck.
  • A pair of hook-and-loop fasteners or magnets may be provided at one end and the other end of the cable 121. In this case, the cable 121 is wound around the neck with the one end and the other end of the cable 121 in contact with each other.
  • the cable 121 includes an optical fiber, an electric wire, a hard cover, and the like.
  • the imaging unit 109 images the imaging area 202.
  • the display unit 110 is provided in the range of the virtual image display area 203 with respect to the imaging area 202.
  • the display unit 110 displays a captured image of the captured imaging region 202.
  • the display unit 110 may enlarge and display an image of an area in the viewpoint direction of the user in the imaging area 202.
  • the display unit 110 may be provided in a range of about 1/10 to 1/2 of the area of the imaging region.
  • the focal position of the virtual image may be varied, and this may be adjusted either automatically or manually.
  • For adjustment, a material having a refractive index different from that of air, such as glass or plastic, may be inserted into the optical axis to adjust the lens position or the optical path length of the optical module. It is also possible to change the size and angle of view of the virtual image by changing the lens or by changing the focal length with the zoom function.
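  • As a reminder of the standard optics behind this adjustment (the patent gives no formulas; the thin-lens relation below is textbook material, with f the focal length, d_o and d_i the object and image distances, and m the lateral magnification):

```latex
% Changing f (zoom) or d_o (lens position) moves the image distance d_i
% and rescales the virtual image by the magnification m.
\[
  \frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i},
  \qquad
  m = -\frac{d_i}{d_o}
\]
```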
  • As long as the head mounted display 100 includes the display unit 110 arranged in front of the user 1000 and the imaging unit 109 that captures the line-of-sight direction of the user 1000 while worn on the user's head, it may instead be a goggle type.
  • As a method for displaying a display image on the display unit 110, a method using a half mirror, a method realizing see-through display by deflecting light in only one direction using a mirror or a prism, or a projection method that draws a virtual image directly on the retina of the user 1000 may be applied.
  • the holder 130 includes a communication unit 131, a power management unit 132, and a reader 133.
  • the holder 130 detachably holds a terminal (for example, a portable terminal) held by the user.
  • a key mobile port may be provided in the holder 130.
  • the key mobile stores an authentication key, an access code, and a security code.
  • When a predetermined key mobile is inserted into the holder 130 and the user is thereby authenticated, the holder 130 and the head mounted display 100 become usable.
  • the holder 130 may be provided with fingerprint authentication or vein authentication. In this case, when the holder 130 has succeeded in fingerprint authentication or vein authentication, the holder 130 and the head mounted display 100 can be used.
  • the holder 130 itself may function as a key mobile.
  • the holder 130 may have a reader.
  • the reader is a barcode reader, an RFID reader, an imaging device that reads a QR code, or the like.
  • the reader reads information from an IC chip mounted on an ID card, for example, in a non-contact manner or in a contact manner using a communication coil, a transmission coil, an induction current coil, or the like.
  • the holder 130 can perform individual authentication, ID authentication, settlement, and the like.
  • a battery that supplies power to the head mounted display 100 via the cable 121 may be detachably attached to the other end of the cable 121.
  • an input device such as a keyboard, a mouse, or a touch pad may be detachably attached to the other end of the cable 121.
  • the input device transmits the input instruction received from the user 1000 to the head mounted display 100 via the cable 121.
  • FIG. 6 is a diagram showing an overview of the overall processing of the information display terminal 10 according to the first embodiment.
  • the overall processing according to Embodiment 1 starts when the display unit 110 starts displaying a display image, for example.
  • description will be made on the assumption that the imaging area 202 is captured by the imaging unit 109 and the display unit 110 displays a captured image captured by the imaging unit 109.
  • In step S601, the display unit 110 displays a display image based on the set color parameters and luminance.
  • In step S602, the color sensor 119 inputs the detected color to the main control unit 101, and the illuminance sensor 120 inputs the detected illuminance level to the main control unit 101.
  • In step S603, the main control unit 101 determines whether the surrounding situation has changed based on the illuminance level and the color input in S602. When the main control unit 101 determines that the surrounding situation has not changed (S603-No), the process returns to S601. On the other hand, when the main control unit 101 determines that the surrounding situation has changed (S603-Yes), the process proceeds to S604.
  • In step S604, the main control unit 101 acquires the luminance corresponding to the illuminance level input from the illuminance sensor 120 from the memory 102 and sets the acquired luminance as the luminance of the display image displayed on the display unit 110.
  • In step S605, the main control unit 101 acquires the color parameters corresponding to the color input from the color sensor 119 from the memory 102 and sets the color parameters of the display image displayed on the display unit 110 based on the acquired color parameters. After S605, the process returns to S601.
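  • The loop of FIG. 6 can be summarized by the following minimal sketch. The sensor and display objects and their methods are hypothetical stand-ins for the illuminance sensor 120, the color sensor 119, the display unit 110, and the FIG. 4/FIG. 5 table lookups; the patent does not define such an API.

```python
def display_loop(display, illuminance_sensor, color_sensor,
                 luminance_table, color_table):
    last = {"lux": None, "rgb": None}
    while True:
        display.render()                      # S601: show the display image
        lux = illuminance_sensor.read()       # S602: sensor inputs
        rgb = color_sensor.read()
        if lux == last["lux"] and rgb == last["rgb"]:
            continue                          # S603-No: back to S601
        display.set_luminance(luminance_table.lookup(lux))   # S604
        display.set_color_params(color_table.lookup(rgb))    # S605
        last = {"lux": lux, "rgb": rgb}       # then back to S601
```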
  • As described above, the display unit 110 displays information by changing at least one of luminance and color in accordance with the detected situation around the information display terminal 10.
  • Thereby, the visibility of information displayed on the display unit 110 can be improved.
  • For example, when the surroundings of the user 1000 become dark, the display unit 110 reduces the luminance of the display image. The display unit 110 can thus show an image matched to the surrounding darkness, preventing the user from finding the displayed image dazzling. Further, in the evening glow, in the morning glow, or against a blue background, the display unit 110 changes the color by inverting the color tone of the displayed image. Thereby, the visibility of the image displayed on the display unit 110 is improved.
  • Further, since the display unit 110 can display information by changing at least one of luminance and color based on the parameters corresponding to the current time measured by the time measuring unit 115, the visibility of the display image displayed by the display unit 110 can be improved according to the surrounding situation even without relying on the illuminance sensor or the like.
  • Further, since the display unit 110 displays the image of a person included in a captured image captured by the night vision camera while changing at least one of luminance and color, the visibility of the image of a person included in a captured image can be improved even when the surroundings are dark.
  • Further, when the background image is a captured image captured by the imaging unit 109, the display unit 110 displays the background image with at least one of its luminance and color changed. Thereby, the visibility of the display image displayed on the display unit 110 can be improved.
  • Further, when the background image is other than an image captured by the imaging unit 109, the display unit 110 displays the background image and the superimposed display information with at least one of luminance and color changed. Thereby, the visibility of the display image displayed on the display unit 110 can be improved in this case as well.

(Embodiment 2)
  • The second embodiment differs from the first embodiment in that the terminal has a front imaging unit that captures an imaging region in the user's gaze direction while mounted on the head, and a rear imaging unit that captures an imaging region in the direction opposite to the gaze direction.
  • The differences between the second embodiment and the first embodiment will be described below, mainly with reference to FIGS. 7 to 9.

<Configuration of the information display terminal>
  • FIG. 7 is a diagram illustrating an outline of a hardware configuration example of the information display terminal 10 according to the second embodiment.
  • FIG. 8 is a perspective view showing a state where the information display terminal 10 according to Embodiment 2 is mounted.
  • the information display terminal 10 includes a front imaging unit 710 and a rear imaging unit 720.
  • the front imaging unit 710 images the imaging region 202 in the user's line-of-sight direction while being attached to the head.
  • the image of the imaging region 202 in the user's line-of-sight direction captured by the front imaging unit 710 may be referred to as a front captured image.
  • The front imaging unit 710 inputs front captured image data for displaying the captured front image to the main control unit 101. The front captured image data input by the front imaging unit 710 is then stored in the memory 102 by the main control unit 101.
  • the rear imaging unit 720 images the imaging area 204 in the direction opposite to the viewing direction.
  • the image of the imaging region 204 in the direction opposite to the line-of-sight direction captured by the rear imaging unit 720 may be referred to as a rear captured image.
  • the rear imaging unit 720 inputs rear captured image data for displaying the captured rear captured image to the main control unit 101. Then, the rear captured image data input by the rear imaging unit 720 is stored in the memory 102 by the main control unit 101.
  • the main control unit 101 causes the display unit 110 to display a front captured image based on the input front captured image data. Further, the main control unit 101 causes the display unit 110 to display a rear captured image based on the input rear captured image data.
  • the display unit 110 displays at least one of a front captured image based on the front captured image data input to the main control unit 101 and a rear captured image based on the rear captured image data.
  • For example, every time the input device receives an input, the display unit 110 switches the displayed image from the front captured image to the rear captured image, or from the rear captured image to the front captured image.
  • Alternatively, the display unit 110 may divide the display area into two, displaying the front captured image in one display area and the rear captured image in the other display area.
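  • The two display modes just described can be sketched as follows (a minimal sketch; the class and frame layout are assumptions, with np.hstack standing in for the split display area):

```python
import numpy as np

class FrontRearDisplay:
    def __init__(self):
        self.show_front = True

    def on_input(self):
        # Each input from the input device toggles the displayed source.
        self.show_front = not self.show_front

    def frame(self, front, rear, split=False):
        if split:
            # Two display areas side by side: front left, rear right.
            return np.hstack([front, rear])
        return front if self.show_front else rear
```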
  • Alternatively, a single imaging unit capable of imaging both the imaging region 202 in the gaze direction and the imaging region 204 in the direction opposite to the gaze direction may be attached.
  • the front imaging unit 710 and the rear imaging unit 720 may have a camera imaging element that functions as a night vision camera or a heat sensing camera, such as a near infrared camera.
  • the front imaging unit 710 and the rear imaging unit 720 may be dedicated night vision cameras.
  • The front imaging unit 710 and the rear imaging unit 720 may also have an imaging element 113 that detects light and waves of other wavelengths, for example, far infrared, ultraviolet, X-rays, terahertz waves, muons, yellow light, or 1500 nm infrared.
  • The information display terminal 10 may further include an imaging unit that captures the user's right direction and an imaging unit that captures the user's left direction.
  • In this case, the audio/video processing unit 108 may generate an omnidirectional image, in which all directions can be confirmed, by synthesizing the images captured by the front imaging unit 710, the rear imaging unit 720, the imaging unit that captures the user's right direction, and the imaging unit that captures the user's left direction.
  • The display unit 110 then displays the omnidirectional image.
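  • A minimal sketch of such a composition is shown below. Real omnidirectional synthesis needs lens calibration and seam blending; naive side-by-side tiling stands in for the unspecified synthesis, and all names are assumptions.

```python
import numpy as np

def surround_view(front, right, rear, left):
    """Tile four equally sized frames into a single 360-degree strip."""
    return np.hstack([left, front, right, rear])
```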
  • Thereby, the user 1000 can see the image behind him or her while driving a bicycle, a motorcycle, a car, or the like.
  • The main control unit 101 may analyze, by a known technique, the front captured image data and the rear captured image data input by the front imaging unit 710 and the rear imaging unit 720 to determine whether danger is imminent to the user 1000 wearing the information display terminal 10. For example, when it is detected that an object such as a motorcycle, a car, or a bicycle is approaching the user 1000 wearing the information display terminal 10, the main control unit 101 determines that the user 1000 is in danger.
  • the information display terminal 10 may have sonar. In this case, the information display terminal 10 emits a sound wave and detects that the object is approaching based on the sound wave reflected from the object. The information display terminal 10 may detect the approach of an object using infrared rays. Further, the information display terminal 10 may include a vibration sensor that detects air vibration. In this case, the information display terminal 10 detects that an object is approaching based on the vibration of air detected by the vibration sensor.
  • When the main control unit 101 determines that danger is imminent to the user 1000, it notifies the user 1000 that danger is imminent by controlling the display unit 110.
  • the main control unit 101 causes the display unit 110 to display a single red background image.
  • Alternatively, the main control unit 101 may cause the display unit 110 to display characters or marks indicating that danger is imminent as the superimposed display information. Further, the main control unit 101 may cause the voice input/output unit 104 to output a sound indicating that danger is imminent.
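  • One possible "known technique" for the approach check described above is sketched below: if the changed region between consecutive rear frames covers a large share of the image, treat it as an object closing in. The thresholds and names are hypothetical.

```python
import cv2

def danger_from_motion(prev_gray, curr_gray, area_ratio=0.15):
    """True if frame-to-frame change covers a large share of the image."""
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    moving = cv2.countNonZero(mask)
    return moving / mask.size > area_ratio
```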
  • the rear imaging unit 720 may be attached to a helmet that is worn when the user 1000 rides on a motorcycle. In this case, the rear imaging unit 720 attached to the helmet transmits rear captured image data to the information display terminal 10 via wireless communication.
  • the rear imaging unit 720 attached to the helmet and the head mounted display 100 of the information display terminal 10 may be connected to each other via a cable. In this case, the rear imaging unit 720 transmits rear captured image data to the information display terminal 10 via a cable.
  • FIG. 9 is a diagram showing an overview of the overall processing of the information display terminal 10 according to the second embodiment.
  • the overall processing according to Embodiment 2 starts when the display unit 110 starts displaying a display image, for example.
  • the following description is based on the assumption that the front imaging unit 710 images the imaging area 202 and the rear imaging unit 720 images the imaging area 204.
  • In step S901, the front imaging unit 710 inputs the front captured image data of the captured front image to the main control unit 101. Further, the rear imaging unit 720 inputs the rear captured image data of the captured rear image to the main control unit 101.
  • In step S902, the main control unit 101 analyzes at least one of the front captured image data and the rear captured image data input in step S901 to determine whether danger is imminent to the user 1000.
  • When the main control unit 101 determines that the user 1000 is not in danger (S902-No), the process proceeds to S904.
  • When the main control unit 101 determines that danger is imminent to the user 1000 (S902-Yes), the process proceeds to S903.
  • In step S903, the main control unit 101 controls the display unit 110 to notify the user 1000 that danger is imminent. After S903, the process returns to S901.
  • In step S904, the display unit 110 displays at least one of a front captured image based on the front captured image data input in S901 and a rear captured image based on the rear captured image data. After S904, the process returns to S901.
  • As described above, since the display unit 110 displays a captured image captured by at least one of the front imaging unit and the rear imaging unit, the user can check the situation in front of or behind himself or herself and avoid danger.

(Embodiment 3)
  • While the user 1000 wearing the information display terminal 10 is moving, the head of the user 1000 may shake.
  • When the head shakes, the imaging unit 109 included in the information display terminal 10 also shakes.
  • When the captured image captured by the imaging unit 109 is distributed via wireless communication to the information display terminal 140, the portable terminal 300, the facility terminal 400, and the like (hereinafter referred to as external terminals), the captured image displayed on the external terminal may therefore shake, and a person who views it may suffer motion sickness.
  • An object of the third embodiment is to provide a technique that enables an image that can be viewed comfortably to be displayed on the information display terminal 140, the portable terminal 300, the facility terminal 400, and the like even when the imaging unit 109 of the information display terminal 10 shakes together with the head of the user 1000.
  • FIG. 10 is a diagram showing an overview of the overall processing of the information display terminal 10 according to the third embodiment.
  • the overall processing according to Embodiment 3 starts when the imaging unit 109 starts imaging the imaging region 202, for example.
  • the following description is based on the assumption that the inclination of the head mounted display 100 detected by the attitude sensor 105 is periodically input to the main control unit 101. Further, the description will be made on the assumption that captured image data of a captured image captured by the imaging unit 109 is periodically input to the main control unit 101.
  • In step S1001, the main control unit 101 determines, by a known technique, whether the head mounted display 100 is shaking based on the tilt of the head mounted display 100 input from the attitude sensor 105.
  • When the main control unit 101 determines that the head mounted display 100 is not shaking (S1001-No), the process returns to S1001.
  • When the main control unit 101 determines that the head mounted display 100 is shaking (S1001-Yes), the process proceeds to S1002.
  • In step S1002, the main control unit 101 determines the degree of shaking based on the captured image data of the captured image before the shaking and the captured image data of the captured image input from the imaging unit 109 after the shaking. Specifically, the pixel movement vectors and moving speed of the captured image data after the shaking are analyzed, by a known technique, with respect to the captured image data before the shaking.
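  • The pixel movement analysis can be sketched with dense optical flow as the "known technique" (a minimal sketch; the parameter values are illustrative):

```python
import cv2

def shake_vector(before_gray, after_gray):
    """Mean pixel displacement (dx, dy) between the two frames."""
    flow = cv2.calcOpticalFlowFarneback(
        before_gray, after_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    return flow[..., 0].mean(), flow[..., 1].mean()
```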
  • In step S1003, the main control unit 101 causes the image/sound processing unit 108 to extract the core region image from the captured image data of the captured image before the shaking, and likewise from the captured image data of the captured image after the shaking. For example, the main control unit 101 extracts each core region image based on the degree of shaking analyzed in S1002. Specifically, as shown in FIG. 11, the main control unit 101 causes the audio/video processing unit 108 to extract the core region images 602 and 604 that remain imaged even while the head mounted display 100 is shaking. That is, the audio/video processing unit 108 extracts, from the captured image before the shaking and from the captured image after the shaking respectively, the core region image that is included in both images.
  • For example, the main control unit 101 causes the audio/video processing unit 108 to extract the core region image 602 from the captured image 601 before the shaking, and the core region image 604 from the captured image 603 after the shaking.
  • In step S1004, the main control unit 101 causes the audio/video processing unit 108 to generate corrected image data for displaying the corrected images 605 and 606 by deleting the images around the core region images 602 and 604 extracted in S1003.
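  • Extracting the core region can then be sketched as cropping the area that both frames contain, derived from the estimated displacement (dx, dy); the function below is a minimal sketch with assumed names.

```python
def core_region(frame, dx, dy):
    """Crop the part of `frame` shared with a copy shifted by (dx, dy)."""
    h, w = frame.shape[:2]
    x0, x1 = max(0, int(dx)), w + min(0, int(dx))
    y0, y1 = max(0, int(dy)), h + min(0, int(dy))
    return frame[y0:y1, x0:x1]
```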
  • the main control unit 101 inputs the generated corrected image data to the communication unit 103.
  • In step S1005, the communication unit 103 transmits the corrected image data generated in S1004 to the external terminal. Note that the communication unit 103 transmits the corrected image data to the external terminal only while it detects that the communication session with the external terminal is open; once the communication session with the external terminal has ended, the communication unit 103 naturally does not transmit the corrected image data.
  • In step S1006, the external terminal receives the corrected image data transmitted in S1005 and displays the corrected images 605 and 606 based on the received corrected image data. After S1006, the process returns to S1001.
  • As described above, since the communication unit 103 transmits to the external terminal the core region image data for displaying the core region images 602 and 604 extracted by the audio/video processing unit 108, corrected image data that can be viewed comfortably can be displayed on the external terminal even when the imaging unit 109 shakes.
  • The external terminal cannot display the entire captured image, but it can display a corrected image that can be viewed comfortably. Therefore, through the external terminal, others can share the travel and experiences of the user 1000 wearing the information display terminal 10, sports watching, book searches, information searches, conferences, and the like without feeling motion sickness.
  • DESCRIPTION OF SYMBOLS: 10, 140 ... information display terminal; 100 ... head mounted display; 101 ... main control unit; 102 ... memory; 103 ... communication unit; 104 ... voice input/output unit; 105 ... attitude sensor; 106 ... position detection unit; 107 ... power management unit; 108 ... image audio processing unit; 109 ... imaging unit; 110 ... display unit; 111 ... sensor unit; 112 ... temple portion; 113 ... imaging element; 114 ... audio processing unit; 115 ... time measuring unit; 119 ... color sensor; 120 ... illuminance sensor; 121 ... cable; 130 ... holder; 131 ... communication unit; 132 ... power management unit; 133 ... reader; 200 ... network; 300 ... mobile terminal; 400 ... facility terminal; 500 ... server; 710 ... front imaging unit; 720 ... rear imaging unit; 1000 ... user.

Abstract

The invention relates to an information display terminal that can be worn on a user's head, the information display terminal having a display unit that is arranged in front of the user's eyes when the terminal is worn on the user's head, and the display unit displays information by changing at least the luminance or the color of the information according to the detected circumstances of the environment surrounding the information display terminal, thereby improving the visibility of the information displayed by the display unit.
PCT/JP2015/052486 2015-01-29 2015-01-29 Information display terminal and information display method WO2016121049A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/052486 WO2016121049A1 (fr) 2015-01-29 2015-01-29 Information display terminal and information display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/052486 WO2016121049A1 (fr) 2015-01-29 2015-01-29 Information display terminal and information display method

Publications (1)

Publication Number Publication Date
WO2016121049A1 (fr)

Family

ID=56542701

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/052486 WO2016121049A1 (fr) 2015-01-29 2015-01-29 Information display terminal and information display method

Country Status (1)

Country Link
WO (1) WO2016121049A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005094615A (ja) * 2003-09-19 2005-04-07 Sanyo Electric Co Ltd Camera shake correction device, camera shake correction method, and computer-readable recording medium storing a camera shake correction program
JP2008096868A (ja) * 2006-10-16 2008-04-24 Sony Corp Imaging display device and imaging display method
JP2011071884A (ja) * 2009-09-28 2011-04-07 Brother Industries Ltd Work support system

Similar Documents

Publication Publication Date Title
US11344196B2 (en) Portable eye tracking device
US10686972B2 (en) Gaze assisted field of view control
TWI597623B (zh) Wearable behavior-based vision system
JP6030582B2 (ja) Optical device for individuals with visual impairment
CN103091843B (zh) See-through display brightness control
CA2750287C (fr) Gaze detection in a see-through, near-eye, mixed reality display
CN104838326B (zh) Wearable food nutrition feedback system
KR20180096434A (ko) Method for displaying a virtual image, storage medium, and electronic device therefor
JP2013521576A (ja) Local advertising content on an interactive head-mounted eyepiece
KR20160048801A (ko) Method and system for augmented reality
KR20140059213A (ko) Head mounted display using iris scan profiling
US11830494B2 (en) Wearable speech input-based vision to audio interpreter
US20210390882A1 (en) Blind assist eyewear with geometric hazard detection
CN110389447B (zh) Transmissive head-mounted display device, assistance system, display control method, and medium
JP2020077271A (ja) Display device, learning device, and control method of display device
US20220365354A1 (en) Segmented illumination display
JP2017146726A (ja) Movement support device and movement support method
CN117321547A (zh) Contextual visual and voice search from an electronic eyewear device
WO2016121049A1 (fr) Information display terminal and information display method
KR20180116044A (ko) Augmented reality device and augmented reality output method thereof
US11792371B2 (en) Projector with field lens
CN109814719B (zh) Method and device for displaying information based on wearable glasses
US20240045214A1 (en) Blind assist glasses with remote assistance

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15879946

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 15879946

Country of ref document: EP

Kind code of ref document: A1