WO2017073014A1 - Wearable display, image display device, and image display system - Google Patents


Info

Publication number
WO2017073014A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
display
information
image information
control unit
Application number
PCT/JP2016/004319
Other languages
French (fr)
Japanese (ja)
Inventor
博隆 石川
孝文 朝原
岩津 健
健 渋井
鈴木 謙治
智英 田辺
Original Assignee
ソニー株式会社
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2017073014A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/34 Route searching; Route guidance
    • G01C 21/36 Input/output arrangements for on-board computers
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/64 Constructional details of receivers, e.g. cabinets or dust covers

Definitions

  • the present technology relates to a wearable display, an image display device, and an image display system capable of displaying an image including specific information in a display visual field.
  • Patent Document 1 describes a see-through type head mounted display (HMD: Head Mounted Display) that can superimpose and display an image (AR object) associated with an object existing in real space.
  • an object of the present technology is to provide a wearable display, an image display device, and an image display system capable of preferentially presenting information presumed to be useful to a user.
  • a wearable display includes a display unit and a control unit.
  • the display unit is configured to be attachable to the user, and includes a display area that provides the user with a real field of view.
  • the control unit extracts, on the basis of user information including information related to the position and behavior of the user and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user, the image information related to a specific object existing in the moving direction of the user from the plurality of pieces of image information, and presents the extracted image information in the display area.
  • because the wearable display is configured to extract image information related to a specific object existing in the moving direction of the user from the plurality of pieces of image information and to present it in the field of view, information presumed to be useful to the user can be preferentially presented.
  • the user information may include map information around the position of the user, and the control unit may be configured to extract, from the plurality of pieces of image information, image information related to a specific object existing along a road on which the user travels. Thereby, only information related to the road on which the user is currently traveling or passing can be presented to the user.
  • the control unit may acquire the moving speed of the user as the user information and, when the moving speed is equal to or lower than a predetermined value, present in the display area image information related to a specific object existing within a second distance shorter than the first distance from the user. In this way, the display range of the images may be changed according to the moving speed of the user.
  • the control unit may be configured to be able to present an image including the map information and the image information in the display area.
  • the map information may be displayed as an overhead view or a bird's-eye view that includes the image information.
  • the control unit may be configured to acquire the user's movement acceleration as the user information and to change the display mode of the image information presented in the display area in accordance with the magnitude of the movement acceleration.
  • this allows the control unit to intentionally reduce the visibility of the image information presented in the display area so that the user's attention is directed to the real field of view rather than to the image.
  • the control unit may present the image information in the display area at a first luminance when the user's movement acceleration is equal to or less than a predetermined value, and at a second luminance lower than the first luminance when the movement acceleration exceeds the predetermined value.
  • the control unit may gradually decrease the second luminance as the user's movement acceleration increases.
  • the control unit may be configured to change at least one of the position, size, and amount of information of the image information presented in the display area according to the magnitude of the acceleration.
  • the wearable display may further include a detection unit capable of detecting an acceleration acting on the display unit as the user information.
  • the detection unit may be configured by a motion sensor such as an acceleration sensor or an angular velocity sensor, or by, for example, a camera installed in the display unit.
  • An image display device includes a display unit and a control unit.
  • the display unit has a display area.
  • the control unit extracts, on the basis of user information including information related to a user's position and behavior and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user, image information related to a specific object existing in the moving direction of the user from the plurality of pieces of image information, and presents the extracted image information in the display area.
  • An image display system includes a display unit, an acquisition unit, and a control unit.
  • the display unit has a display area.
  • the acquisition unit acquires user information including information related to a user's position and behavior, and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user.
  • the control unit extracts image information related to a specific object existing in the moving direction of the user from the plurality of image information, and presents the extracted image information in the display area.
  • Brief description of the drawings: an overall view of a system including the HMD; a block diagram showing the configuration of the system; a functional block diagram of the control unit in the HMD; and an explanatory diagram of coordinate positions in the cylindrical coordinate system used in the HMD.
  • FIG. 1 is a schematic diagram illustrating functions of a head mounted display (hereinafter referred to as “HMD”) as a wearable display according to an embodiment of the present technology.
  • an X-axis direction and a Y-axis direction indicate horizontal directions orthogonal to each other, and a Z-axis direction indicates a vertical axis direction.
  • this XYZ orthogonal coordinate system represents the coordinate system (real three-dimensional coordinate system) of the real space to which the user belongs; the X-axis arrow indicates the north direction, and the Y-axis arrow indicates the east direction.
  • the Z-axis arrow indicates the direction of gravity.
  • the HMD 100 is mounted on the head of the user U and is configured to be able to display a virtual image (AR object; hereinafter also simply referred to as an object) in the field of view V (display field of view) of the user U.
  • the objects displayed in the field of view V include information related to specific objects (A1, A2, A3, A4, ...; hereinafter collectively referred to as the specific object A unless individually described) and information not related to the specific object A.
  • the specific object A corresponds to, for example, scenery, shops, and products existing around the user U, as well as transportation terminals such as bus stops, stations, and airports.
  • for example, an object B10 indicating that a coupon is available at a specific store A10 is displayed in the field of view V.
  • the information not related to the specific object A includes, for example, information related to a route to a destination set by the user; in that case, an object B20 composed of an "arrow" indicating the traveling direction along the road or passage is displayed in accordance with the orientation of the display unit 10, as schematically shown in the figure.
  • besides these, a menu screen for setting the functions of the HMD 100, a pattern authentication screen, and the like may also be displayed.
  • the HMD 100 holds objects (B1, B2, B3, B4, ...; hereinafter collectively referred to as the object B unless individually described) in a virtual world coordinate system surrounding the user U wearing the HMD.
  • the world coordinate system is a coordinate system equivalent to the real space to which the user belongs, and determines the position of the specific object A based on the position of the user U and a predetermined axial direction.
  • in the present embodiment, cylindrical coordinates C0 with the vertical axis as the center axis are adopted as the world coordinate system, but other three-dimensional coordinates such as celestial coordinates centered on the user U may be adopted.
  • the radius R and height H of the cylindrical coordinates C0 can be set arbitrarily.
  • the radius R is set shorter than the distance from the user U to the specific object A, but may be longer than the distance.
  • the height H is set to a size equal to or greater than the height (length in the vertical direction) Hv of the user's U visual field V provided via the HMD 100.
  • the object B includes information related to the specific object A existing in the world coordinate system or information not related to the specific object A.
  • the object B may be an image including characters and pictures or an animation image.
  • the object B may be a two-dimensional image or a three-dimensional image.
  • the shape of the object B may be any geometric shape, such as a rectangle or a circle, and can be set appropriately depending on the type (attribute) of the object B and the display content.
  • the coordinate position of the object B on the cylindrical coordinates C0 is associated with, for example, the intersection position between the user's eye line L watching the specific object A and the cylindrical coordinates C0.
  • in the illustrated example, the center position of each of the objects B1 to B4 is made to coincide with the intersection position; however, the present technology is not limited to this, and a part of the periphery of each object (for example, a part of its four corners) may be made to coincide with the intersection position, or the coordinate positions of the objects B1 to B4 may be associated with an arbitrary position away from the intersection position.
  • the cylindrical coordinates C0 have a circumferential coordinate axis (θ) representing the angle around the vertical axis with the north direction as 0°, and a height coordinate axis (h) representing the angle in the up-down direction with reference to the horizontal line of sight Lh of the user U. The coordinate axis (θ) takes eastward as the positive direction, and the coordinate axis (h) takes the depression angle as positive and the elevation angle as negative.
  • the HMD 100 has a detection unit for detecting the viewpoint direction of the user U and, based on the output of the detection unit, determines which region on the cylindrical coordinates C0 the field of view V of the user U corresponds to. When any object (for example, object B1) exists in the corresponding region of the xy coordinate system that forms the field of view V, the HMD 100 presents (draws) the object B1 in that region of the field of view V.
  • in this way, the HMD 100 of the present embodiment provides the user U with information related to the specific object A1 by displaying the AR object (B1) in the field of view V superimposed on the specific object A1 in the real space. Further, the HMD 100 presents to the user U the AR objects (B1 to B4) related to the predetermined specific objects A1 to A4 according to the azimuth or direction of the viewpoint of the user U.
  • FIG. 4 is an overall view showing the HMD 100, and FIG. 5 is a block diagram showing its configuration.
  • the HMD 100 includes a display unit 10, a detection unit 20 that detects the attitude of the display unit 10, and a control unit 30 that controls driving of the display unit 10.
  • the HMD 100 is configured as a see-through HMD that can provide a user with a visual field V in real space.
  • the display unit 10 is configured to be attachable to the user U's head.
  • the display unit 10 includes first and second display surfaces 11R and 11L, first and second image generation units 12R and 12L, and a support body 13.
  • the first and second display surfaces 11R and 11L are configured by optical elements having light-transmissive display regions 110 capable of providing a real-space field of view (external field of view) to the right eye and the left eye of the user U, respectively.
  • the first and second image generation units 12R and 12L are configured to be able to generate images to be presented to the user U via the first and second display surfaces 11R and 11L, respectively.
  • the support 13 supports the display surfaces 11R and 11L and the image generation units 12R and 12L, and has a shape suitable for being worn on the user's head such that the first and second display surfaces 11R and 11L face the right eye and the left eye of the user U, respectively.
  • the display unit 10 configured as described above provides the user U, via the display surfaces 11R and 11L, with a field of view V in which a predetermined image (or virtual image) is superimposed on the real space.
  • in this case, cylindrical coordinates C0 for the right eye and cylindrical coordinates C0 for the left eye are set, and the objects drawn on the respective cylindrical coordinates are projected onto the display regions 110 of the display surfaces 11R and 11L.
  • the detection unit 20 is configured to be able to detect a change in orientation or posture around at least one axis of the display unit 10.
  • the detection unit 20 is configured to detect a change in orientation or posture of the display unit 10 around the X, Y, and Z axes.
  • the orientation of the display unit 10 typically means the front direction of the display unit 10.
  • in the present embodiment, the orientation of the display unit 10 is treated as the direction of the face of the user U.
  • the detection unit 20 includes a motion sensor, such as an acceleration sensor or an angular velocity sensor, capable of detecting accelerations in the three axial directions acting on the display unit 10, and an illuminance sensor or the like capable of detecting the illuminance (brightness) around the display unit 10.
  • the detection unit 20 may also include an image sensor (camera) and may be configured to detect the acceleration and moving speed acting on the display unit 10, the surrounding illuminance, and the like.
  • the detection unit 20 may be configured by a sensor unit in which each of the angular velocity sensor and the acceleration sensor is arranged in three axial directions, or may use different sensors depending on the respective axes.
  • for example, an integrated value of the output of the angular velocity sensor can be used to obtain the posture change of the display unit 10, together with the direction and amount of that change.
  • a geomagnetic sensor may be employed for detecting the orientation of the display unit 10 around the vertical axis (Z axis).
  • alternatively, the geomagnetic sensor and the motion sensor may be combined; as a result, the orientation or orientation change of the display unit 10 can be detected with high accuracy.
  • the detection unit 20 is arranged at an appropriate position on the display unit 10.
  • the position of the detection unit 20 is not particularly limited.
  • the detection unit 20 is disposed on one of the image generation units 12R and 12L or a part of the support 13.
  • the number of detection units 20 is not limited to one, and a plurality of detection units 20 may be arranged at appropriate positions on the display unit 10.
  • the control unit 30 generates a control signal for controlling the driving of the display unit 10 (image generation units 12R and 12L) based on the output of the detection unit 20.
  • the control unit 30 is electrically connected to the display unit 10 via a connection cable 30a.
  • the present invention is not limited to this, and the control unit 30 may be connected to the display unit 10 through a wireless communication line.
  • the control unit 30 includes a CPU 301, a memory 302 (storage unit), a transmission/reception unit 303, an internal power supply 304, and an input operation unit 305.
  • the CPU 301 controls the operation of the entire HMD 100.
  • the memory 302 includes a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, and stores programs for the CPU 301 to control the HMD 100, various parameters, images (objects) to be displayed on the display unit 10, and other necessary data.
  • the transmission / reception unit 303 constitutes an interface for communication with the portable information terminal 200 described later.
  • the internal power supply 304 supplies power necessary for driving the HMD 100.
  • the input operation unit 305 is used to control the images displayed on the display unit 10 by user operation.
  • the input operation unit 305 may be configured with a mechanical switch or a touch sensor.
  • the input operation unit 305 may be provided in the display unit 10.
  • the HMD 100 may further include a sound output unit such as a speaker, a camera, and the like.
  • the audio output unit and the camera are typically provided in the display unit 10.
  • the control unit 30 may be provided with a display device that displays an input operation screen or the like of the display unit 10.
  • the input operation unit 305 may be configured by a touch panel provided in the display device.
  • the portable information terminal 200 is configured to be able to communicate with the control unit 30 via a wireless communication line.
  • the portable information terminal 200 has a function (acquisition unit) for acquiring an image (object) to be displayed on the display unit 10 and a function for transmitting the acquired image (object) to the control unit 30.
  • the portable information terminal 200 is organically combined with the HMD 100 to construct an HMD system (image display system).
  • the portable information terminal 200 is carried by the user U wearing the display unit 10 and is configured by an information processing device such as a personal computer (PC: Personal Computer), a smartphone, a mobile phone, a tablet PC, or a PDA (Personal Digital Assistant).
  • alternatively, a terminal device dedicated to the HMD 100 may be used.
  • the portable information terminal 200 includes a CPU 201, a memory 202, a transmission / reception unit 203, an internal power supply 204, a display unit 205, a camera 206, and a position information acquisition unit 207.
  • the CPU 201 controls the operation of the mobile information terminal 200 as a whole.
  • the memory 202 includes a ROM, a RAM, and the like, and stores programs and various parameters for executing control of the portable information terminal 200 by the CPU 201, images (objects) transmitted to the control unit 30, and other necessary data.
  • the internal power supply 204 supplies power necessary for driving the portable information terminal 200.
  • the transmission/reception unit 203 connects to the server N, the control unit 30, and other nearby portable information terminals using a wireless LAN (IEEE 802.11 or the like) such as WiFi (Wireless Fidelity), or a 3G or 4G mobile communication network.
  • the portable information terminal 200 downloads an image (object) to be transmitted to the control unit 30 from the server N via the transmission / reception unit 203, an application for displaying the image, and the like, and stores them in the memory 202.
  • the display unit 205 is composed of, for example, an LCD or an OLED panel and displays various menus, application GUIs, and the like. Typically, the display unit 205 is integrated with a touch panel and can accept the user's touch operations.
  • the portable information terminal 200 is configured to be able to input a predetermined operation signal to the control unit 30 by a touch operation on the display unit 205.
  • the location information acquisition unit 207 typically includes a GPS (Global Positioning System) receiver.
  • the portable information terminal 200 measures the current position (longitude, latitude, altitude) of the user U (display unit 10) with the position information acquisition unit 207, calculates the moving speed of the user U (display unit 10), and can acquire necessary images (objects) and map information such as road maps from the server N. That is, the server N acquires information on the current position and behavior of the user (user information) and transmits image data, application software, map information, and the like corresponding to that information to the portable information terminal 200.
  • the server N is typically configured by a computer including a CPU, a memory, and the like, and transmits predetermined information to the portable information terminal 200 in response to a request from the user U, or automatically regardless of the intention of the user U.
  • the server N stores a plurality of types of image data that can be displayed by the HMD 100.
  • the server N is configured to be able to transmit a plurality of pieces of image data, selected according to the position, movement, and the like of the user U, to the portable information terminal 200 either collectively or sequentially as part of the predetermined information.
  • in the present embodiment, the portable information terminal 200 transmits to the control unit 30 image information on AR objects related to a plurality of specific objects existing in a peripheral region within a predetermined distance from the current position of the user U (display unit 10).
  • the predetermined distance may be fixed or variable; typically, the portable information terminal 200 instructs the server N of the predetermined distance, either voluntarily or based on an instruction from the control unit 30.
  • FIG. 6 is a functional block diagram of the CPU 301.
  • the CPU 301 includes a coordinate setting unit 311, an image management unit 312, a coordinate determination unit 313, an image extraction unit 314, and a display control unit 315.
  • the CPU 301 executes processing in the coordinate setting unit 311, the image management unit 312, the coordinate determination unit 313, the image extraction unit 314, and the display control unit 315 according to the program stored in the memory 302.
  • the coordinate setting unit 311 is configured to execute processing for setting three-dimensional coordinates surrounding the user U (display unit 10).
  • cylindrical coordinates C0 (see FIG. 1) centered on the vertical axis Az are used as the three-dimensional coordinates.
  • the coordinate setting unit 311 sets the radius R and the height H of the cylindrical coordinates C0.
  • the coordinate setting unit 311 typically sets the radius R and the height H of the cylindrical coordinates C0 according to the number and type of objects to be presented to the user U.
  • the radius R of the cylindrical coordinates C0 may be a fixed value, but may be a variable value that can be arbitrarily set according to the size (pixel size) of the image to be displayed.
  • the height H of the cylindrical coordinates C0 is set to, for example, one to three times the height Hv (see FIG. 1) of the field of view V in the vertical (up-down) direction provided to the user U by the display unit 10.
  • the upper limit of the height H is not limited to three times Hv and may exceed three times Hv.
  • the cylindrical coordinates C0 have a circumferential coordinate axis (θ) representing the angle around the vertical axis with the north direction as 0°, and a height coordinate axis (h) representing the angle in the up-down direction with reference to the horizontal line of sight Lh of the user U. The coordinate axis (θ) takes eastward as the positive direction, and the coordinate axis (h) takes the depression angle as positive and the elevation angle as negative.
  • the image management unit 312 has a function of managing the images stored in the memory 302; for example, it is configured to execute processing for storing in the memory 302 one or more images to be displayed via the display unit 10 and for selectively deleting images stored in the memory 302. The images stored in the memory 302 are transmitted from the portable information terminal 200. The image management unit 312 also requests the portable information terminal 200, via the transmission/reception unit 303, to transmit images.
  • the memory 302 is configured to store one or a plurality of images (objects) to be displayed in the visual field V in association with the cylindrical coordinates C0. That is, the memory 302 stores the individual objects B1 to B4 on the cylindrical coordinates C0 shown in FIG. 1 together with the coordinate positions on the cylindrical coordinates C0.
  • that is, each of the objects B1 to B4 to be displayed corresponding to the azimuth or orientation of the field of view V occupies a specific coordinate region on the cylindrical coordinates C0 and is stored in the memory 302 together with a specific coordinate position P(θ, h) in that region.
  • the coordinates (θ, h) of the objects B1 to B4 on the cylindrical coordinates C0 correspond to the cylindrical-coordinate-system coordinates of the intersections between the cylindrical surface of the cylindrical coordinates C0 and the straight lines connecting the position of the user with the positions of the specific objects A1 to A4 defined in the orthogonal coordinate system (X, Y, Z). That is, the coordinates of the objects B1 to B4 correspond to the coordinates of the specific objects A1 to A4 converted from the real three-dimensional coordinates to the cylindrical coordinates C0.
  • such coordinate conversion of the objects is executed in the image management unit 312, and each object is stored in the memory 302 together with its coordinate position.
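  • As an illustration of this conversion, the following is a minimal sketch in Python (the patent itself contains no code). It assumes the axis conventions given earlier: X = north, Y = east, Z = direction of gravity, θ = 0° at north with eastward positive, and h positive for depression angles; all function and variable names are illustrative.

```python
import math

def to_cylindrical(user_pos, obj_pos):
    """Project a specific object's real-space position (X, Y, Z) onto the
    cylindrical coordinates C0 centered on the user, returning (theta, h)
    in degrees. A sketch, not the patent's own implementation."""
    dx = obj_pos[0] - user_pos[0]  # northward offset (X arrow = north)
    dy = obj_pos[1] - user_pos[1]  # eastward offset (Y arrow = east)
    dz = obj_pos[2] - user_pos[2]  # downward offset (Z arrow = gravity)

    # Angle around the vertical axis: 0 deg at north, eastward positive.
    theta = math.degrees(math.atan2(dy, dx)) % 360.0

    # Height-direction angle relative to the horizontal line of sight Lh:
    # depression (object below eye level) positive, elevation negative.
    h = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return theta, h
```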
  • the coordinate positions of the objects B1 to B4 may be set to any position within the display region of each object: one specific point (for example, the center position) may be set for each object, or two or more points (for example, two diagonal points or four corner points) may be set.
  • further, when the coordinate positions of the objects B1 to B4 are associated with the intersections of the user's line of sight L looking at the specific objects A1 to A4 with the cylindrical coordinates C0, the user U visually perceives the objects B1 to B4 at positions overlapping the specific objects A1 to A4.
  • the coordinate positions of the objects B1 to B4 may be associated with an arbitrary position away from the intersection position.
  • thereby, the objects B1 to B4 can be displayed or drawn at desired positions relative to the specific objects A1 to A4.
  • the coordinate determination unit 313 is configured to execute a process of determining which region on the cylindrical coordinates C0 corresponds to the field of view V of the user U based on the output of the detection unit 20. That is, the visual field V moves on the cylindrical coordinates C0 due to a change in the posture of the user U (display unit 10), and the movement direction and movement amount are calculated based on the output of the detection unit 20.
  • the coordinate determination unit 313 calculates the movement direction and movement amount of the display unit 10 based on the output of the detection unit 20, and determines to which region on the cylindrical coordinates C0 the visual field V belongs.
  • FIG. 8 is a development view of the cylindrical coordinates C0 conceptually showing the relationship between the visual field V on the cylindrical coordinates C0 and the objects B1 to B4.
  • the field of view V is substantially rectangular and has xy coordinates (local coordinates) with the upper left corner as the origin OP2.
  • the x axis is an axis extending in the horizontal direction from the origin OP2, and the y axis is an axis extending in the vertical direction from the origin OP2.
  • the coordinate determination unit 313 is configured to execute processing for determining whether any of the objects B1 to B4 exists in the corresponding region of the visual field V.
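  • A minimal sketch of this determination follows, under the same assumed conventions as above (angles in degrees); βv denotes the angular height of the field of view, a symbol not used in the text.

```python
def in_view_region(theta_obj, h_obj, theta_v, h_v, alpha_v, beta_v):
    """Return True when an object's cylinder coordinates fall inside the
    region of C0 currently covered by the field of view V, whose upper-left
    reference point is (theta_v, h_v). Illustrative sketch only."""
    # Eastward offset from V's left edge, wrapped into [0, 360).
    d_theta = (theta_obj - theta_v) % 360.0
    inside_theta = d_theta <= alpha_v
    # h grows downward (depression positive), origin at the upper edge.
    inside_h = 0.0 <= (h_obj - h_v) <= beta_v
    return inside_theta and inside_h
```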
  • the image extraction unit 314 extracts image information related to the specific object existing in the moving direction of the user U (display unit 10) from the plurality of pieces of image information (AR object) stored in the memory 302.
  • the extraction method is set as appropriate according to the function (type of application) executed by the HMD 100; typically, AR objects and the like of specific objects existing along the road on which the user U is traveling or passing are extracted.
  • the image extraction unit 314 is configured to extract image information according to the user information acquired from the detection unit 20 or the portable information terminal 200. For example, when the user U is stationary, the image extraction unit 314 extracts object data for the entire surroundings corresponding to the orientation of the display unit 10 (the line of sight of the user U); when the user U is moving, it extracts object data along the moving direction.
  • the image extraction unit 314 may also be configured to extract only objects related to specific objects existing in the vicinity of the user U when the moving speed of the user U (display unit 10) is equal to or lower than a predetermined value.
  • the image information extraction method of the image extraction unit 314 is preset according to the type of application executed by the HMD 100 as described above, but may also be selected or set by the user's input operation. A function for disabling the above-described function of the image extraction unit 314 may also be provided. A sketch of this extraction rule is shown below.
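  • The following sketch illustrates the extraction rule just described. The speed threshold, the 90° forward sector, and the object fields `distance` and `bearing_deg` are all assumptions made for the example, not values from the document.

```python
def extract_objects(objects, heading_deg, speed,
                    first_distance=10_000.0, second_distance=1_000.0,
                    speed_threshold=1.0, sector_half_angle=45.0):
    """Select which AR objects to hand to the display control unit:
    all-around data when stationary, moving-direction data when moving,
    and only nearby data (second distance) when moving slowly."""
    if speed <= 0.0:
        # Stationary: keep everything in range; view-direction culling
        # happens later, at drawing time.
        return [o for o in objects if o.distance <= first_distance]

    limit = second_distance if speed <= speed_threshold else first_distance
    kept = []
    for o in objects:
        # Smallest signed angle between the object's bearing and the heading.
        diff = (o.bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= sector_half_angle and o.distance <= limit:
            kept.append(o)
    return kept
```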
  • the display control unit 315 is configured to execute processing for presenting (drawing) the image information (objects) extracted by the image extraction unit 314 in the field of view V, based on the output of the detection unit 20 (that is, the determination result of the coordinate determination unit 313). For example, as shown in FIG. 8, when the current orientation of the field of view V overlaps the display regions of the objects B1 and B2 on the cylindrical coordinates C0, images corresponding to the overlapping regions B10 and B20 are displayed in the field of view V (local rendering).
  • FIGS. 9A and 9B are diagrams for explaining the conversion method from the cylindrical coordinates C0 (world coordinates) to the field of view V (local coordinates).
  • let the coordinates of the reference point of the field of view V on the cylindrical coordinates C0 be (θv, hv), and let the coordinates of the reference point of the object B located within the region of the field of view V be (θ0, h0).
  • the reference point of the field of view V and the object B may be set to any point, and in this example, the reference point is set at the upper left corner of the field of view V and the object B which are rectangular.
  • ⁇ v [°] is the width angle of the visual field V on the world coordinates, and its value is determined by the design or specification of the display unit 10.
  • the display control unit 315 determines the display position of the object B in the field of view V by converting the cylindrical coordinate system (θ, h) into the local coordinate system (x, y). As shown in FIG. 9B, if the height and width of the field of view V in the local coordinate system are Hv and Wv, respectively, and the coordinates of the reference point of the object B in the local coordinate system (x, y) are (x0, y0), the conversion is performed on the basis of these quantities.
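  • The text stops short of spelling out the conversion formulas, so the following sketch assumes a simple linear mapping from angles to pixels; βv, the angular height of the field of view, is an assumed symbol, while αv, Wv, and Hv are as defined above.

```python
def cylindrical_to_local(theta0, h0, theta_v, h_v, alpha_v, beta_v, Wv, Hv):
    """Convert the object B's reference point (theta0, h0) on the cylinder
    to local field-of-view coordinates (x0, y0), with the origin OP2 at the
    upper-left corner of V. Sketch under an assumed linear mapping."""
    # Wrap the azimuth difference into [-180, 180) so objects just left of
    # the view edge get a small negative x rather than a value near 360.
    d_theta = (theta0 - theta_v + 180.0) % 360.0 - 180.0

    x0 = d_theta * Wv / alpha_v    # pixels per degree, horizontally
    y0 = (h0 - h_v) * Hv / beta_v  # pixels per degree, vertically (h grows downward)
    return x0, y0
```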
  • the display control unit 315 moves the object B within the field of view V in the direction opposite to the movement of the display unit 10, following the change in the azimuth or orientation of the display unit 10. That is, the display control unit 315 changes the display position of the object B within the field of view V so as to follow the change in the azimuth or orientation of the display unit 10. This control is continued as long as at least a part of the object B exists in the field of view V.
  • FIG. 10 is a flowchart for explaining an outline of the operation of the HMD system according to the present embodiment.
  • the current position of the user U is measured using the position information acquisition unit 207 of the portable information terminal 200 (step 101).
  • the position information of the display unit 10 is transmitted to the server N.
  • the portable information terminal 200 acquires from the server N the object data, map information, and the like related to specific objects existing in a peripheral region (real space) within a predetermined distance (first distance) from the user U (step 102).
  • the predetermined distance is not particularly limited, and is set, for example, within a radius of several hundred meters to several kilometers centering on the user U.
  • the predetermined distance may be set by default or may be set by a user input operation.
  • the control unit 30 sets the height (H) and radius (R) of the cylindrical coordinates C0 as the world coordinate system according to the type of object data and the like (step 103).
  • the control unit 30 detects the orientation of the field of view V based on the output of the detection unit 20 (step 104), acquires the object data from the portable information terminal 200, and stores it in the memory 302 (step 105).
  • the orientation of the field of view V is converted into the world coordinate system (θ, h), and which position on the cylindrical coordinates C0 it corresponds to is monitored.
  • the control unit 30 extracts the object data to be displayed on the display unit 10 (display region 110, field of view V) from the object data stored in the memory 302 (step 106).
  • when the control unit 30 (image extraction unit 314) determines from the user information acquired from the detection unit 20 or the portable information terminal 200 that the user U is stationary, it extracts object data for the entire surroundings corresponding to the orientation of the display unit 10 (the line of sight of the user U). When the control unit 30 determines that the user U (display unit 10) is moving, it extracts the object data existing in the moving direction.
  • at this time, the control unit 30 may extract objects related to specific objects existing relatively far away in the moving direction and, when the moving speed of the user U is equal to or lower than a predetermined value, extract only object data related to specific objects existing in the vicinity of the user U.
  • subsequently, the control unit 30 displays (draws) the objects at the corresponding positions in the field of view V via the display unit 10 (step 107).
  • the above operation is repeatedly executed. That is, the current position of the user U (display unit 10) is measured again (step 101), and when the current position of the user has changed by, for example, 50 m or more since the object data was last acquired from the server N, new object data corresponding to the current position is transmitted from the server N to the portable information terminal 200 (step 102).
  • the control unit 30 then extracts the object data related to the moving direction of the user U from the new object data transmitted from the portable information terminal 200 and draws it on the display unit 10. Such control is repeatedly executed.
  • note that, as the user moves, object data located more than the predetermined distance (first distance) away from the user may be deleted from the memory 202 of the portable information terminal 200 and the memory 302 of the control unit 30. Thereby, the capacity required of the memories 202 and 302 can be reduced.
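  • One pass of this repeated operation might look as follows; `terminal`, `server`, `control_unit`, and `state` stand in for the portable information terminal 200, the server N, the control unit 30, and the locally held data, and every method name (including the `distance` helper) is an assumption made for the sketch.

```python
def update_cycle(terminal, server, control_unit, state,
                 refetch_threshold=50.0, first_distance=10_000.0):
    """One iteration of steps 101-107: re-measure the position, re-fetch
    object data after moving far enough, evict out-of-range data, redraw."""
    current = terminal.measure_position()  # step 101

    # Step 102: fetch new object data once the user has moved >= 50 m.
    if distance(state.last_fetch_pos, current) >= refetch_threshold:
        state.objects = server.fetch_objects(current, first_distance)
        state.last_fetch_pos = current

    # Memory saving: drop objects now beyond the first distance.
    state.objects = [o for o in state.objects
                     if distance(current, o.position) <= first_distance]

    # Steps 104-107: extract along the moving direction and draw.
    control_unit.draw_moving_direction_objects(state.objects)
```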
  • FIG. 11 is a conceptual diagram showing the positional relationship between the current position of the user U, who is riding a moving body such as an automobile or a motorcycle and traveling north on the road R101 at a predetermined speed, and the specific objects around the user.
  • when the server N acquires the information on the current position from the portable information terminal 200, it transmits the object data and map information related to the specific objects A12 to A22 within a radius of, for example, 10 km to the portable information terminal 200.
  • the portable information terminal 200 transmits the received object data and map information related to the specific objects A12 to A22 to the control unit 30.
  • the control unit 30 extracts the object data related to the specific objects A12, A13, A14, A16, A19, and A22 existing along the road R101 from the received object data, and displays the images (objects) on the display unit 10.
  • further, for example, when the moving speed of the user U exceeds a predetermined value, relatively distant information (objects B19 and B22, within a radius of, for example, 1 km to 10 km) may also be presented on the display unit 10. By changing the range of displayed information according to the moving speed of the user in this way, it is possible to support both the need to see only nearby information when traveling at low speed and the need to check distant information quickly when traveling at high speed.
  • further, in a navigation mode in which a route to a destination is set, the user's travel route can be estimated from the set route. Therefore, in addition to objects related to the road currently being traveled, objects related to roads along the set route can also be extracted and presented on the display unit 10.
  • alternatively, the control unit may be configured to present an image including map information and image information shown as an overhead view or a bird's-eye view, as shown in the figure. Thereby, a list of objects related to the surrounding specific objects can be presented to the user.
  • the control unit 30 (display control unit 315) in the HMD of the present embodiment is configured to acquire the user's movement acceleration as user information and to change the display mode of the image information presented in the display area according to the magnitude of the movement acceleration.
  • FIG. 13 is a flowchart showing an example of the operation of the control unit 30.
  • the control unit 30 acquires from the portable information terminal 200 the object data, map information, and moving speed information for the peripheral region based on the user's current position, and from these pieces of information extracts the object data along the road on which the user is currently traveling (steps 201 to 205).
  • subsequently, the control unit 30 determines the display mode of the extracted objects based on the user's movement acceleration information acquired from the detection unit 20, and displays them on the display unit 10 (steps 206 and 207).
  • in the present embodiment, the acceleration of the display unit 10 detected by the acceleration sensor built into the detection unit 20 is treated as the user's movement acceleration.
  • the HMD of the present embodiment is configured to intentionally reduce the visibility of the displayed object so that the user's attention is directed to the field of view of the real space instead of the object.
  • specifically, the control unit 30 changes at least one of the luminance (brightness), color, contrast, position, size, amount of information, and the like of an object according to the magnitude of the user's movement acceleration, so that the user's attention is directed to the real-space environment.
  • for example, the greater the acceleration, the lower the visibility of the object is made by reducing its display luminance, changing it to a lighter color, or reducing its contrast.
  • alternatively, a clear front view of the real space is secured by moving the position of the object from the center toward the periphery of the field of view, reducing its size, or reducing its amount of information.
  • in the present embodiment, the control unit 30 presents an object in the display region 110 at a first luminance when the user's movement acceleration is equal to or less than a predetermined value, and at a second luminance lower than the first luminance when the movement acceleration exceeds the predetermined value.
  • here, the user's movement acceleration is typically the magnitude of the combined vector of the accelerations along the three axial directions of the X-axis, Y-axis, and Z-axis (see FIG. 1). Since gravitational acceleration always acts in the Z-axis direction, the movement acceleration exceeds 1 G except when the user is stationary or moving at a constant speed. Therefore, by setting the predetermined value to 1 G, the movement of the user can be reliably detected.
  • note that the luminance (brightness) of the object is not limited to the two levels of the first luminance and the second luminance; the second luminance may be gradually decreased as the acceleration increases.
  • the aspect in which the second luminance is decreased is not particularly limited.
  • the second luminance may be decreased linearly as the acceleration increases, or may be decreased stepwise.
  • in this example, the luminance becomes zero at an acceleration of 2 G; however, the present technology is not limited to this, and the luminance may reach zero at an acceleration exceeding 2 G.
  • alternatively, the display luminance of the object may be lowered according to the temporal change of the acceleration; for example, when the rate of change of the acceleration is equal to or higher than a predetermined value, the display luminance may be temporarily reduced.
  • when the acceleration decreases, the display luminance of the object may be recovered so as to follow the decrease, or may be recovered after a certain time has elapsed.
  • furthermore, a low-pass filter may be interposed in the output of the acceleration sensor in order to eliminate the influence of vibrations during vehicle travel, road gradients, and the like.
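  • Putting these pieces together, a sketch of the dimming rule might look as follows; the linear fade between 1 G and 2 G matches the example described above, while the smoothing constant of the low-pass filter is an assumption.

```python
G = 9.80665  # standard gravity [m/s^2]

class DimmingFilter:
    """Maps the combined 3-axis acceleration to an object luminance:
    full (first) luminance up to 1 G, linear fade to zero at 2 G,
    with a simple low-pass on the raw sensor output. Sketch only."""

    def __init__(self, first_luminance=1.0, alpha=0.1):
        self.first_luminance = first_luminance
        self.alpha = alpha   # low-pass smoothing factor (assumed value)
        self.filtered = G    # at rest, only gravity acts on the sensor

    def luminance(self, ax, ay, az):
        # Magnitude of the combined acceleration vector; gravity is always
        # included, so the value exceeds 1 G whenever the user accelerates.
        raw = (ax * ax + ay * ay + az * az) ** 0.5
        self.filtered += self.alpha * (raw - self.filtered)  # low-pass

        g_units = self.filtered / G
        if g_units <= 1.0:
            return self.first_luminance  # at or below 1 G: no dimming
        # Linear fade from full luminance at 1 G down to zero at 2 G.
        return self.first_luminance * max(0.0, 2.0 - g_units)
```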
  • the dimming control described above may be executed as part of the function of the display control unit 315, or a dimming unit 306 provided separately in the control unit 30 may be used in combination, as shown in the figure.
  • the dimming unit 306 includes a manual dimming unit 41, an automatic dimming unit 42, a changeover switch 43, and a dimming correction unit 44.
  • the manual dimming unit 41 is a part where the user can arbitrarily set the luminance of the object, and is operated via the input operation unit 305 (see FIG. 5), for example.
  • based on the output of the illuminance sensor 21 built into the detection unit 20, the automatic dimming unit 42 outputs a luminance value preset according to that output.
  • the change-over switch 43 is for switching between manual adjustment and automatic adjustment.
  • in the case of manual adjustment, the output of the manual dimming unit 41 is input to the dimming correction unit 44; in the case of automatic adjustment, the output of the automatic dimming unit 42 is input to the dimming correction unit 44.
  • based on the output of the acceleration sensor 22 built into the detection unit 20, the dimming correction unit 44 corrects the luminance value downward in the manner shown in FIG. 14 and outputs the result to the CPU 301 (display control unit 315).
  • FIG. 16A is a schematic diagram of a visual field V showing an example of a change in the position and size of an object.
  • FIG. 16B is a schematic diagram of the field of view V illustrating an example of a change in the position and information amount of the object B30.
  • when the acceleration exceeds 1 G, a wide front field of view can be secured by moving the object B30 to one corner or the lower part of the field of view V, or by displaying only a part of the object.
  • at the same time, the display luminance of the object may be reduced as described above.
  • in the present embodiment, the driving state and driving time of a vehicle are acquired as user information, and predetermined information is presented on the display unit 10 based on such information.
  • the control unit 30 is configured to be able to estimate the behavior of the vehicle from the acceleration information (user information) acquired by the detection unit 20. For example, as shown in FIG. 17, when the user is riding a motorcycle, the control unit 30 detects the centrifugal force acting on the display unit 10 from the acceleration information (for example, lateral acceleration) acquired by the detection unit 20, and estimates the bank angle of the vehicle body during turning and the like from that value.
  • the control unit 30 is configured to present warning information for notifying the user of the risk of falling on the display unit 10 when the bank angle is equal to or greater than a predetermined value.
  • the warning information may be character information, graphic information, or a combination thereof.
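  • A sketch of this estimation follows; the coordinated-turn model (tan(bank) = lateral acceleration / g) and the 45° warning threshold are assumptions made for the example, not values from the document.

```python
import math

G = 9.80665  # standard gravity [m/s^2]

def bank_warning(lateral_accel, warn_angle_deg=45.0):
    """Estimate the vehicle body's bank angle from the lateral (centrifugal)
    acceleration measured at the display unit, and report whether the
    fall-risk warning should be shown. Illustrative sketch only."""
    bank_deg = math.degrees(math.atan2(abs(lateral_accel), G))
    return bank_deg, bank_deg >= warn_angle_deg

# Example: 12 m/s^2 of lateral acceleration gives a bank of about 51 deg,
# which exceeds the assumed threshold and triggers the warning.
angle, warn = bank_warning(12.0)
```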
  • further, the remaining amount of fuel in the vehicle may be detected, and when it falls below a predetermined level, information indicating that fact may be presented to the user.
  • for the presentation of this information, characters and figures may be used as described above, or the position of a nearby gas station may be displayed as an AR object. This example can provide useful information not only to general vehicle users but also to users of motorcycles not equipped with a fuel gauge.
  • in another embodiment, the control unit 30 is configured to estimate the degree of fatigue of the user driving the vehicle from the travel time information (user information) acquired from the portable information terminal 200 (position information acquisition unit 207), and to present information on rest points such as service areas and convenience stores on the display unit 10. In this example, as described in the first embodiment, information related to rest points existing along the road being traveled is presented, for example. Thereby, information useful to the user can be provided at a timely moment.
  • alternatively, the control unit 30 may estimate the user's destination from the moving direction acquired from the portable information terminal 200 (position information acquisition unit 207) and provide information related to the destination, for example, station departure times and train operation information, or information on the weather at the destination. Even with such a configuration, useful information can be provided to the user at a timely moment.
  • further, the user's past action history may be referenced as the user information; in this case, the action history is stored in the memory 302 of the control unit 30 or the memory 202 of the portable information terminal 200.
  • thereby, the estimation accuracy of the user's behavior or its purpose can be improved: for example, the behavior can be estimated from the movement start time, the moving speed, and the like, and the destination from the travel route, so that information corresponding to the behavior or the destination can be provided to the user.
  • in the embodiments described above, the example in which the present technology is applied to an HMD has been described, but the present technology can also be applied to other image display devices, for example, a head-up display mounted in the driver's seat of a vehicle or the cockpit of an aircraft.
  • in the embodiments described above, a see-through (transmissive) HMD has been described, but the present technology can also be applied to a non-transmissive HMD. In this case, a predetermined object according to the present technology may be displayed in an external field of view captured by a camera attached to the display unit.
  • further, the wearable display has been described using the HMD worn on the user's head as an example, but the present technology is not limited to this; it can also be applied to, for example, a display device worn on the user's arm, wrist, or the like, or to a display device attached directly to the eyeball, such as a contact lens.
  • the present technology may also have the following configurations.
  • (1) a wearable display comprising: a display unit configured to be wearable by a user and having a display area that provides the user with a field of view of real space; and a control unit configured to extract, on the basis of user information including information related to the user's position and behavior and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user, image information related to a specific object existing in the moving direction of the user from the plurality of pieces of image information, and to present the extracted image information in the display area.
  • (2) the wearable display according to (1) above, wherein the user information includes map information around the user's position, and the control unit extracts, from the plurality of pieces of image information, image information related to a specific object existing along a road on which the user travels.
  • (3) the wearable display described above, wherein the control unit acquires the moving speed of the user as the user information and, when the moving speed is equal to or less than a predetermined value, presents in the display area image information related to a specific object existing within a second distance shorter than the first distance from the user.
  • (4) the wearable display described above, wherein the control unit is configured to be capable of presenting an image including the map information and the image information in the display area.
  • (5) the wearable display according to any one of (1) to (4) above, wherein the control unit acquires the user's movement acceleration as the user information and changes the display mode of the image information presented in the display area according to the magnitude of the movement acceleration.
  • (6) the wearable display described above, wherein the control unit presents the image information in the display area at a first luminance when the user's movement acceleration is equal to or less than a predetermined value, and at a second luminance lower than the first luminance when the movement acceleration exceeds the predetermined value.
  • (7) the wearable display according to (6) above, wherein the control unit gradually decreases the second luminance as the user's movement acceleration increases.
  • (8) the wearable display according to any one of (5) to (7) above, wherein the control unit changes at least one of the position, size, and amount of information of the image information presented in the display area according to the magnitude of the acceleration.
  • (9) the wearable display described above, further comprising a detection unit capable of detecting an acceleration acting on the display unit as the user information.
  • (10) an image display device comprising: a display unit having a display area; and a control unit configured to extract, on the basis of user information including information related to a user's position and behavior and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user, image information related to a specific object existing in the moving direction of the user from the plurality of pieces of image information, and to present the extracted image information in the display area.
  • (11) an image display system comprising: a display unit having a display area; an acquisition unit that acquires user information including information related to a user's position and behavior, and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user; and a control unit that extracts image information related to a specific object existing in the moving direction of the user from the plurality of pieces of image information and presents the extracted image information in the display area.
  • Reference Signs List: 10 ... display unit; 20 ... detection unit; 30 ... control unit; 100 ... head mounted display (HMD); 110 ... display area; 200 ... portable information terminal; 315 ... display control unit; A1 to A4, A12 to A22 ... specific objects; B1 to B4, B12 to B14, B16, B19, B22 ... objects; V ... field of view

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A wearable display according to an embodiment of the present technology includes a display unit and a control unit. The display unit is configured to be wearable by a user and has a display region that provides the user with a view of real space. The control unit extracts, on the basis of user information including information associated with the position and action of the user and a plurality of pieces of image information associated with a plurality of specific objects existing in a peripheral region within a first distance from the user, the image information associated with a specific object existing in the direction in which the user moves from the plurality of pieces of image information, and displays the extracted image information in the display region. According to the present technology, information expected to be useful to a user can be preferentially displayed.

Description

Wearable display, image display device, and image display system
The present technology relates to a wearable display, an image display device, and an image display system capable of displaying an image including specific information in a display field of view.
A technology called augmented reality (AR), which adds a corresponding image to a real space or to a screen displaying the real space, is known. For example, Patent Document 1 describes a see-through type head mounted display (HMD: Head Mounted Display) that can superimpose and display an image (AR object) associated with an object existing in real space.
International Publication No. WO 2014/128810
 In this type of head-mounted display, AR objects related to objects existing around the user are displayed, so that not only information useful to the user but also unnecessary information may be presented.
 For example, while a navigation mode that provides the user with route information to a destination is being executed, not only information related to the road on which the user is currently traveling but also information related to other nearby roads may be displayed. Such information related to other roads is often unnecessary for the user and forces the user to judge to which road the displayed information relates, which may cause confusion or mistakes.
 In view of the circumstances described above, an object of the present technology is to provide a wearable display, an image display device, and an image display system capable of preferentially presenting information presumed to be useful to the user.
 本技術の一形態に係るウェアラブルディスプレイは、表示部と、制御ユニットとを具備する。
 上記表示部は、ユーザに装着可能に構成され、ユーザに実空間の視野を提供する表示領域を有する。
 上記制御ユニットは、上記ユーザの位置及び行動に関連する情報を含むユーザ情報と、上記ユーザから第1の距離以内の周辺領域に存在する複数の特定対象物に関連する複数の画像情報とに基づき、上記複数の画像情報から上記ユーザの移動方向に存在する特定対象物に関連する画像情報を抽出し、抽出された上記画像情報を上記表示領域に提示するように構成される。
A wearable display according to an embodiment of the present technology includes a display unit and a control unit.
The display unit is configured to be attachable to the user, and includes a display area that provides the user with a real field of view.
The control unit is based on user information including information related to the position and behavior of the user and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user. The image information related to the specific object existing in the moving direction of the user is extracted from the plurality of image information, and the extracted image information is presented in the display area.
 Since the wearable display is configured to extract image information related to a specific object existing in the moving direction of the user from the plurality of pieces of image information and present it in the field of view, information presumed to be useful to the user can be preferentially presented.
 The user information may include map information around the position of the user, and the control unit may be configured to extract, from the plurality of pieces of image information, image information related to a specific object existing along a road on which the user travels.
 This makes it possible to present to the user only information related to the road on which the user is currently traveling or passing.
 The control unit may acquire the moving speed of the user as the user information and, when the moving speed is equal to or lower than a predetermined value, present in the display area image information related to a specific object existing within a second distance from the user that is shorter than the first distance.
 In this way, the display range of the images may be changed according to the moving speed of the user.
 The control unit may be configured to be able to present an image including the map information and the image information in the display area.
 In this case, the map information may be displayed as a bird's-eye or overhead view including the image information.
 The control unit may be configured to acquire the movement acceleration of the user as the user information and to change the display mode of the image information presented in the display area according to the magnitude of the movement acceleration.
 When the user is driving an automobile or a motorcycle, the user's attention needs to be concentrated on driving under certain conditions such as acceleration and deceleration. In this case, with the above configuration of the control unit, it is possible to intentionally reduce the visibility of the image information presented in the display area so that the user's attention is directed to the real-space field of view rather than to the image.
 For example, the control unit presents the image information at a first luminance in the display area when the movement acceleration of the user is equal to or less than a predetermined value, and presents the image information at a second luminance lower than the first luminance when the movement acceleration of the user exceeds the predetermined value. In this case, the control unit may gradually decrease the second luminance as the movement acceleration of the user increases.
 Alternatively, the control unit may be configured to change at least one of the position, size, and amount of information of the image information presented in the display area according to the magnitude of the acceleration.
 The wearable display may further include a detection unit capable of detecting, as the user information, an acceleration acting on the display unit. The detection unit may be configured as a motion sensor such as an acceleration sensor or an angular velocity sensor, or, for example, as a camera installed on the display unit.
 An image display device according to an embodiment of the present technology includes a display unit and a control unit.
 The display unit has a display area.
 The control unit is configured to extract, on the basis of user information including information related to a user's position and behavior and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral area within a first distance from the user, image information related to a specific object existing in the moving direction of the user from the plurality of pieces of image information, and to present the extracted image information in the display area.
 An image display system according to an embodiment of the present technology includes a display unit, an acquisition unit, and a control unit.
 The display unit has a display area.
 The acquisition unit acquires user information including information related to a user's position and behavior, and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral area within a first distance from the user.
 The control unit extracts image information related to a specific object existing in the moving direction of the user from the plurality of pieces of image information and presents the extracted image information in the display area.
 As described above, according to the present technology, information presumed to be useful to the user can be preferentially presented.
 Note that the effects described here are not necessarily limited, and may be any of the effects described in the present disclosure.
[Brief Description of Drawings]
  • FIG. 1 is a schematic diagram explaining the functions of a wearable display (HMD) according to an embodiment of the present technology.
  • FIG. 2 is a diagram explaining an example of the functions of the HMD, and is a schematic view of the field of view presented on the display unit.
  • FIG. 3 is a diagram explaining an example of the functions of the HMD, and is a schematic view of the field of view presented on the display unit.
  • FIG. 4 is an overall view of a system including the HMD.
  • FIG. 5 is a block diagram showing the configuration of the system.
  • FIG. 6 is a functional block diagram of the control unit in the HMD.
  • FIG. 7 is an explanatory diagram of coordinate positions in the cylindrical coordinate system in the HMD.
  • FIG. 8 is a development view of the cylindrical coordinates conceptually showing the relationship between the field of view and objects.
  • FIGS. 9A and 9B are diagrams explaining a method of converting cylindrical coordinates (world coordinates) into the field of view (local coordinates).
  • FIG. 10 is a flowchart explaining an outline of the operation of the system.
  • FIG. 11 is a diagram explaining an operation of the HMD.
  • FIGS. 12A and 12B are schematic views of the field of view explaining an operation of the HMD.
  • The remaining drawings are: a flowchart explaining the operation of a wearable display (HMD) according to a second embodiment of the present technology; a diagram explaining an operation of that HMD; a block diagram showing a configuration example of part of that HMD; two schematic views of the field of view explaining an operation of that HMD; a diagram explaining an application example of a wearable display (HMD) according to a third embodiment of the present technology; and two diagrams explaining other application examples of that HMD.
 Hereinafter, embodiments of the present technology will be described with reference to the drawings.
<First Embodiment>
[Schematic configuration of HMD system]
 FIG. 1 is a schematic diagram explaining the functions of a head-mounted display (hereinafter referred to as "HMD") as a wearable display according to an embodiment of the present technology.
 In FIG. 1, the X-axis and Y-axis directions indicate horizontal directions orthogonal to each other, and the Z-axis direction indicates the vertical axis direction. This XYZ orthogonal coordinate system represents the coordinate system of the real space to which the user belongs (a real three-dimensional coordinate system); the X-axis arrow indicates north, the Y-axis arrow indicates east, and the Z-axis arrow indicates the direction of gravity.
 First, an outline of the basic functions of the HMD 100 according to the present embodiment will be described with reference to FIG. 1.
[Overview of HMD functions]
 The HMD 100 of the present embodiment is worn on the head of a user U and is configured to be able to display virtual images (AR objects; hereinafter also simply referred to as objects) in the real-space field of view V (display field of view) of the user U. The objects displayed in the field of view V include information related to specific objects (A1, A2, A3, A4, ...; hereinafter collectively referred to as specific objects A unless individually described) existing in the field of view V, as well as information not related to these specific objects A.
 More specifically, the specific objects A correspond to, for example, scenery, stores, and products existing around the user U, as well as transportation terminals such as bus stops, stations, and airports. As information related to a specific object, as schematically shown in FIG. 2, an object B10 informing that a specific coupon can be used at a specific store A10 in the field of view V is displayed.
 On the other hand, information other than that related to the specific objects A includes, for example, information related to the route to a destination set by the user. As schematically shown in FIG. 3, an object B20 composed of "arrows" or the like indicating the traveling direction of a road or passage is displayed in relation to the orientation of the display unit 10. Other examples include a menu screen for setting functions of the HMD 100, a pattern authentication screen, and the like.
 Referring to FIG. 1, the HMD 100 stores in advance objects (B1, B2, B3, B4, ...; hereinafter collectively referred to as objects B unless individually described) associated with a virtual world coordinate system surrounding the user U wearing the HMD. The world coordinate system is a coordinate system equivalent to the real space to which the user belongs, and determines the position of a specific object A with reference to the position of the user U and a predetermined axial direction. In the present embodiment, cylindrical coordinates C0 centered on the vertical axis are adopted as the world coordinates, but other three-dimensional coordinates, such as celestial coordinates centered on the user U, may also be adopted.
 The radius R and the height H of the cylindrical coordinates C0 can be set arbitrarily. Here, the radius R is set shorter than the distance from the user U to the specific objects A, but may be longer than that distance. The height H is set to a size equal to or greater than the height (vertical length) Hv of the field of view V of the user U provided via the HMD 100.
 As described above, the objects B include information related to the specific objects A existing in the world coordinate system, or information not related to the specific objects A. An object B may be an image including characters, pictures, or the like, or may be an animation image. The object B may be a two-dimensional image or a three-dimensional image. Furthermore, the shape of an object B may be a rectangle, a circle, or any other arbitrary or meaningful geometric shape, and can be set appropriately according to the type (attribute) and display content of the object B.
 The coordinate position of each object B on the cylindrical coordinates C0 is associated with, for example, the intersection of the cylindrical coordinates C0 with the line of sight L of the user gazing at the specific object A. In the example of FIG. 1, the center position of each of the objects B1 to B4 is made to coincide with the corresponding intersection position, but the present technology is not limited to this; part of the periphery of each object B1 to B4 (for example, part of its four corners) may be made to coincide with the intersection position. Alternatively, the coordinate positions of the objects B1 to B4 may be associated with arbitrary positions away from the intersection positions.
 The cylindrical coordinates C0 have a circumferential coordinate axis (θ) representing the angle around the vertical axis with north as 0°, and a height-direction coordinate axis (h) representing the vertical angle with reference to the horizontal line of sight Lh of the user U. The coordinate axis (θ) takes eastward as the positive direction, and the coordinate axis (h) takes depression angles as positive and elevation angles as negative.
 As will be described later, the HMD 100 has a detection unit for detecting the viewpoint direction of the user U, and determines, based on the output of the detection unit, which region on the cylindrical coordinates C0 the field of view V of the user U corresponds to. When any object (for example, object B1) exists in the corresponding region of the xy coordinate system forming the field of view V, the HMD 100 presents (draws) that object B1 in the corresponding region of the field of view V.
 As described above, the HMD 100 of the present embodiment displays the AR object (B1) in the field of view V superimposed on the specific object A1 in the real space, thereby presenting the user U with information related to the object A1. The HMD 100 also presents the user U with AR objects (B1 to B4) related to the predetermined specific objects A1 to A4 according to the orientation or direction of the viewpoint of the user U.
 Next, the HMD 100 will be described in detail. FIG. 4 is an overall view showing the HMD 100, and FIG. 5 is a block diagram showing its configuration.
[Configuration of HMD]
 The HMD 100 includes a display unit 10, a detection unit 20 that detects the attitude of the display unit 10, and a control unit 30 that controls driving of the display unit 10. In the present embodiment, the HMD 100 is configured as a see-through HMD capable of providing the user with a field of view V of real space.
 (Display unit)
 The display unit 10 is configured to be wearable on the head of the user U. The display unit 10 includes first and second display surfaces 11R and 11L, first and second image generation units 12R and 12L, and a support 13.
 The first and second display surfaces 11R and 11L are composed of optical elements having light-transmissive display areas 110 capable of providing a real-space field of view (external field of view) to the right eye and left eye of the user U, respectively. The first and second image generation units 12R and 12L are configured to be able to generate images presented to the user U via the first and second display surfaces 11R and 11L, respectively. The support 13 supports the display surfaces 11R, 11L and the image generation units 12R, 12L, and has an appropriate shape that allows it to be worn on the user's head such that the first and second display surfaces 11R and 11L face the right and left eyes of the user U, respectively.
 The display unit 10 configured as described above is capable of providing the user U with a field of view V in which a predetermined image (or virtual image) is superimposed on the real space via the display surfaces 11R and 11L. In this case, cylindrical coordinates C0 for the right eye and cylindrical coordinates C0 for the left eye are set, and the objects drawn on the respective cylindrical coordinates are projected onto the display areas 110 of the display surfaces 11R and 11L.
 (Detection unit)
 The detection unit 20 is configured to be able to detect changes in the orientation or attitude of the display unit 10 around at least one axis. In the present embodiment, the detection unit 20 is configured to detect changes in the orientation or attitude of the display unit 10 around each of the X, Y, and Z axes.
 Here, the orientation of the display unit 10 typically means the front direction of the display unit 10. In the present embodiment, the orientation of the display unit 10 is defined as the direction of the user U's face.
 The detection unit 20 includes motion sensors, such as an acceleration sensor and an angular velocity sensor, capable of detecting the accelerations acting on the display unit 10 in the above three axial directions, and an illuminance sensor or the like capable of detecting the illuminance (brightness) around the display unit 10. The detection unit 20 may also incorporate an image sensor (camera) and may thereby be configured to detect the acceleration and moving speed acting on the display unit 10, the illuminance, and so on.
 The detection unit 20 may be configured as a sensor unit in which an angular velocity sensor and an acceleration sensor are each arranged in the three axial directions, or different sensors may be used for each axis. For example, the integrated value of the output of the angular velocity sensor can be used for the attitude change of the display unit 10, the direction of the change, the amount of the change, and so on.
 A geomagnetic sensor may be employed to detect the orientation of the display unit 10 around the vertical axis (Z axis). Alternatively, the geomagnetic sensor may be combined with the above motion sensors. This enables highly accurate detection of changes in the orientation or attitude of the display unit 10.
 The detection unit 20 is arranged at an appropriate position on the display unit 10. The position of the detection unit 20 is not particularly limited; for example, it is arranged on one of the image generation units 12R and 12L or on part of the support 13. The number of detection units 20 is not limited to one, and a plurality of detection units 20 may be arranged at appropriate positions on the display unit 10.
 (Control unit)
 The control unit 30 generates control signals for controlling the driving of the display unit 10 (image generation units 12R and 12L) based on the output of the detection unit 20. In the present embodiment, the control unit 30 is electrically connected to the display unit 10 via a connection cable 30a. Of course, the present technology is not limited to this, and the control unit 30 may be connected to the display unit 10 via a wireless communication line.
 As shown in FIG. 5, the control unit 30 includes a CPU 301, a memory 302 (storage unit), a transmission/reception unit 303, an internal power supply 304, and an input operation unit 305.
 The CPU 301 controls the overall operation of the HMD 100. The memory 302 includes a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, and stores programs for the CPU 301 to execute control of the HMD 100, various parameters, images (objects) to be displayed on the display unit 10, and other necessary data. The transmission/reception unit 303 constitutes an interface for communication with the portable information terminal 200 described later. The internal power supply 304 supplies the power necessary for driving the HMD 100.
 The input operation unit 305 is for controlling, by user operation, the images displayed on the display unit 10. The input operation unit 305 may be configured as a mechanical switch or as a touch sensor, and may be provided on the display unit 10.
 The HMD 100 may further include a sound output unit such as a speaker, a camera, and the like. In this case, the sound output unit and the camera are typically provided on the display unit 10. Furthermore, the control unit 30 may be provided with a display device that displays an input operation screen or the like for the display unit 10. In this case, the input operation unit 305 may be configured as a touch panel provided on that display device.
 (Portable information terminal)
 The portable information terminal 200 is configured to be able to communicate with the control unit 30 via a wireless communication line. The portable information terminal 200 has a function (acquisition unit) of acquiring images (objects) to be displayed on the display unit 10 and a function of transmitting the acquired images (objects) to the control unit 30. The portable information terminal 200 is organically combined with the HMD 100 to construct an HMD system (image display system).
 The portable information terminal 200 is carried by the user U wearing the display unit 10 and is configured as an information processing device such as a personal computer (PC), a smartphone, a mobile phone, a tablet PC, or a PDA (Personal Digital Assistant), but may also be a terminal device dedicated to the HMD 100.
 As shown in FIG. 5, the portable information terminal 200 includes a CPU 201, a memory 202, a transmission/reception unit 203, an internal power supply 204, a display unit 205, a camera 206, and a position information acquisition unit 207.
 The CPU 201 controls the overall operation of the portable information terminal 200. The memory 202 includes a ROM, a RAM, and the like, and stores programs for the CPU 201 to execute control of the portable information terminal 200, various parameters, images (objects) to be transmitted to the control unit 30, and other necessary data. The internal power supply 204 supplies the power necessary for driving the portable information terminal 200.
 The transmission/reception unit 203 communicates with the server N, the control unit 30, other nearby portable information terminals, and the like using a wireless LAN (IEEE 802.11 or the like) such as WiFi (Wireless Fidelity), or a 3G or 4G mobile communication network. The portable information terminal 200 downloads, from the server N via the transmission/reception unit 203, images (objects) to be transmitted to the control unit 30, applications for displaying them, and the like, and stores them in the memory 202.
 The display unit 205 is composed of, for example, an LCD or an OLED, and displays various menus, application GUIs, and the like. Typically, the display unit 205 is integrated with a touch panel and can accept the user's touch operations. The portable information terminal 200 is configured to be able to input predetermined operation signals to the control unit 30 by touch operations on the display unit 205.
 The position information acquisition unit 207 typically includes a GPS (Global Positioning System) receiver. The portable information terminal 200 is configured to be able to measure the current position (longitude, latitude, altitude) of the user U (display unit 10) using the position information acquisition unit 207, calculate the moving speed of the user U (display unit 10), and acquire necessary images (objects) and map information such as road maps from the server N. That is, the server N acquires information on the user's current position and behavior (user information) and transmits image data, application software, map information, and the like corresponding to that information to the portable information terminal 200.
 The server N is typically configured as a computer including a CPU, a memory, and the like, and transmits predetermined information to the portable information terminal 200 in response to a request from the user U, or automatically regardless of the user U's intention. The server N stores multiple types of image data that the HMD 100 can display, and is configured to be able to transmit, as part of the predetermined information, a plurality of pieces of image data selected according to the position, operation, and the like of the user U to the portable information terminal 200 collectively or sequentially.
 More specifically, in the present embodiment, the server N transmits to the portable information terminal 200 image information on AR objects related to a plurality of specific objects existing in a peripheral area within a predetermined distance from the current position of the user U (display unit 10). The predetermined distance may be fixed or variable; typically, the portable information terminal 200 indicates the predetermined distance to the server N, either spontaneously or based on an instruction from the control unit 30.
 (Details of the control unit)
 Next, the control unit 30 will be described in detail.
 FIG. 6 is a functional block diagram of the CPU 301. The CPU 301 includes a coordinate setting unit 311, an image management unit 312, a coordinate determination unit 313, an image extraction unit 314, and a display control unit 315, and executes the processing of these units in accordance with a program stored in the memory 302.
 The coordinate setting unit 311 is configured to execute processing for setting three-dimensional coordinates surrounding the user U (display unit 10). In this example, cylindrical coordinates C0 (see FIG. 1) centered on a vertical axis Az are used as the three-dimensional coordinates. The coordinate setting unit 311 sets the radius R and the height H of the cylindrical coordinates C0, typically according to the number and types of objects to be presented to the user U.
 The radius R of the cylindrical coordinates C0 may be a fixed value, or may be a variable value that can be set arbitrarily according to the size (pixel size) of the images to be displayed, and so on. The height H of the cylindrical coordinates C0 is set to, for example, not less than one and not more than three times the vertical height Hv (see FIG. 1) of the field of view V provided to the user U by the display unit 10. The upper limit of the height H is not limited to three times Hv and may exceed three times Hv.
 As described above, the cylindrical coordinates C0 have a circumferential coordinate axis (θ) representing the angle around the vertical axis with north as 0°, and a height-direction coordinate axis (h) representing the vertical angle with reference to the horizontal line of sight Lh of the user U. The coordinate axis (θ) takes eastward as the positive direction, and the coordinate axis (h) takes depression angles as positive and elevation angles as negative. The height h represents a size relative to the height Hv of the field of view V taken as 100%, and the origin OP1 of the cylindrical coordinates C0 is set at the intersection of the north direction (0°) and the horizontal line of sight Lh of the user U (h = 0%).
 The image management unit 312 has a function of managing the images stored in the memory 302; for example, it is configured to execute processing for storing in the memory 302 one or more images to be displayed via the display unit 10 and for selectively deleting images stored in the memory 302. The images stored in the memory 302 are transmitted from the portable information terminal 200. The image management unit 312 also requests the portable information terminal 200 to transmit images via the transmission/reception unit 303.
 The memory 302 is configured to be able to store one or more images (objects) to be displayed in the field of view V in association with the cylindrical coordinates C0. That is, the memory 302 stores the individual objects B1 to B4 on the cylindrical coordinates C0 shown in FIG. 1 together with their coordinate positions on the cylindrical coordinates C0.
 As shown in FIG. 7, the cylindrical coordinate system (θ, h) and the orthogonal coordinate system (X, Y, Z) have the relationship X = r cos θ, Y = r sin θ, Z = h. As shown in FIG. 1, the individual objects B1 to B4 to be displayed according to the orientation or attitude of the field of view V each occupy their own coordinate regions on the cylindrical coordinates C0 and are stored in the memory 302 together with specific coordinate positions P(θ, h) within those regions.
 The coordinates (θ, h) of the objects B1 to B4 on the cylindrical coordinates C0 are associated with the coordinates, in the cylindrical coordinate system, of the intersections between the cylindrical surface of the cylindrical coordinates C0 and the straight lines connecting the user's position with the positions of the specific objects A1 to A4, each defined in the orthogonal coordinate system (X, Y, Z). That is, the coordinates of the objects B1 to B4 correspond to the coordinates of the specific objects A1 to A4 converted from the real three-dimensional coordinates to the cylindrical coordinates C0. Such coordinate conversion of objects is executed, for example, in the image management unit 312, and each object is stored in the memory 302 together with its coordinate position. Adopting the cylindrical coordinates C0 as the world coordinate system makes it possible to draw the objects B1 to B4 in a plane.
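 As a rough illustration of this real-space-to-cylinder conversion (not part of the patent text; the function name and conventions below are assumptions for illustration), the following Python sketch maps a specific object's offset from the user onto (θ, h), taking θ from north with east positive and h as the height at which the gaze line crosses the cylinder of radius R, with depression angles positive:

```python
import math

def to_cylindrical(north, east, down, radius_r):
    """Map a target offset from the user (north, east, down, in meters)
    onto cylindrical coordinates C0 of radius R.

    Returns (theta_deg, h): theta is measured clockwise from north
    (east = +90 deg); h is the height at which the gaze line crosses
    the cylinder wall, positive downward (depression angles positive).
    """
    # Angle around the vertical axis: north is 0 deg, east is positive.
    theta_deg = math.degrees(math.atan2(east, north)) % 360.0

    # Horizontal distance from the user to the target.
    d = math.hypot(north, east)

    # Similar triangles: the gaze line from the origin to the target
    # crosses the cylinder wall (at distance R) at height down * R / d.
    h = down * radius_r / d
    return theta_deg, h

# Example: an object 400 m north and 300 m east, 10 m below eye level,
# projected onto a cylinder of radius 100 m -> (36.87 deg, 2.0).
print(to_cylindrical(400.0, 300.0, 10.0, 100.0))
```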
 The coordinate position of each of the objects B1 to B4 may be set at any position within its display region; it may be set at one specific point per object (for example, its center position), or at two or more points (for example, two diagonal points or the four corner points).
 Further, as shown in FIG. 1, when the coordinate positions of the objects B1 to B4 are associated with the intersections of the cylindrical coordinates C0 with the line of sight L of the user gazing at the objects A1 to A4, the user U visually recognizes the objects B1 to B4 at positions overlapping the objects A1 to A4. Instead, the coordinate positions of the objects B1 to B4 may be associated with arbitrary positions away from those intersections. This makes it possible to display or draw the objects B1 to B4 at desired positions relative to the objects A1 to A4.
 The coordinate determination unit 313 is configured to execute processing for determining, based on the output of the detection unit 20, which region on the cylindrical coordinates C0 the field of view V of the user U corresponds to. That is, the field of view V moves on the cylindrical coordinates C0 as the attitude of the user U (display unit 10) changes, and the direction and amount of that movement are calculated based on the output of the detection unit 20. The coordinate determination unit 313 calculates the movement direction and movement amount of the display unit 10 based on the output of the detection unit 20 and determines to which region on the cylindrical coordinates C0 the field of view V belongs.
 FIG. 8 is a development view of the cylindrical coordinates C0 conceptually showing the relationship between the field of view V and the objects B1 to B4 on the cylindrical coordinates C0. The field of view V is substantially rectangular and has xy coordinates (local coordinates) with its upper-left corner as the origin OP2. The x axis extends horizontally from the origin OP2, and the y axis extends vertically from the origin OP2. The coordinate determination unit 313 is configured to execute processing for determining whether any of the objects B1 to B4 exists in the region corresponding to the field of view V.
 The image extraction unit 314 extracts, from the plurality of pieces of image information (AR objects) stored in the memory 302, image information related to specific objects existing in the moving direction of the user U (display unit 10). The extraction method is set as appropriate according to the function (type of application) executed by the HMD 100; a typical example is extracting the AR objects of specific objects existing along the road on which the user U is traveling or passing.
 The image extraction unit 314 is also configured to extract image information according to the type of user information acquired from the detection unit 20 or the portable information terminal 200. For example, when the user U is stationary, the image extraction unit 314 extracts object data for the entire surroundings according to the orientation of the display unit 10 (the line-of-sight direction of the user U); when the user U is moving, it extracts object data along the moving direction.
 The image extraction unit 314 may also be configured to extract only objects related to specific objects existing in the vicinity of the user U when the moving speed of the user U (display unit 10) is equal to or lower than a predetermined value.
 The image information extraction method in the image extraction unit 314 is preset according to the type of application executed by the HMD 100 as described above, but may also be selected or set by the user's input operation. A function for disabling the above-described function of the image extraction unit 314 may also be provided.
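 As a hedged sketch of such an extraction policy (the patent leaves the policy application-dependent; the names, speed threshold, and angular tolerance below are illustrative assumptions), movement-direction filtering might look like this:

```python
import math
from dataclasses import dataclass

@dataclass
class ARObject:
    name: str
    north: float  # meters north of the user
    east: float   # meters east of the user

def extract_objects(objects, heading_deg, speed_mps,
                    stationary_below=0.5, half_angle_deg=60.0):
    """Return the objects to present.

    Stationary user (speed below threshold): keep everything and let
    the field-of-view test decide what is drawn. Moving user: keep only
    objects whose bearing lies within +/- half_angle_deg of the heading.
    """
    if speed_mps < stationary_below:
        return list(objects)
    ahead = []
    for obj in objects:
        bearing = math.degrees(math.atan2(obj.east, obj.north)) % 360.0
        # Signed angular difference folded into [-180, 180).
        diff = abs((bearing - heading_deg + 180.0) % 360.0 - 180.0)
        if diff <= half_angle_deg:
            ahead.append(obj)
    return ahead

# A12 lies almost due north (kept for a north-bound user); A15 does not.
objs = [ARObject("A12", 800, 50), ARObject("A15", -300, 900)]
print([o.name for o in extract_objects(objs, heading_deg=0.0, speed_mps=12.0)])
```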
 The display control unit 315 is configured to execute processing for presenting (drawing) in the field of view V, based on the output of the detection unit 20 (that is, the determination result of the coordinate determination unit 313), the image information (objects) extracted by the image extraction unit 314. For example, as shown in FIG. 8, when the current orientation of the field of view V overlaps the display regions of the objects B1 and B2 on the cylindrical coordinates C0, images corresponding to the overlapping regions B10 and B20 are displayed in the field of view V (local rendering).
 FIGS. 9A and 9B are diagrams explaining a method of converting the cylindrical coordinates C0 (world coordinates) into the field of view V (local coordinates).
 As shown in FIG. 9A, let (θv, hv) be the coordinates of the reference point of the field of view V on the cylindrical coordinates C0, and let (θ0, h0) be the coordinates of the reference point of an object B located in the region of the field of view V. The reference points of the field of view V and the object B may be set at any points; in this example they are set at the upper-left corners of the rectangular field of view V and object B. αv [°] is the width angle of the field of view V in the world coordinates, and its value is determined by the design or specification of the display unit 10.
 The display control unit 315 determines the display position of the object B in the field of view V by converting the cylindrical coordinate system (θ, h) into the local coordinate system (x, y). As shown in FIG. 9B, letting Hv and Wv be the height and width of the field of view V in the local coordinate system, and (x0, y0) be the coordinates of the reference point of the object B in the local coordinate system (x, y), the conversion equations are as follows:
  x0 = (θ0 − θv) · Wv / αv … (1)
  y0 = (h0 − hv) · Hv / 100 … (2)
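 Expressed directly in code, equations (1) and (2) become a two-line conversion. The sketch below (function and parameter names are illustrative) assumes θ values are already unwrapped so that θ0 − θv is the signed angular offset of the object within the field of view:

```python
def world_to_local(theta0, h0, theta_v, h_v, wv_px, hv_px, alpha_v):
    """Convert an object's cylindrical coordinates (theta0 [deg], h0 [% of Hv])
    into local field-of-view coordinates (x0, y0) per equations (1) and (2).

    (theta_v, h_v): reference point of the field of view V on the cylinder.
    wv_px, hv_px:   width and height of V in local units (e.g. pixels).
    alpha_v:        width angle of V in degrees.
    """
    x0 = (theta0 - theta_v) * wv_px / alpha_v   # equation (1)
    y0 = (h0 - h_v) * hv_px / 100.0             # equation (2)
    return x0, y0

# Example: a field of view 20 deg wide rendered at 1280x720 local units;
# an object 5 deg right of and 10% below the reference point -> (320, 72).
print(world_to_local(theta0=15.0, h0=10.0, theta_v=10.0, h_v=0.0,
                     wv_px=1280, hv_px=720, alpha_v=20.0))
```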
 In accordance with changes in the orientation of the display unit 10, the display control unit 315 moves the object B within the field of view V in the direction opposite to the movement direction of the display unit 10. That is, the display control unit 315 changes the display position of the object B within the field of view V so as to follow changes in the orientation or attitude of the display unit 10. This control continues as long as at least part of the object B exists in the field of view V.
[Operation of HMD]
 Next, a typical operation of the HMD 100 will be described.
 FIG. 10 is a flowchart explaining an outline of the operation of the HMD system according to the present embodiment.
 First, the current position of the user U (display unit 10) is measured using the position information acquisition unit 207 of the portable information terminal 200 (step 101). The position information of the display unit 10 is transmitted to the server N. Then, the portable information terminal 200 acquires from the server N object data, map information, and the like related to specific objects existing in the peripheral area (real space) within a predetermined distance (a first distance) from the user U (step 102).
 The predetermined distance is not particularly limited and is set, for example, within a radius of several hundred meters to several kilometers centered on the user U. The predetermined distance may be set by default or by the user's input operation.
 Next, the portable information terminal 200 notifies the control unit 30 that it is ready to transmit the object data. The control unit 30 (in this example, the coordinate setting unit 311) sets the height (H) and radius (R) of the cylindrical coordinates C0 as the world coordinate system according to the type of the object data and the like (step 103).
 Subsequently, the control unit 30 detects the orientation of the field of view V based on the output of the detection unit 20 (step 104), acquires the object data from the portable information terminal 200, and stores it in the memory 302 (step 105). The orientation of the field of view V is converted into the world coordinate system (θ, h), and which position on the cylindrical coordinates C0 it corresponds to is monitored.
 Next, the control unit 30 extracts, from the object data stored in the memory 302, the object data to be displayed on the display unit 10 (display area 110, field of view V) (step 106).
 At this time, when the control unit 30 (image extraction unit 314) determines from the user information acquired from the detection unit 20 or the portable information terminal 200 that the user U is stationary, it extracts object data for the entire surroundings according to the orientation of the display unit 10 (the line-of-sight direction of the user U). When the control unit 30 determines that the user U (display unit 10) is moving, it extracts the object data existing in the moving direction.
 At this time, the control unit 30 may extract the object data so that, when the moving speed of the user U is equal to or higher than a predetermined value, objects related to specific objects existing relatively far away in the moving direction are included, and, when the moving speed of the user U is equal to or lower than the predetermined value, only object data related to specific objects existing in the vicinity of the user U is extracted.
 Then, when any object data exists in the region of the cylindrical coordinates C0 corresponding to the field of view V, the control unit 30 displays (draws) that object at the corresponding position in the field of view V via the display unit 10 (step 107).
 Thereafter, the above operation is repeatedly executed. That is, the current position of the user U (display unit 10) is measured again (step 101), and when the user's current position has changed by, for example, 50 meters or more since the object data was previously acquired from the server N, new object data based on the current position is transmitted from the server N to the portable information terminal 200 (step 102). The control unit 30 extracts, from the new object data transmitted from the portable information terminal 200, the object data related to the moving direction of the user U and draws it on the display unit 10. Such control is repeatedly executed.
 Note that, as the user moves, object data that is more than the predetermined distance (first distance) away from the user may be deleted from the memory 202 of the portable information terminal 200 and the memory 302 of the control unit 30. This makes it possible to reduce the required capacity of the memories 202 and 302.
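 Steps 101 to 107 amount to a periodic update loop. A minimal skeleton of one pass of that loop is sketched below; the 50-meter threshold comes from the description above, while all function names are illustrative assumptions, with the device-specific operations injected as callables:

```python
import math

REFETCH_DISTANCE_M = 50.0  # re-acquisition threshold from the description

def distance(p, q):
    """Planar distance between two (north, east) positions in meters."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def run_once(state, measure_position, fetch_from_server,
             read_orientation, extract_for_motion, draw_in_view):
    """One pass of the loop in FIG. 10 (steps 101-107)."""
    pos = measure_position()                                   # step 101
    if state.get("last_pos") is None or (
            distance(pos, state["last_pos"]) >= REFETCH_DISTANCE_M):
        state["objects"] = fetch_from_server(pos)              # steps 102, 103, 105
        state["last_pos"] = pos
    orientation = read_orientation()                           # step 104
    visible = extract_for_motion(state["objects"], pos)        # step 106
    draw_in_view(visible, orientation)                         # step 107
```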
 Next, a specific application example will be described. In this example, an application example of image information presented to a user U driving an automobile will be described.
 FIG. 11 is a conceptual diagram showing the positional relationship between the current position of a user U who is riding a moving body such as an automobile or motorcycle and traveling north on a road R101 at a predetermined speed, and the specific objects around the user. When the server N acquires information on the current position from the portable information terminal 200, it transmits to the portable information terminal 200 the object data and map information related to the specific objects A12 to A22 in the surrounding area, for example within a radius of 10 km. The portable information terminal 200 transmits the received object data and map information related to the specific objects A12 to A22 to the control unit 30.
 The control unit 30 extracts, from the received object data, the object data related to the specific objects A12, A13, A14, A16, A19, and A22 existing along the road R101, and displays the corresponding images (objects) on the display unit 10.
 As described above, in the present embodiment, only the image information related to the specific objects existing in the moving direction of the user is presented to the user from among the plurality of pieces of image information, so information presumed to be useful to the user is preferentially presented. Accordingly, since images related to the roads R100, R102, and R103 other than the road R101 on which the user is currently traveling are not displayed, the necessary information can be accurately presented to the user without causing confusion or mistakes. In addition, since the user U is not presented with an excessive amount of information, the user's effort in sorting through information can be minimized.
 Here, when the moving speed of the user U is at or below a predetermined speed, only the relatively nearby information (objects B12, B13, B14, B16) within a second distance (for example, a 1 km radius) shorter than the first distance (10 km radius) may be presented on the display unit 10, as shown in FIG. 12A. When the moving speed of the user U exceeds the predetermined speed, relatively distant information (objects B19, B22), for example within a radius of 1 km to 10 km, may also be presented on the display unit 10, as shown in FIG. 12B. Changing the range of displayed information according to the user's moving speed in this way accommodates requests such as wanting to see only nearby information when traveling slowly, or wanting to check distant information early when traveling fast.
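 As a sketch, this speed-dependent switching reduces to a single radius function; the 40 km/h threshold is a hypothetical value, while the 1 km and 10 km radii follow the example above:

```python
SPEED_THRESHOLD_KMH = 40.0   # assumed boundary between "low" and "high" speed
SECOND_DISTANCE_M = 1_000    # near information only (FIG. 12A)
FIRST_DISTANCE_M = 10_000    # near and far information (FIG. 12B)

def display_radius_m(speed_kmh):
    return SECOND_DISTANCE_M if speed_kmh <= SPEED_THRESHOLD_KMH else FIRST_DISTANCE_M

def visible_objects(objects, user_pos, speed_kmh, distance_fn):
    # distance_fn is any planar or geodesic distance function.
    radius = display_radius_m(speed_kmh)
    return [o for o in objects if distance_fn(o["pos"], user_pos) <= radius]
```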
 When a navigation mode is being executed as an application program, the user's travel route can be estimated from the set route. Accordingly, in addition to the objects related to the road currently being traveled, objects related to roads along the set route can also be extracted and presented on the display unit 10.
 Furthermore, by detecting a predetermined operation or a predetermined posture of the user U, an image (screen) including the map information and the image information, shown as a bird's-eye or overhead view as in FIG. 11, may be presented on the display unit 10. This makes it possible to present the user with an at-a-glance overview of the objects related to the surrounding specific structures.
<Second Embodiment>
 Next, a second embodiment of the present technology will be described. The following mainly describes the points of difference from the first embodiment; descriptions of configurations identical to those of the embodiment above are omitted or simplified.
 When a user wearing the HMD drives a vehicle (moving body) such as an automobile or a motorcycle, the brightness of the displayed image may prevent the user from concentrating on driving. In particular, during acceleration, deceleration, and turning, the user's attention needs to be focused on the surrounding real-space environment.
 This embodiment therefore describes an HMD configuration that can provide an environment in which the user can concentrate on driving.
 The control unit 30 (display control unit 315) of the HMD of this embodiment is configured to acquire the user's movement acceleration as user information and to change the display mode of the image information presented in the display area according to the magnitude of that acceleration.
 FIG. 13 is a flowchart showing an example of the operation of the control unit 30.
 As in the first embodiment, the control unit 30 acquires from the portable information terminal 200 the object data, map information, and moving-speed information for the peripheral region referenced to the user's current position, and from this information extracts the object data along the road on which the user is currently traveling (steps 201 to 205). The control unit 30 then determines the display mode of the extracted objects based on the user's movement-acceleration information acquired from the detection unit 20, and displays them on the display unit 10 (steps 206 and 207). For the user's movement acceleration, the acceleration of the display unit 10 detected by the acceleration sensor built into the detection unit 20 is used.
 When the user is driving an automobile or a motorcycle, the user's attention must be concentrated on driving under certain conditions such as acceleration and deceleration. The HMD of this embodiment is therefore configured to deliberately reduce the visibility of displayed objects so as to direct the user's attention to the real-space field of view rather than to the objects.
 Typically, the control unit 30 (display control unit 315) directs the user's attention to the real-space environment by changing at least one of the luminance (brightness), color, contrast, position, size, and information amount of the object according to the magnitude of the user's movement acceleration. Specifically, the greater the acceleration, the more the object's visibility is reduced, by lowering its display luminance, shifting it to a paler color, or reducing its contrast. Alternatively, the greater the acceleration, the more the forward real-space view is kept clear, by moving the object from the center toward the periphery of the field of view, shrinking it, or reducing its information amount.
 (Changing the object luminance)
 The control unit 30 (display control unit 315) is configured to present the object in the display area 110 at a first luminance when the user's movement acceleration is at or below a predetermined value, and at a second luminance lower than the first luminance when the movement acceleration exceeds that value.
 The user's movement acceleration is typically taken as the magnitude of the combined vector of the accelerations along the three axis directions X, Y, and Z (see FIG. 1). Since gravitational acceleration always acts along the Z axis, the movement acceleration exceeds 1 G except when the user is stationary or moving at constant velocity. Setting the predetermined value to 1 G therefore allows the user's movement to be detected reliably.
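 In code, this movement test reduces to comparing the three-axis vector magnitude with 1 G; a minimal sketch:

```python
import math

G = 9.80665  # standard gravity in m/s^2

def is_moving(ax, ay, az):
    # Magnitude of the combined three-axis vector; exceeds 1 G whenever the
    # wearer is neither stationary nor moving at constant velocity.
    return math.sqrt(ax * ax + ay * ay + az * az) > G
```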
 The object luminance (brightness) is not limited to the two levels of the first and second luminance; the second luminance may instead be lowered gradually as the acceleration increases. The manner of lowering is not particularly limited: as shown in FIG. 14, for example, the luminance may fall linearly with increasing acceleration, or in steps.
 In the example of FIG. 14 the luminance reaches zero at an acceleration of 2 G, but this is not a limitation; the luminance may instead reach zero at an acceleration exceeding 2 G. Alternatively, the display luminance of the object may be lowered according to the rate of change of the acceleration; for example, when the rate of change is at or above a predetermined value, the display luminance may be made very low temporarily.
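 A possible realization of the falloff of FIG. 14, assuming the linear variant with a 2 G endpoint; as noted above, a higher endpoint or a stepwise curve would serve equally well:

```python
def object_luminance(accel_g, first_luminance=1.0):
    # First luminance up to 1 G, then a linear falloff reaching zero at 2 G;
    # the clamp keeps anything at or beyond 2 G at zero.
    if accel_g <= 1.0:
        return first_luminance
    return first_luminance * max(0.0, 2.0 - accel_g)
```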
 When the acceleration decreases, the display luminance of the object may be restored so as to follow it, or restoration may be deferred until a certain time has elapsed after the decrease. This slows the luminance changes of the object when acceleration and deceleration alternate relatively quickly. A low-pass filter may also be interposed on the output of the acceleration sensor to suppress the effects of vibration during driving, road gradients, and the like.
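 A minimal sketch combining both measures, with ALPHA and HOLD_OFF_S as illustrative tuning constants:

```python
ALPHA = 0.1       # smoothing factor of the first-order low-pass filter
HOLD_OFF_S = 2.0  # delay before the luminance is allowed to recover

class AccelForDimming:
    def __init__(self):
        self.filtered = 1.0    # low-pass-filtered magnitude, in G
        self.effective = 1.0   # value actually fed to the luminance curve
        self.low_since = None  # time at which the acceleration began falling

    def update(self, raw_g, now_s):
        # The low-pass filter rejects driving vibration and road-gradient noise.
        self.filtered += ALPHA * (raw_g - self.filtered)
        if self.filtered >= self.effective:
            self.effective = self.filtered   # dim immediately while accelerating
            self.low_since = None
        else:
            if self.low_since is None:
                self.low_since = now_s
            if now_s - self.low_since >= HOLD_OFF_S:
                self.effective = self.filtered   # recover only after the hold-off
        return self.effective
```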
 The above dimming control may be executed as part of the functions of the display control unit 315, or, as shown in FIG. 15, a dimming unit 306 provided separately in the control unit 30 may be used in combination. As shown in the figure, the dimming unit 306 includes a manual dimming unit 41, an automatic dimming unit 42, a changeover switch 43, and a dimming correction unit 44.
 The manual dimming unit 41 lets the user set the object luminance arbitrarily, for example via the input operation unit 305 (see FIG. 5). The automatic dimming unit 42 outputs a luminance value preset according to the output of the illuminance sensor 21 built into the detection unit 20. The changeover switch 43 switches between manual and automatic adjustment, feeding the output of the manual dimming unit 41 to the dimming correction unit 44 for manual adjustment, or the output of the automatic dimming unit 42 for automatic adjustment. Based on the output of the acceleration sensor 22 built into the detection unit 20, the dimming correction unit 44 applies a correction that reduces the luminance value in the manner shown in FIG. 14 and outputs the result to the CPU 301 (display control unit 315).
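 A rough functional model of this signal path, assuming the illuminance curve of auto_luminance() and reusing the object_luminance() falloff sketched earlier; both are illustrations, not the specified implementation:

```python
def auto_luminance(illuminance_lux):
    # Assumed illuminance-to-luminance curve of the automatic dimming unit 42:
    # brighter surroundings call for a brighter object, capped at full level.
    return min(1.0, illuminance_lux / 1000.0)

def dimming_unit(manual_level, illuminance_lux, use_auto, accel_g):
    # Changeover switch 43 selects the base level; the correction stage then
    # reduces it with acceleration, as in FIG. 14 (object_luminance above).
    base = auto_luminance(illuminance_lux) if use_auto else manual_level
    return object_luminance(accel_g, first_luminance=base)
```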
 (Changing the object position, size, etc.)
 FIG. 16A is a schematic view of the field of view V showing an example of changing the position and size of an object. FIG. 16B is a schematic view of the field of view V showing an example of changing the position and information amount of an object B30. As shown in these figures, when the acceleration exceeds 1 G, a wide forward view can be secured by moving the object B30 to a corner or the lower part of the field of view V, or by displaying only part of the object.
 In addition to the control that changes the position, size, and information amount of the object B30 as the acceleration increases, the display luminance of the object may be lowered at the same time, as described above.
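 A hypothetical layout rule combining these changes; the anchor names, scale factor, and one-line truncation are assumptions chosen to match FIGS. 16A and 16B:

```python
def layout_object(accel_g, full_lines):
    # Below 1 G: centered, full size, all text lines.
    if accel_g <= 1.0:
        return {"anchor": "center", "scale": 1.0, "lines": full_lines}
    # Above 1 G: tuck the object into a lower corner, shrink it, and keep
    # only the headline so the forward view stays clear.
    return {"anchor": "bottom_right", "scale": 0.5, "lines": full_lines[:1]}
```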
<Third Embodiment>
 Next, a third embodiment of the present technology will be described. The following mainly describes the points of difference from the first embodiment; descriptions of configurations identical to those of the embodiments above are omitted or simplified.
 This embodiment describes an example in which the driving state, driving time, and the like of a vehicle are acquired as user information, and predetermined information is presented on the display unit 10 based on that information.
 In this embodiment, the control unit 30 is configured to be able to estimate the behavior of the vehicle from the acceleration information (user information) acquired by the detection unit 20. For example, as shown in FIG. 17, when the user is riding a motorcycle, the control unit 30 calculates the centrifugal force acting on the display unit 10 from the acceleration information (for example, the lateral acceleration) acquired by the detection unit 20, and from that value estimates the bank angle of the vehicle body during turning or the like. When the bank angle reaches or exceeds a predetermined value, the control unit 30 presents warning information on the display unit 10 to alert the user to the risk of a fall. The warning information may be text, graphics, or a combination of the two.
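 A sketch of this estimate, assuming a coordinated turn so that the lean angle follows from the ratio of lateral (centripetal) acceleration to gravity; the 45° warning threshold is an illustrative value, not from the text:

```python
import math

G = 9.80665           # standard gravity in m/s^2
WARN_BANK_DEG = 45.0  # illustrative warning threshold

def bank_angle_deg(lateral_accel_ms2):
    # For a coordinated turn: tan(bank) = a_lateral / g.
    return math.degrees(math.atan2(abs(lateral_accel_ms2), G))

def warning_for(lateral_accel_ms2):
    if bank_angle_deg(lateral_accel_ms2) >= WARN_BANK_DEG:
        return "Excessive bank angle - risk of falling"  # shown on display unit 10
    return None
```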
 The remaining amount of fuel in the vehicle may also be detected, and when it falls below a predetermined level, information to that effect may be presented to the user. This information may be text or graphics as described above, or information such as the locations of nearby gas stations may be displayed as AR. This example can provide useful information not only to ordinary vehicle users but also to riders of motorcycles not equipped with a fuel gauge.
 As another example, as shown in FIG. 18, the control unit 30 is configured to estimate the degree of fatigue of the driving user from travel-time information (user information) acquired from the portable information terminal 200 (position information acquisition unit 207), and to present information on rest points such as service areas and convenience stores on the display unit 10. In this example, as described in the first embodiment, information on rest points along the road being traveled is presented. Information useful to the user can thereby be provided at a timely moment.
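 One conceivable heuristic, assuming a two-hour continuous-driving threshold and reusing the objects_along_road() filter sketched earlier; both the threshold and the type tags are assumptions:

```python
FATIGUE_AFTER_S = 2 * 60 * 60  # assumed continuous-driving threshold (2 hours)

def rest_point_suggestions(driving_time_s, objects, road_polyline):
    # Surface rest points along the current road once the driver is
    # presumed fatigued.
    if driving_time_s < FATIGUE_AFTER_S:
        return []
    rest_types = {"service_area", "convenience_store"}
    return [o for o in objects_along_road(objects, road_polyline)
            if o.get("type") in rest_types]
```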
 As yet another example, as shown in FIG. 19, the control unit 30 is configured to estimate the user's destination from the moving direction acquired from the portable information terminal 200 (position information acquisition unit 207), and to present information related to that destination (for example, station departure times and train service information) on the display unit 10. Information such as the weather at the destination may also be provided. This configuration, too, makes it possible to provide useful information to the user at a timely moment.
 In this example, the user's past behavior history may also be consulted as user information. That information is accumulated in the memory 302 of the control unit 30 or the memory 202 of the portable information terminal 200, which improves the accuracy of estimating the user's behavior or its purpose. For a user who jogs habitually, for example, the behavior can be estimated from the departure time, moving speed, and the like, and the destination can be estimated from the route taken. Information matching that behavior or destination can then be provided to the user.
 Although embodiments of the present technology have been described above, the present technology is not limited to those embodiments, and various modifications may of course be made without departing from the gist of the present technology.
 For example, although the above embodiments describe applications of the present technology to an HMD, the present technology is also applicable to image display devices other than HMDs, such as a head-up display (HUD) mounted at the driver's seat of a vehicle or in the cockpit of an aircraft.
 The above embodiments describe applications to a see-through (transmissive) HMD, but the present technology is also applicable to non-transmissive HMDs. In that case, the predetermined objects according to the present technology may be displayed over the external field of view captured by a camera attached to the display unit.
 Furthermore, although the above embodiments describe an HMD worn on the user's head as the wearable display, the present technology is not limited to this and is also applicable to display devices worn on the user's arm, wrist, or the like, and to display devices worn directly on the eyeball, such as contact lenses.
 The present technology may also have the following configurations.
(1) A wearable display including:
 a display unit configured to be wearable by a user and having a display area that provides the user with a field of view of real space; and
 a control unit configured to extract, based on user information including information related to the position and behavior of the user and on a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user, the image information related to the specific objects existing in the moving direction of the user from the plurality of pieces of image information, and to present the extracted image information in the display area.
(2) The wearable display according to (1), in which
 the user information includes map information around the user's position, and
 the control unit extracts, from the plurality of pieces of image information, the image information related to the specific objects existing along the road on which the user travels.
(3) The wearable display according to (2), in which
 the control unit acquires the moving speed of the user as the user information, and when the moving speed is at or below a predetermined value, presents in the display area the image information related to the specific objects existing within a second distance from the user shorter than the first distance.
(4) The wearable display according to (2) or (3), in which
 the control unit is configured to be capable of presenting an image including the map information and the image information in the display area.
(5) The wearable display according to any one of (1) to (4), in which
 the control unit acquires the movement acceleration of the user as the user information, and changes the display mode of the image information presented in the display area according to the magnitude of the movement acceleration.
(6) The wearable display according to (5), in which
 the control unit presents the image information in the display area at a first luminance when the movement acceleration of the user is at or below a predetermined value, and presents the image information in the display area at a second luminance lower than the first luminance when the movement acceleration of the user exceeds the predetermined value.
(7) The wearable display according to (6), in which
 the control unit gradually lowers the second luminance as the movement acceleration of the user increases.
(8) The wearable display according to any one of (5) to (7), in which
 the control unit changes at least one of the position, size, and information amount of the image information presented in the display area according to the magnitude of the acceleration.
(9) The wearable display according to any one of (1) to (8), further including
 a detection unit capable of detecting, as the user information, the acceleration acting on the display unit.
(10) An image display device including:
 a display unit having a display area; and
 a control unit configured to extract, based on user information including information related to the position and behavior of a user and on a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user, the image information related to the specific objects existing in the moving direction of the user from the plurality of pieces of image information, and to present the extracted image information in the display area.
(11) An image display system including:
 a display unit having a display area;
 an acquisition unit that acquires user information including information related to the position and behavior of a user, and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user; and
 a control unit that extracts, from the plurality of pieces of image information, the image information related to the specific objects existing in the moving direction of the user, and presents the extracted image information in the display area.
 DESCRIPTION OF SYMBOLS
 10 … display unit
 20 … detection unit
 30 … control unit
 100 … head-mounted display (HMD)
 110 … display area
 200 … portable information terminal
 314 … display control unit
 A1–A4, A12–A22 … specific objects
 B1–B4, B12–B14, B16, B19, B22 … objects
 V … field of view

Claims (11)

  1.  A wearable display comprising:
     a display unit configured to be wearable by a user and having a display area that provides the user with a field of view of real space; and
     a control unit configured to extract, based on user information including information related to the position and behavior of the user and on a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user, the image information related to the specific objects existing in the moving direction of the user from the plurality of pieces of image information, and to present the extracted image information in the display area.
  2.  The wearable display according to claim 1, wherein
     the user information includes map information around the user's position, and
     the control unit extracts, from the plurality of pieces of image information, the image information related to the specific objects existing along the road on which the user travels.
  3.  The wearable display according to claim 2, wherein
     the control unit acquires the moving speed of the user as the user information, and when the moving speed is at or below a predetermined value, presents in the display area the image information related to the specific objects existing within a second distance from the user shorter than the first distance.
  4.  The wearable display according to claim 2, wherein
     the control unit is configured to be capable of presenting an image including the map information and the image information in the display area.
  5.  The wearable display according to claim 1, wherein
     the control unit acquires the movement acceleration of the user as the user information, and changes the display mode of the image information presented in the display area according to the magnitude of the movement acceleration.
  6.  The wearable display according to claim 5, wherein
     the control unit presents the image information in the display area at a first luminance when the movement acceleration of the user is at or below a predetermined value, and presents the image information in the display area at a second luminance lower than the first luminance when the movement acceleration of the user exceeds the predetermined value.
  7.  The wearable display according to claim 6, wherein
     the control unit gradually lowers the second luminance as the movement acceleration of the user increases.
  8.  The wearable display according to claim 5, wherein
     the control unit changes at least one of the position, size, and information amount of the image information presented in the display area according to the magnitude of the acceleration.
  9.  The wearable display according to claim 1, further comprising
     a detection unit capable of detecting, as the user information, the acceleration acting on the display unit.
  10.  An image display device comprising:
     a display unit having a display area; and
     a control unit configured to extract, based on user information including information related to the position and behavior of a user and on a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user, the image information related to the specific objects existing in the moving direction of the user from the plurality of pieces of image information, and to present the extracted image information in the display area.
  11.  An image display system comprising:
     a display unit having a display area;
     an acquisition unit that acquires user information including information related to the position and behavior of a user, and a plurality of pieces of image information related to a plurality of specific objects existing in a peripheral region within a first distance from the user; and
     a control unit that extracts, from the plurality of pieces of image information, the image information related to the specific objects existing in the moving direction of the user, and presents the extracted image information in the display area.
PCT/JP2016/004319 2015-10-30 2016-09-23 Wearable display, image display device, and image display system WO2017073014A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015214469 2015-10-30
JP2015-214469 2015-10-30

Publications (1)

Publication Number Publication Date
WO2017073014A1 (en)

Family

ID=58630207

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/004319 WO2017073014A1 (en) 2015-10-30 2016-09-23 Wearable display, image display device, and image display system

Country Status (1)

Country Link
WO (1) WO2017073014A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012216135A (en) * 2011-04-01 2012-11-08 Olympus Corp Image generation system, program, and information storage medium
JP2013072778A (en) * 2011-09-28 2013-04-22 Toyota Motor Corp Image displaying apparatus for vehicle use
WO2014038044A1 (en) * 2012-09-06 2014-03-13 パイオニア株式会社 Display device, display method, program, and recording medium
JP2014120109A (en) * 2012-12-19 2014-06-30 Kddi Corp Program for determining prospect area according to road information, information terminal, server and method



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16859254; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
NENP Non-entry into the national phase (Ref country code: JP)
122 Ep: pct application non-entry in european phase (Ref document number: 16859254; Country of ref document: EP; Kind code of ref document: A1)