WO2016088420A1 - Display control device, display control method, and program - Google Patents
Display control device, display control method, and program
- Publication number
- WO2016088420A1 (PCT/JP2015/075492)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display control
- control unit
- roll angle
- display
- field
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2215/00—Indexing scheme for image rendering
- G06T2215/16—Using real world measurements to influence rendering
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0492—Change of orientation of the displayed image, e.g. upside-down, mirrored
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/14—Solving problems related to the presentation of information to be displayed
Definitions
- This technology relates to a display control device, a display control method, and a program.
- This technique relates to AR (augmented reality) display, in which virtual objects are superimposed on the user's view of the real world.
- In this context, this technique proposes a way to improve the searchability of objects displayed in the user's field of view.
- According to this technique, a display control device is provided that includes a display control unit configured to display, in a user's visual field, an object corresponding to at least one of the yaw angle and the pitch angle of a display unit, the display control unit being operable in a first control mode in which the position of the object in the visual field does not depend on the roll angle of the display unit.
- Also, a display control method is provided in which an object corresponding to at least one of the yaw angle and the pitch angle of the display unit is displayed in the user's visual field, the method being operable in a first mode in which the position of the object in the visual field does not depend on the roll angle of the display unit.
- Also, a program is provided for causing a computer to function as a display control device that includes a display control unit for displaying, in a user's visual field, an object corresponding to at least one of the yaw angle and the pitch angle of the display unit, the display control unit being operable in a first mode in which the position of the object in the visual field does not depend on the roll angle of the display unit.
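The first control mode above can be sketched as follows: the on-screen position of an object is derived from yaw and pitch offsets alone, and the roll angle, although accepted as an argument, is deliberately never used. This is an illustrative Python sketch, not the patent's implementation; the function name and the `deg_per_px` scale factor are assumptions.

```python
def object_screen_position(obj_yaw, obj_pitch, view_yaw, view_pitch,
                           view_roll=0.0, deg_per_px=0.1):
    """Place an object in the visual field from yaw/pitch offsets only.

    First control mode: view_roll is accepted but intentionally unused,
    so the object's position in the visual field does not depend on the
    roll angle of the display unit. (Hypothetical helper; deg_per_px is
    an assumed display constant.)
    """
    dx = (obj_yaw - view_yaw) / deg_per_px        # horizontal offset in pixels
    dy = (obj_pitch - view_pitch) / deg_per_px    # vertical offset (depression positive)
    return dx, dy
```

Calling the function with two different roll values returns the same position, which is exactly the roll-angle independence the claim describes.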
- It is a development view of the cylindrical coordinates shown in FIG. 5A and FIG. 5B. It is an explanatory diagram of coordinate positions in the cylindrical coordinate system.
- FIG. 10 is a diagram explaining a detailed example in which the orientation of the AR object with respect to the user's field of view does not depend on the roll angle when the display method of the AR object has no roll-angle dependency. It is a diagram explaining an example of updating the roll limit angle. It is a flowchart showing an example of the drawing operation of an AR object. It is a flowchart showing another example of the drawing operation of an AR object. It is a diagram showing a display example of an AR object when the display method of the AR object has roll-angle dependency.
- FIG. 1 is a schematic diagram illustrating functions of a head mounted display (hereinafter referred to as “HMD”) according to an embodiment of the present technology.
- HMD head mounted display
- the X-axis direction and the Y-axis direction indicate horizontal directions orthogonal to each other, and the Z-axis direction indicates a vertical axis direction.
- These XYZ orthogonal coordinate systems represent a coordinate system (real three-dimensional coordinate system) of the real space to which the user belongs, the X-axis arrow indicates the north direction, and the Y-axis arrow indicates the east direction.
- the Z-axis arrow indicates the direction of gravity.
- the HMD 100 is mounted on the head of the user U and configured to display a virtual image in the visual field V (display visual field) of the real space of the user U.
- the image displayed in the visual field V includes information related to the predetermined objects A1, A2, A3, A4 existing in the visual field V.
- Examples of the predetermined objects include landscapes, stores, and products existing around the user U.
- the HMD 100 stores in advance images (hereinafter also referred to as objects) B1, B2, B3, and B4 associated with a virtual world coordinate system surrounding the user U wearing the HMD.
- the world coordinate system is a coordinate system equivalent to the real space to which the user belongs, and determines the positions of the objects A1 to A4 based on the position of the user U and a predetermined axial direction.
- In this embodiment, the world coordinate system is the cylindrical coordinates C0 about the vertical axis, but other three-dimensional coordinates, such as celestial coordinates centered on the user U, may be adopted.
- the radius R and height H of the cylindrical coordinates C0 can be set arbitrarily.
- the radius R is set shorter than the distance from the user U to the objects A1 to A4, but may be longer than the above distance.
- the height H is set to a size equal to or greater than the height (length in the vertical direction) Hv of the user's U visual field V provided via the HMD 100.
- The objects B1 to B4 are images displaying information related to the objects A1 to A4 existing in the world coordinate system, and may be images including characters, patterns, and the like, or may be animation images.
- the object may be a two-dimensional image or a three-dimensional image.
- the shape of the object may be a rectangle, a circle, or other geometric shape, and can be set as appropriate depending on the type of the object.
- The coordinate positions of the objects B1 to B4 on the cylindrical coordinates C0 are associated with, for example, the intersections of the user's line of sight L looking at the objects A1 to A4 with the cylindrical coordinates C0.
- the center position of each of the objects B1 to B4 is made coincident with the intersection position.
- the present invention is not limited to this, and a part of the periphery of the object (for example, part of the four corners) may coincide with the intersection position.
- the coordinate positions of the objects B1 to B4 may be associated with any position away from the intersection position.
- The cylindrical coordinates C0 have a circumferential coordinate axis (θ) representing the angle around the vertical axis with the north direction as 0°, and a height-direction coordinate axis (h) representing the angle in the vertical direction relative to the horizontal line of sight Lh of the user U.
- The coordinate axis (θ) takes the eastward direction as positive, and the coordinate axis (h) takes the depression angle as positive and the elevation angle as negative.
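Under these axis conventions (X: north, Y: east, Z: gravity, i.e. downward), a direction in real space maps onto the coordinate axes (θ, h) as in the following sketch. The formulas are a plausible reading of the stated conventions, not given explicitly in the text.

```python
import math

def to_cylindrical_angles(x, y, z):
    """Convert a direction in the real coordinate system (X: north,
    Y: east, Z: gravity/down) to the cylinder axes (theta, h).

    theta: angle around the vertical axis, north = 0 deg, east positive.
    h:     angle from the horizontal line of sight Lh, depression positive
           (a point below eye level, z > 0, yields positive h).
    Illustrative sketch only.
    """
    theta = math.degrees(math.atan2(y, x)) % 360.0  # east of north is positive
    horiz = math.hypot(x, y)
    h = math.degrees(math.atan2(z, horiz))          # downward is positive
    return theta, h
```

For example, a direction pointing due east at eye level gives θ = 90° and h = 0°.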
- The HMD 100 has a detection unit for detecting the viewpoint direction of the user U, and judges, based on the output of the detection unit, which region on the cylindrical coordinates C0 the user's visual field V corresponds to. When any object (for example, the object B1) exists in the corresponding region of the xy coordinate system forming the visual field V, the HMD 100 displays (draws) that object in the corresponding region.
- the HMD 100 provides information related to the target A1 to the user U by displaying the object B1 in the field of view V so as to overlap the target A1 in the real space. Further, the HMD 100 can provide the user U with objects (B1 to B4) related to the predetermined objects A1 to A4 according to the orientation or direction of the viewpoint of the user U.
- FIG. 2 is an overall view showing the HMD 100
- FIG. 3 is a block diagram showing its configuration.
- the HMD 100 includes a display unit 10, a detection unit 20 that detects the attitude of the display unit 10, and a control unit 30 that controls driving of the display unit 10.
- the HMD 100 is configured as a see-through HMD that can provide a user with a visual field V in real space.
- the display unit 10 is configured to be attachable to the user U's head.
- the display unit 10 includes first and second display surfaces 11R and 11L, first and second image generation units 12R and 12L, and a support body 13.
- The first and second display surfaces 11R and 11L are each composed of a transparent optical element capable of providing the real space (external field of view) to the right eye and the left eye of the user U, respectively.
- the first and second image generation units 12R and 12L are configured to be able to generate images to be presented to the user U via the first and second display surfaces 11R and 11L, respectively.
- The support body 13 supports the display surfaces 11R and 11L and the image generation units 12R and 12L, and has a shape suitable for being worn on the user's head such that the first and second display surfaces 11R and 11L face the right eye and the left eye of the user U, respectively.
- The display unit 10 configured as described above provides the user U with a field of view V in which a predetermined image (or virtual image) is superimposed on the real space via the display surfaces 11R and 11L. In this case, cylindrical coordinates C0 for the right eye and cylindrical coordinates C0 for the left eye are set, and the objects drawn on each set of cylindrical coordinates are projected onto the display surfaces 11R and 11L.
- the detection unit 20 is configured to be able to detect a change in orientation or posture around at least one axis of the display unit 10.
- the detection unit 20 is configured to detect a change in orientation or posture of the display unit 10 around the X, Y, and Z axes.
- the orientation of the display unit 10 typically means the front direction of the display unit.
- the orientation of the display unit 10 is defined as the face direction of the user U.
- the detection unit 20 can be configured by a motion sensor such as an angular velocity sensor or an acceleration sensor, or a combination thereof.
- the detection unit 20 may be configured by a sensor unit in which each of the angular velocity sensor and the acceleration sensor is arranged in three axis directions, or the sensor to be used may be different depending on each axis.
- An integrated value of the output of the angular velocity sensor can be used to obtain the posture change of the display unit 10, the direction of the change, the amount of the change, and the like.
- a geomagnetic sensor may be employed for detecting the orientation of the display unit 10 around the vertical axis (Z axis).
- The geomagnetic sensor and the motion sensor may also be combined, which makes it possible to detect changes in orientation or posture with high accuracy.
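The integration of the angular-velocity output mentioned above can be sketched as simple Euler integration of the gyro rates; the patent does not specify the filter actually used, so this is illustrative only.

```python
def integrate_gyro(orientation_deg, gyro_dps, dt_s):
    """Update the display-unit orientation by integrating angular
    velocity, as the detection unit 20 can do with its angular
    velocity sensor. Plain Euler integration for illustration; a real
    implementation would also fuse accelerometer/geomagnetic data to
    correct drift, as the text suggests.

    orientation_deg: (yaw, pitch, roll) in degrees
    gyro_dps:        angular rates in degrees per second
    dt_s:            time step in seconds
    """
    return tuple(a + w * dt_s for a, w in zip(orientation_deg, gyro_dps))
```

For instance, a 10°/s yaw rate over 0.5 s advances the yaw by 5°.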
- the detection unit 20 is arranged at an appropriate position on the display unit 10.
- the position of the detection unit 20 is not particularly limited.
- the detection unit 20 is disposed on one of the image generation units 12R and 12L or a part of the support 13.
- the control unit 30 (first control unit) generates a control signal for controlling driving of the display unit 10 (image generation units 12R and 12L) based on the output of the detection unit 20.
- the control unit 30 is electrically connected to the display unit 10 via a connection cable 30a.
- the present invention is not limited to this, and the control unit 30 may be connected to the display unit 10 through a wireless communication line.
- The control unit 30 includes a CPU 301, a memory 302 (storage unit), a transmission / reception unit 303, an internal power supply 304, and an input operation unit 305.
- the CPU 301 controls the operation of the entire HMD 100.
- The memory 302 includes a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, and stores programs for the CPU 301 to execute control of the HMD 100, various parameters, images (objects) to be displayed on the display unit 10, and other necessary data.
- the transmission / reception unit 303 constitutes an interface for communication with the portable information terminal 200 described later.
- the internal power supply 304 supplies power necessary for driving the HMD 100.
- the input operation unit 305 is for controlling an image displayed on the display unit 10 by a user operation.
- the input operation unit 305 may be configured with a mechanical switch or a touch sensor.
- the input operation unit 305 may be provided in the display unit 10.
- the HMD 100 may further include a sound output unit such as a speaker, a camera, and the like.
- the audio output unit and the camera are typically provided in the display unit 10.
- the control unit 30 may be provided with a display device that displays an input operation screen or the like of the display unit 10.
- the input operation unit 305 may be configured by a touch panel provided in the display device.
- the portable information terminal 200 (second control unit) is configured to be able to communicate with the control unit 30 via a wireless communication line.
- the portable information terminal 200 has a function of acquiring an image to be displayed on the display unit 10 and a function of transmitting the acquired image to the control unit 30.
- the portable information terminal 200 is organically combined with the HMD 100 to construct an HMD system.
- The portable information terminal 200 is carried by the user U wearing the display unit 10, and is composed of an information processing device such as a personal computer (PC: Personal Computer), a smartphone, a mobile phone, a tablet PC, or a PDA (Personal Digital Assistant).
- a terminal device dedicated to the HMD 100 may be used.
- the portable information terminal 200 includes a CPU 201, a memory 202, a transmission / reception unit 203, an internal power supply 204, a display unit 205, a camera 206, and a position information acquisition unit 207.
- the CPU 201 controls the operation of the mobile information terminal 200 as a whole.
- the memory 202 includes a ROM, a RAM, and the like, and stores programs and various parameters for executing control of the portable information terminal 200 by the CPU 201, images (objects) transmitted to the control unit 30, and other necessary data.
- the internal power supply 204 supplies power necessary for driving the portable information terminal 200.
- The transmission / reception unit 203 connects to the server N, the control unit 30, other nearby portable information terminals, and the like, using a wireless LAN (IEEE 802.11, etc.) such as WiFi (Wireless Fidelity) or a 3G or 4G mobile communication network.
- the portable information terminal 200 downloads an image (object) to be transmitted to the control unit 30 and an application for displaying the image from the server N via the transmission / reception unit 203 and stores them in the memory 202.
- The server N is typically configured by a computer including a CPU, a memory, and the like, and transmits predetermined information to the portable information terminal 200 in response to a request from the user U, or automatically regardless of the intention of the user U.
- the display unit 205 is composed of, for example, an LCD or an OLED, and displays various menus, application GUIs, and the like. Typically, the display unit 205 is integrated with a touch panel and can accept a user's touch operation.
- the portable information terminal 200 is configured to be able to input a predetermined operation signal to the control unit 30 by a touch operation on the display unit 205.
- the location information acquisition unit 207 typically includes a GPS (Global Positioning System) receiver.
- The portable information terminal 200 is configured to measure the current position (longitude, latitude, altitude) of the user U (display unit 10) using the position information acquisition unit 207, and to acquire necessary images (objects) from the server N. That is, the server N acquires information on the current position of the user, and transmits image data, application software, and the like corresponding to that position information to the portable information terminal 200.
- FIG. 4 is a functional block diagram of the CPU 301.
- the CPU 301 includes a coordinate setting unit 311, an image management unit 312, a coordinate determination unit 313, and a display control unit 314.
- the CPU 301 executes processing in the coordinate setting unit 311, the image management unit 312, the coordinate determination unit 313, and the display control unit 314 according to the program stored in the memory 302.
- the coordinate setting unit 311 is configured to execute processing for setting three-dimensional coordinates surrounding the user U (display unit 10).
- cylindrical coordinates C0 (see FIG. 1) centered on the vertical axis Az are used as the three-dimensional coordinates.
- the coordinate setting unit 311 sets the radius R and the height H of the cylindrical coordinates C0.
- the coordinate setting unit 311 typically sets the radius R and the height H of the cylindrical coordinates C0 according to the number and type of objects to be presented to the user U.
- the radius R of the cylindrical coordinates C0 may be a fixed value, but may be a variable value that can be arbitrarily set according to the size (pixel size) of the image to be displayed.
- The height H of the cylindrical coordinates C0 is set to, for example, 1 to 3 times the height Hv (see FIG. 1) of the visual field V in the vertical direction provided to the user U by the display unit 10.
- the upper limit of the height H is not limited to 3 times Hv, but may be a size exceeding 3 times Hv.
- FIG. 5A shows a cylindrical coordinate C0 having the same height H1 as the height Hv of the visual field V.
- FIG. 5B shows a cylindrical coordinate C0 having a height H2 that is three times the height Hv of the field of view V.
- FIGS. 6A and 6B are schematic diagrams showing the cylindrical coordinates C0 in an expanded manner.
- The cylindrical coordinates C0 have a circumferential coordinate axis (θ) representing the angle around the vertical axis with the north direction as 0°, and a height-direction coordinate axis (h) representing the angle in the vertical direction relative to the horizontal line of sight Lh of the user U.
- The coordinate axis (θ) takes the eastward direction as positive, and the coordinate axis (h) takes the depression angle as positive and the elevation angle as negative.
- The coordinate setting unit 311 also functions as a region limiting unit capable of limiting the display region along one axial direction of the field of view V in the three-dimensional coordinates surrounding the display unit 10.
- the coordinate setting unit 311 limits the visual field region (Hv) in the height direction of the visual field V at the cylindrical coordinates C0 surrounding the display unit 10.
- In this embodiment, when the specified value of the height (H) of the cylindrical coordinates is larger than the height Hv of the visual field V, the coordinate setting unit 311 limits the height (H) according to the region of the visual field V in the height direction. The coordinate setting unit 311 also limits the height of the cylindrical coordinates, for example from H2 (FIG. 5B) to H1 (FIG. 5A).
- As shown in FIGS. 5A and 6A, when the height (H1) of the cylindrical coordinates is the same as the height of the visual field V, an image (object) that would be seen at an elevation angle of −90° to +90° becomes visible as an image (object) seen at a pitch angle of 0°, so the searchability and visibility of images (objects) can be improved. In this case, images (objects) can be seen with little discomfort over an elevation-angle range of −60° to +60°.
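The region limiting performed by the coordinate setting unit 311 can be sketched as clamping the cylinder height to the view height Hv. The `clamp_object_h` helper, which keeps an object's height coordinate inside the limited region so it remains reachable, is an assumed consequence added for illustration, not taken from the patent text.

```python
def limit_cylinder_height(h_specified, hv_view):
    """Region limiting: if the specified cylinder height H exceeds the
    height Hv of the field of view, restrict it to Hv (e.g. from H2 in
    FIG. 5B down to H1 in FIG. 5A). Sketch only."""
    return min(h_specified, hv_view)

def clamp_object_h(obj_h, limited_h):
    """Keep an object's height coordinate within the limited region,
    assumed symmetric about the horizontal line of sight (hypothetical
    helper for illustration)."""
    half = limited_h / 2.0
    return max(-half, min(half, obj_h))
```

With a view height of 1.0, a cylinder specified at height 3.0 is limited to 1.0, and an object at h = 0.9 is pulled back to the region edge at 0.5.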
- The image management unit 312 has a function of managing the images stored in the memory 302. For example, it executes processing for storing one or more images to be displayed via the display unit 10 in the memory 302, and for selectively deleting images stored in the memory 302. The images stored in the memory 302 are transmitted from the portable information terminal 200, and the image management unit 312 also requests the portable information terminal 200 to transmit images via the transmission / reception unit 303.
- the memory 302 is configured to store one or a plurality of images (objects) to be displayed in the visual field V in association with the cylindrical coordinates C0. That is, the memory 302 stores the individual objects B1 to B4 on the cylindrical coordinates C0 shown in FIG. 1 together with the coordinate positions on the cylindrical coordinates C0.
- Each of the objects B1 to B4 to be displayed corresponding to the azimuth or orientation of the visual field V occupies a coordinate region on the cylindrical coordinates C0, and is stored in the memory 302 together with a specific coordinate position P(θ, h) within that region.
- The coordinates (θ, h) of the objects B1 to B4 on the cylindrical coordinates C0 correspond to the cylindrical-coordinate values at the point where a straight line connecting the position of each of the objects A1 to A4, defined in the orthogonal coordinate system (X, Y, Z), and the position of the user intersects the cylindrical surface of the cylindrical coordinates C0. That is, the coordinates of the objects B1 to B4 respectively correspond to the coordinates of the objects A1 to A4 converted from the real three-dimensional coordinates to the cylindrical coordinates C0.
- the coordinate conversion of such an object is executed in the image management unit 312 and each object is stored in the memory 302 together with the coordinate position.
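The conversion described above, which takes the cylindrical coordinates of an object at the point where the user-to-target line crosses the cylindrical surface of radius R, might look like the following. The function name and its return convention (azimuth in degrees plus a height measured along the cylinder from eye level, downward positive) are illustrative, not the patent's exact formulation.

```python
import math

def target_to_cylinder_coords(user_xyz, target_xyz, radius_r):
    """Sketch of the conversion done in the image management unit 312.

    Axes as in FIG. 1: X north, Y east, Z gravity (down).
    The line from the user to target A is scaled so that its horizontal
    reach equals R, i.e. the point where it crosses the cylinder wall.
    Returns (theta_deg, height): azimuth from north (east positive) and
    the height of the crossing point relative to eye level (down positive).
    """
    dx = target_xyz[0] - user_xyz[0]
    dy = target_xyz[1] - user_xyz[1]
    dz = target_xyz[2] - user_xyz[2]
    horiz = math.hypot(dx, dy)
    theta = math.degrees(math.atan2(dy, dx)) % 360.0
    t = radius_r / horiz           # parameter where the line reaches the wall
    return theta, dz * t
```

For a target 10 m due north and 5 m below eye level, with R = 2 m, the object lands at θ = 0° and 1 m below eye level on the cylinder.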
- The coordinate position of each of the objects B1 to B4 may be set to any position within its display region, and may be set to one specific point per object (for example, the center position) or to two or more points (for example, two diagonal points, or the four corner points).
- When the coordinate positions of the objects B1 to B4 are associated with the intersections of the user's line of sight L looking at the objects A1 to A4 with the cylindrical coordinates C0, the user U visually recognizes the objects B1 to B4 at positions overlapping the objects A1 to A4.
- the coordinate positions of the objects B1 to B4 may be associated with an arbitrary position away from the intersection position.
- the objects B1 to B4 can be displayed or drawn at desired positions with respect to the objects A1 to A4.
- the coordinate determination unit 313 is configured to execute a process of determining which region on the cylindrical coordinates C0 corresponds to the field of view V of the user U based on the output of the detection unit 20. That is, the visual field V moves on the cylindrical coordinates C0 due to a change in the posture of the user U (display unit 10), and the movement direction and movement amount are calculated based on the output of the detection unit 20.
- the coordinate determination unit 313 calculates the movement direction and movement amount of the display unit 10 based on the output of the detection unit 20, and determines to which region on the cylindrical coordinates C0 the visual field V belongs.
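Determining which region of the cylindrical coordinates C0 the visual field V occupies, and whether an object falls inside it, can be sketched as follows. Names are illustrative; the azimuth wrap handles view regions that cross the 0° (north) direction, and the height axis follows the depression-positive convention with the region's reference point at its upper-left corner.

```python
def view_region(theta_v, h_v, width_deg, height_deg):
    """Region of cylinder C0 covered by the field of view V, given the
    reference point (theta_v, h_v) at the view's upper-left corner and
    the view's angular width and height (sketch)."""
    return (theta_v, theta_v + width_deg, h_v, h_v + height_deg)

def object_in_view(obj_theta, obj_h, region):
    """True if an object's (theta, h) falls inside the view region."""
    t0, t1, h0, h1 = region
    dt = (obj_theta - t0) % 360.0   # wrap so regions crossing 0 deg work
    return dt <= (t1 - t0) and h0 <= obj_h <= h1
```

For example, a 20°-wide view centered on north (reference azimuth 350°) contains an object at θ = 0° but not one at θ = 30°.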
- FIG. 8 is a development view of the cylindrical coordinates C0 conceptually showing the relationship between the visual field V on the cylindrical coordinates C0 and the objects B1 to B4.
- the field of view V is substantially rectangular and has xy coordinates (local coordinates) with the upper left corner as the origin OP2.
- the x axis is an axis extending in the horizontal direction from the origin OP2, and the y axis is an axis extending in the vertical direction from the origin OP2.
- the coordinate determination unit 313 is configured to execute processing for determining whether any of the objects B1 to B4 exists in the corresponding region of the visual field V.
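The determination above reduces to a rectangle-intersection test on the cylindrical surface, where the θ axis wraps around at 360°. A minimal Python sketch (the function names and the (θ_left, θ_right, h_bottom, h_top) tuple layout are illustrative assumptions, not the patent's API):

```python
def wrap_deg(a):
    """Normalize an angle to the range [0, 360)."""
    return a % 360.0

def spans_overlap_theta(a0, a1, b0, b1):
    """True if angular spans [a0, a1] and [b0, b1] (degrees) overlap,
    accounting for wrap-around at 360 degrees."""
    # Shift both spans so span A starts at 0, then test on the unwrapped line.
    wa = wrap_deg(a1 - a0)
    start = wrap_deg(b0 - a0)
    wb = wrap_deg(b1 - b0)
    return start <= wa or start + wb >= 360.0

def object_in_view(obj, view):
    """Hit-test an object's rectangular region against the field-of-view
    region, both given on cylindrical coordinates as
    (theta_left, theta_right, h_bottom, h_top)."""
    o_t0, o_t1, o_h0, o_h1 = obj
    v_t0, v_t1, v_h0, v_h1 = view
    theta_hit = spans_overlap_theta(v_t0, v_t1, o_t0, o_t1)
    h_hit = (o_h0 <= v_h1) and (o_h1 >= v_h0)
    return theta_hit and h_hit
```

The wrap-around handling matters because a field of view straddling θ = 0° must still detect objects on both sides of the seam.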
- The display control unit 314 is configured to execute, based on the output of the detection unit 20 (that is, the determination result of the coordinate determination unit 313), a process of displaying (drawing) in the field of view V the objects on the cylindrical coordinates C0 corresponding to the orientation of the display unit 10. For example, as shown in FIG. 8, when the current orientation of the visual field V overlaps the display areas of the objects B1 and B2 on the cylindrical coordinates C0, images corresponding to the overlapping areas B10 and B20 are displayed in the visual field V (local rendering).
- 9A and 9B are diagrams for explaining a conversion method from the cylindrical coordinates C0 (world coordinates) to the visual field V (local coordinates).
- The coordinates of the reference point of the field of view V on the cylindrical coordinates C0 are (θv, hv), and the coordinates of the reference point of the object B located in the region of the field of view V are (θ0, h0).
- The reference points of the field of view V and the object B may be set to any points; in this example, they are set at the upper-left corners of the rectangular field of view V and object B.
- αv [°] is the width angle of the visual field V in the world coordinates, and its value is determined by the design or specifications of the display unit 10.
- The display control unit 314 determines the display position of the object B in the field of view V by converting from the cylindrical coordinate system (θ, h) to the local coordinate system (x, y). As shown in FIG. 9B, where Hv and Wv are the height and width of the field of view V in the local coordinate system and (x0, y0) are the coordinates of the reference point of the object B in the local coordinate system (x, y), the conversion scales the angular offset (θ0 − θv) by Wv/αv to obtain x0, and scales the height offset (h0 − hv) correspondingly to obtain y0.
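The cylindrical-to-local conversion can be sketched as follows. This is a hedged illustration: the horizontal scale factor Wv/αv follows from the stated quantities, but the vertical scale (an assumed linear mapping over a height range `h_range`) and the signed wrapping of the angular offset are assumptions:

```python
def world_to_local(theta0, h0, theta_v, h_v, alpha_v, Wv, Hv, h_range):
    """Convert the reference point (theta0, h0) of object B on the
    cylindrical (world) coordinates into local coordinates (x0, y0) of
    the field of view V, whose reference point (upper-left corner) lies
    at (theta_v, h_v) on the cylinder.

    alpha_v : width angle of the field of view V [deg]
    Wv, Hv  : width and height of the field of view V in local units
    h_range : cylinder height range that maps onto Hv (assumed linear)
    """
    # Signed angular offset, wrapped into [-180, 180) degrees.
    dtheta = (theta0 - theta_v + 180.0) % 360.0 - 180.0
    x0 = dtheta * Wv / alpha_v        # angle -> local x
    y0 = (h0 - h_v) * Hv / h_range    # height -> local y
    return x0, y0
```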
- The display control unit 314 typically changes the display position of the object B within the field of view V following changes in the azimuth or posture of the display unit 10. This control continues as long as at least part of the object B exists in the field of view V.
- the display area tends to be narrowed with the miniaturization of the HMD.
- In a see-through type head mounted display, for example, it may be desirable to restrict the information display area while securing the see-through area.
- the HMD 100 of the present embodiment has an object display fixing function as described below.
- <Object display fixing function> (1) Introduction of the non-strict attribute
- The display control unit 314 is configured to be able to execute a process of moving an object within the field of view V according to the change in azimuth or posture when the azimuth or posture of the display unit 10 changes by a predetermined angle or more, and of fixing the display position of the object in the field of view V when the change is less than the predetermined angle.
- For this purpose, a non-strict attribute may be introduced into the object. That is, the object B is not fixed at one place in the world coordinate system (cylindrical coordinates C0); instead, while the direction in which the user U is looking stays within a certain angle range, the object B may be fixedly displayed in the local coordinate system (x, y) of the display unit 10.
- the visibility of the object can be improved by restricting the movement of the object due to an inadvertent posture change around the vertical axis or the horizontal axis of the user U.
- the predetermined angle may be an angle around the vertical axis (Z axis), an angle around the horizontal axis (X axis and / or Y axis), or both of them.
- The value of the predetermined angle can be set as appropriate, for example, to ±15°.
- the predetermined angle may be the same around the vertical axis (first predetermined angle) and the horizontal axis (second predetermined angle), or may be different from each other.
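The decision above can be summarized in a few lines. A minimal sketch, taking the ±15° figure from the text as the default for both the first (vertical-axis) and second (horizontal-axis) predetermined angles; the function name and return labels are illustrative assumptions:

```python
THRESH_YAW_DEG = 15.0    # first predetermined angle (around the vertical axis)
THRESH_PITCH_DEG = 15.0  # second predetermined angle (around the horizontal axis)

def object_mode(yaw_change, pitch_change,
                yaw_thresh=THRESH_YAW_DEG, pitch_thresh=THRESH_PITCH_DEG):
    """Decide whether a non-strict object stays fixed in the local
    coordinate system or follows the world coordinates, given the posture
    change (degrees) since the object was last anchored."""
    if abs(yaw_change) >= yaw_thresh or abs(pitch_change) >= pitch_thresh:
        return "follow_world"   # move the object with the posture change
    return "fixed_local"        # keep the display position in the view
```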
- the display control unit 314 is configured to be able to execute a process of moving the object B to a predetermined position in the field of view V when the output change of the detection unit 20 is not more than a predetermined value over a predetermined time.
- When the output of the detection unit 20 does not change over a predetermined time, it is highly likely that the user is referring to the object displayed in the field of view V; therefore, the visibility of the object may be improved by moving it to a predetermined position in the field of view V.
- the predetermined time is not particularly limited, and is set to about 5 seconds, for example.
- The predetermined position is not particularly limited, and is, for example, the center of the field of view V, a corner, or a position offset to the top, bottom, left, or right. Furthermore, the moved object may be emphasized, for example by being enlarged.
- Alternatively, when this condition is met, this function may fixedly display the object B at a predetermined position in the local coordinate system (x, y) of the field of view V. In this case, when the output of the detection unit 20 exceeds a predetermined value, the fixed display function of the object is canceled.
- The output value of the detection unit 20 referred to here may be the output change amount corresponding to a posture change of the predetermined angle or more around the predetermined axis of the display unit 10 described above, or some other output change amount.
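The "output unchanged over a predetermined time" trigger is essentially a dwell detector on the detection unit's output. A minimal sketch, assuming a scalar output sample, a tolerance standing in for the "predetermined value", and the roughly 5-second window mentioned in the text:

```python
class DwellDetector:
    """Track detection-unit output samples and report when the change stays
    at or below `tol` for `dwell_s` seconds (about 5 s in the text).
    The tolerance and sampling scheme are illustrative assumptions."""

    def __init__(self, tol=0.5, dwell_s=5.0):
        self.tol = tol
        self.dwell_s = dwell_s
        self.ref = None       # output value at the start of the still period
        self.ref_t = None     # time at the start of the still period

    def update(self, t, output):
        """Feed a (timestamp, output) sample; return True when the object
        should be moved to (or fixed at) the predetermined position."""
        if self.ref is None or abs(output - self.ref) > self.tol:
            self.ref, self.ref_t = output, t   # movement: restart the timer
            return False
        return (t - self.ref_t) >= self.dwell_s
```

Once `update` returns True, the caller would move or fixedly display the object; a later sample outside the tolerance restarts the timer, which models the cancel condition above.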
- The display control unit 314 is also configured to be able to execute a process of moving an object to a predetermined position in the field of view V upon detecting the input of a predetermined signal generated by an operation of the user U. With such a configuration as well, the visibility of the object can be improved as described above, and the display of the image can be controlled in accordance with the user's intention.
- For example, the object is fixed to the local coordinate system (x, y) of the field of view V by performing a predetermined input operation on the input operation unit 305 or the portable information terminal 200 with the object aligned with the center of the field of view V. Then, by operating the input operation unit 305 again, the object returns to the world coordinate system, and the fixed display function of the object is released.
- The display control unit 314 is configured to be able to execute, when an object is displayed at a predetermined position in the field of view V, a process of invalidating frequency components of the output of the detection unit 20 that are equal to or higher than a predetermined frequency.
- If the object in the field of view V simply followed every change in the azimuth or posture of the display unit 10, it would also follow small face shakes of the user U, which may deteriorate the visibility of the object. Therefore, the object is not made to follow posture changes of the display unit 10 for frequency components equal to or higher than the predetermined frequency, while the display position of the object follows the field of view V (local coordinate system) for frequency components lower than the predetermined frequency.
- As the predetermined frequency, for example, a frequency corresponding to the user's face shake is set. Thereby, the visibility of the image can be ensured without being affected by fine face shakes of the user.
- V1 represents the local coordinate system at a certain point in time, V2 represents the image stabilization coordinate system corresponding to V1, and OP and OP′ indicate the origins of V1 and V2, respectively.
- PD control is a kind of feedback control that combines proportional control and differential control to make a quantity converge to a set value.
- A point in the local coordinate system V1 at a certain time t is defined as (x(t), y(t)), and the corresponding point in the image stabilization coordinate system V2 as (x′(t), y′(t)). The point of the local coordinate system V1 one sampling period (Δt) earlier is (x(t−Δt), y(t−Δt)), and the corresponding point of the image stabilization coordinate system V2 is (x′(t−Δt), y′(t−Δt)).
- The differences Δp(t) and Δq(t) between corresponding points are expressed as: Δp(t) = Px × Δx(t) + Dx × Δvx(t) … (7), Δq(t) = Py × Δy(t) + Dy × Δvy(t) … (8).
- Px and Py are proportional gain constants for x and y,
- Dx and Dy are velocity gain constants for x and y.
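Equations (7) and (8) can be exercised with a small follower in which the stabilization point (x′, y′) chases the local point (x, y). The gain values, time step, and the exact definitions of the difference terms Δx(t) and Δvx(t) are illustrative assumptions; the point is only that a PD follower with modest gains damps high-frequency shake:

```python
class PDStabilizer:
    """Follow the local-coordinate point (x, y) with an image-stabilization
    point (x', y') using the PD law of equations (7) and (8):
        dp(t) = Px*dx(t) + Dx*dvx(t),  dq(t) = Py*dy(t) + Dy*dvy(t)
    Gains and difference-term definitions are illustrative assumptions."""

    def __init__(self, Px=0.3, Dx=0.1, Py=0.3, Dy=0.1):
        self.Px, self.Dx, self.Py, self.Dy = Px, Dx, Py, Dy
        self.xs = self.ys = 0.0          # stabilized point (x', y')
        self.prev_dx = self.prev_dy = 0.0

    def step(self, x, y, dt=1.0):
        dx, dy = x - self.xs, y - self.ys        # position error
        dvx = (dx - self.prev_dx) / dt           # error velocity
        dvy = (dy - self.prev_dy) / dt
        self.xs += self.Px * dx + self.Dx * dvx  # dp(t), eq. (7)
        self.ys += self.Py * dy + self.Dy * dvy  # dq(t), eq. (8)
        self.prev_dx, self.prev_dy = dx, dy
        return self.xs, self.ys
```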
- the image stabilization coordinate system V1 ′ does not follow the rotation component (FIG. 10B). That is, the tilt of the object is restricted even when the face is tilted about the axis in the front-rear direction of the user.
- the object display fixing functions (1) to (4) described above may be applied individually, or may be applied by appropriately combining the functions. For example, a combination of any one of the above (1) to (3) and the above (4) is applicable.
- the present embodiment has a world coordinate system area restriction function for the purpose of improving the searchability of objects.
- the coordinate setting unit 311 can limit the region (H) along the Z-axis direction in the cylindrical coordinates C0 surrounding the display unit 10 according to the region (Hv) in the height direction of the visual field V. It has a function as an area limiter (see FIG. 5A). By limiting the height H of the cylindrical coordinates C0, it is possible to improve the searchability and visibility of images in the user's horizontal visual field.
- the amount of restriction in the height direction of the cylindrical coordinates C0 is not particularly limited, and in this embodiment, the height of the cylindrical coordinates C0 is limited to the same height (H1) as the height Hv of the visual field V.
- The display control unit 314 is configured to be able to display the objects B1 to B4 in the field of view V by changing at least the h coordinate of the cylindrical coordinate system (θ, h) so that the objects B1 to B4 are positioned within the area-restricted cylindrical coordinates C0.
- FIGS. 11A and 11B are schematic diagrams showing the relative positional relationship between the field of view V and the objects B1 to B4 associated with the cylindrical coordinates C1 whose area is limited to the height H1. Since the user U can visually recognize the objects B1 to B4 associated with all azimuths simply by changing the posture around the Z axis (vertical axis), the searchability of the objects B1 to B4 is dramatically improved.
- all the objects B1 to B4 are arranged in the cylindrical coordinates C1, but the present invention is not limited to this, and at least one object may be arranged in the cylindrical coordinates C1 as necessary. Further, the heights of the objects B1 to B4 arranged at the cylindrical coordinates C1 are not particularly limited and can be arbitrarily set.
- the entire objects B1 to B4 are arranged in the cylindrical coordinates C1, but at least a part of the objects B1 to B4 may be displayed in the field of view V.
- the image existing in a certain direction can be easily recognized.
- the height H1 of the cylindrical coordinates C1 may be configured to be changeable to a height higher than this by an input operation by the user U to the input operation unit 305 or the like. Thereby, the whole object can be visually recognized.
- Whether to enable or disable the area restriction function described above may be configured to be selectable by setting by the user U.
- The HMD 100 according to the present embodiment is normally set to a mode in which the area restriction function, with the world coordinate system as the cylindrical coordinates C1, is enabled; by the user's voluntary setting change, the area restriction can be modified (for example, the height H changed) or switched to a disabled state.
- When the control unit 30 detects the input of a predetermined signal generated by an operation of the user U, it may execute processing for limiting the height-direction area of the cylindrical coordinates according to the height-direction area (Hv) of the field of view V and for aligning all objects displayed in the field of view V to the same height.
- That is, when the area restriction function is disabled, or when the world coordinate system is set to cylindrical coordinates other than the cylindrical coordinates C1, the world coordinate system is forcibly switched to the cylindrical coordinates C1 by an input operation of the user U to the input operation unit 305 or the like.
- Further, the objects B1 to B4 are arranged in the cylindrical coordinates C1 at positions where all of them are displayed at the same height in the field of view V, as shown in FIG. 11B. This further improves the visibility of the images displayed in the field of view.
- the portable information terminal 200 is used to transmit object data to the control unit 30.
- The portable information terminal 200 includes a position information acquisition unit 207 for measuring the position of the user U (display unit 10), and an image acquisition unit, including the transmission/reception unit 203 and the like, capable of acquiring from the server N or the like the plurality of objects (B1 to B4) to be stored in the memory 302 of the control unit 30.
- The control unit 30 requests the portable information terminal 200 to transmit one or more items of object data selected from the plurality of items of object data, and the portable information terminal 200 transmits the requested object data to the control unit 30.
- the control unit 30 (in this example, the image management unit 312) is configured as follows in order to avoid the communication speed and latency problems.
- control unit 30 is configured to acquire a plurality of necessary object data from the portable information terminal 200 in advance. Thereby, the drawing timing of the object in the visual field V can be controlled on the control unit 30 side, and a necessary object can be provided to the user U at an appropriate timing regardless of the communication environment.
- control unit 30 is configured to request the portable information terminal 200 to preferentially transmit an object associated with a coordinate position closer to the display area of the visual field V on the cylindrical coordinate C0. By preferentially acquiring object data that is likely to be presented to the visual field V in this way, it is possible to prevent a delay in displaying an object in the visual field V.
- The image management unit 312 is configured to first set one or a plurality of frames corresponding to the arrangement positions of the objects on the world coordinates, and then execute a process of arranging objects in the frames in order of priority. Here, “placing a frame or object on world coordinates” means associating the frame or object with a position on the world coordinates.
- FIG. 12A, FIG. 12B, and FIG. 13 show a procedure for arranging the objects B3 and B4 on the cylindrical coordinates C1 whose area is limited to the height H1.
- the following procedure is also applicable to a world coordinate system composed of cylindrical coordinates C0 or other three-dimensional coordinates that are not limited in area.
- Both the image data (object data) of each object and the frame data that determines the coordinate position of the object are transmitted from the portable information terminal 200 to the control unit 30. Since frame data is much smaller than object data, it takes little time to acquire compared with object data. For this reason, communication for acquiring frame data is performed first, and communication for acquiring object data is performed later in order of priority.
- the portable information terminal 200 confirms whether or not it is necessary to transmit the frame F3 for arranging the object B3 to the control unit 30 (step 101).
- If necessary, the control unit 30 requests the portable information terminal 200 to transmit the frame F3 (step 102).
- the control unit 30 stores the received frame F3 in the memory 302, thereby arranging the frame F3 at a corresponding position on the cylindrical coordinate C1.
- Similarly, the portable information terminal 200 confirms whether it is necessary to transmit the frame F4 for arranging the object B4 to the control unit 30 (step 103), and the control unit 30 requests the portable information terminal 200 to transmit the frame F4 (step 104).
- the control unit 30 stores the received frame F4 in the memory 302, thereby arranging the frame F4 at a corresponding position on the cylindrical coordinate C1.
- the portable information terminal 200 notifies the control unit 30 of permission to transmit object data (step 105).
- The control unit 30 proceeds to the data acquisition phase with the object data transmission permission notification as a trigger. Specifically, for example, based on the output of the detection unit 20, the control unit 30 determines the frame closest to the current orientation of the field of view V (display unit 10) (the frame F4 in this example) and requests transmission of the image data of the object belonging to that frame (the object B4 in this example) (step 106). In response to this request, the portable information terminal 200 transmits the image data of the object B4 to the control unit 30 (step 107). The control unit 30 stores the received image data of the object B4 in the memory 302, thereby arranging the object B4 in the frame F4 on the cylindrical coordinates C1.
- the control unit 30 determines a frame (frame F3 in this example) closest to the direction of the field of view V next to the frame F4, and requests transmission of image data of an object (object B3 in this example) belonging to the frame. (Step 108).
- the portable information terminal 200 transmits the image data of the object B3 to the control unit 30 (step 109).
- the control unit 30 stores the received image data of the object B3 in the memory 302, thereby arranging the object B3 in the frame F3 on the cylindrical coordinates C1.
- By registering the frame data of the objects in advance on the cylindrical coordinates C1, the control unit 30 can determine the acquisition priority of the objects based on the current field of view V, and acquires image data sequentially, starting from the object with the highest priority (closest to the field of view V).
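The frames-first, nearest-first acquisition order can be sketched as a sort by angular distance from the current view azimuth. The dictionary layout and helper names are illustrative assumptions; the text only fixes the policy (frame data first, then object data in order of proximity to the field of view):

```python
def angular_distance(a, b):
    """Shortest angular distance in degrees on a 360-degree circle."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def fetch_order(frames, view_theta):
    """Given pre-registered frames {name: theta_deg} on the cylinder,
    return the order in which to request object image data: the frame
    nearest to the current view azimuth first."""
    return sorted(frames, key=lambda f: angular_distance(frames[f], view_theta))
```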
- For an animation image, the control unit 30 is configured to request the portable information terminal 200 to collectively transmit at least a part of all the images constituting the animation. In this way, even an object that is an animation image can be handled dynamically by caching the required number of images (for example, the images up to one second ahead) in consideration of the frame rate.
- The control unit 30 may be configured to periodically evaluate the distance between the coordinate position of each object stored in the memory 302 and the display area of the field of view V, and to delete from the memory 302 the object at the coordinate position farthest from the display area of the field of view V. Specifically, the priorities of all the objects are evaluated based on the relative positional relationship between all the objects on the cylindrical coordinates C1 and the current azimuth of the field of view V, and object data with low priority is deleted. A storage area for object data close to the field of view V can thereby be secured.
- the priority evaluation method is not particularly limited, and can be evaluated based on, for example, the number of pixels between the center position of the visual field V and the center position of the object in the cylindrical coordinates C1.
- the evaluation value may be multiplied by a coefficient based on the reproduction time.
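The periodic eviction can be sketched the same way: score every cached object by its distance from the current view and drop the lowest-priority entries. Azimuth distance in degrees stands in here for the pixel-distance metric mentioned above, and the `keep` budget stands in for the memory-capacity check; both are illustrative assumptions:

```python
def evict_lowest_priority(objects, view_theta, keep):
    """Drop object data farthest (in azimuth) from the current view until
    at most `keep` entries remain, mimicking the periodic priority
    evaluation. `objects` maps name -> theta_deg on the cylinder."""
    def dist(name):
        d = abs(objects[name] - view_theta) % 360.0
        return min(d, 360.0 - d)
    survivors = sorted(objects, key=dist)[:keep]
    return {n: objects[n] for n in survivors}
```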
- FIG. 14 is a flowchart for explaining an outline of the operation of the HMD system according to the present embodiment.
- the current position of the user U is measured using the position information acquisition unit 207 of the portable information terminal 200 (step 201).
- the position information of the display unit 10 is transmitted to the server N.
- the portable information terminal 200 acquires object data related to a predetermined object existing in the real space around the user U from the server N (step 202).
- the control unit 30 sets the height (H) and radius (R) of the cylindrical coordinates C0 as the world coordinate system according to the type of object data and the like (step 203).
- At this time, the coordinate setting unit 311 sets the world coordinate system to, for example, the cylindrical coordinates C1 illustrated in FIG. 12A.
- control unit 30 detects the azimuth of the visual field V based on the output of the detection unit 20 (step 204), acquires object data from the portable information terminal 200, and stores it in the memory 302 (step 205).
- FIG. 15 is a flowchart showing an example of a procedure for receiving object data by the control unit 30.
- After receiving the object data transmission permission confirmation from the portable information terminal 200 (step 301), the control unit 30 determines whether the frame registration of all objects is completed (step 302). This is because, if the frame registration of all objects is not completed, the coordinate positions of the objects cannot be determined and their priorities cannot be evaluated. If the frame registration is incomplete, the process is terminated, and the frame registration process described above is executed for the incomplete frames.
- When the frame registration of all the objects is completed, the presence or absence of unreceived objects and the capacity of the memory 302 are confirmed (step 303). If there is an unregistered object and the memory capacity is sufficient, the unregistered object is received and stored in the memory 302 (step 304).
- The control unit 30 periodically evaluates the priority of the objects in the memory 302, and deletes those with low evaluation values as necessary.
- The control unit 30 displays (draws) the object at the corresponding position of the field of view V via the display unit 10 (step 206).
- any of the above-described object display fixing functions may be applied.
- FIG. 16 is a flowchart illustrating an example of a procedure for drawing an object on the field of view V by the control unit 30.
- the control unit 30 calculates the current orientation of the visual field V based on the output of the detection unit 20 (step 401).
- The orientation of the field of view V is converted into the world coordinate system (θ, h), and which position on the cylindrical coordinates C0 it corresponds to is monitored.
- control unit 30 determines whether or not there is an object for which scanning (processing after step 403) has not been completed among all objects stored in the memory 302 (step 402).
- the scan is performed on all objects stored in the memory 302 every time the screen is updated.
- If there is an unscanned object, it is determined whether the object is a world coordinate system object (step 403); if “No”, the object is drawn in the field of view V (step 404).
- If “Yes” in step 403, it is next determined whether any of the object display fixing functions (for example, the first grab function) is applied to the object (step 405). If such a function is applied, the object is fixedly displayed in the field of view V when a predetermined condition is satisfied (step 406). If no display fixing function is applied, the object is drawn in the field of view V when the field of view V reaches the object position (step 407).
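Steps 402 to 407 amount to one pass over the cached objects per screen update. A minimal sketch, with the dictionary keys (`world`, `fixed`, `theta`) and the in-view half-width as illustrative assumptions; step 406's "predetermined condition" is taken as already satisfied for fixed objects:

```python
def scan_and_draw(objects, view_theta, half_width=20.0):
    """One screen update of the drawing scan (steps 402-407). Each cached
    object is visited once: a local-coordinate object is always drawn; a
    world-coordinate object with a display-fixing function applied is drawn
    at its fixed position; any other world-coordinate object is drawn only
    when its azimuth falls inside the current view."""
    drawn = []
    for name, obj in objects.items():
        if not obj.get("world", True):      # step 403 -> "No": local object
            drawn.append(name)              # step 404
        elif obj.get("fixed", False):       # step 405: fixing function applied
            drawn.append(name)              # step 406
        else:                               # step 407: draw only if in view
            d = abs(obj["theta"] - view_theta) % 360.0
            if min(d, 360.0 - d) <= half_width:
                drawn.append(name)
    return drawn
```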
- FIG. 17 is a diagram illustrating examples of the yaw angle, the pitch angle, and the roll angle.
- An object (hereinafter also referred to as an “AR object”) can be provided to the user's field of view according to at least one of the azimuth RY (hereinafter also referred to as the “yaw angle”) around the vertical axis (Z axis) of the display unit 10 and the depression or elevation angle RP (hereinafter also referred to as the “pitch angle”) around the left-right axis (Y axis) of the display unit 10.
- the object may be associated with real world coordinates (may be associated with objects around the display unit 10).
- The display control unit 314 may also make the position and orientation of the AR object with respect to the user's field of view depend on the angle RO (hereinafter also referred to as the “roll angle”) around the front-rear axis (X axis) of the display unit 10.
- FIG. 18 is a diagram for explaining an example in which the position and orientation of the AR object with respect to the user's field of view depend on the roll angle.
- the display control unit 314 displays the AR objects A and B in the user's visual field V2-1.
- the left and right central axis Da of the HMD 100 is horizontal.
- the horizontal axis Ha is a horizontal axis orthogonal to the horizontal line of sight Lh of the user U (see FIG. 1).
- the display control unit 314 may make the positions and orientations of the AR objects A and B with respect to the visual field V2-2 depend on the roll angle RO.
- the display control unit 314 may rotate the AR objects A and B by the same amount as the roll angle RO in the direction opposite to the roll angle RO and display the AR objects A and B in the field of view V2-2.
- the rotation axis may be an intersection of the left / right central axis Da and the horizontal axis Ha.
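The counter-rotation about the intersection of the left-right central axis Da and the horizontal axis Ha is a plain 2D rotation by −RO. A minimal sketch (the screen-axis conventions and the function name are assumptions):

```python
import math

def counter_rotate(px, py, roll_deg, cx=0.0, cy=0.0):
    """Rotate an AR object's position (px, py) by the roll angle in the
    opposite direction, about the rotation center (cx, cy) -- in the text,
    the intersection of the axes Da and Ha."""
    a = math.radians(-roll_deg)          # opposite direction, same amount
    dx, dy = px - cx, py - cy
    rx = cx + dx * math.cos(a) - dy * math.sin(a)
    ry = cy + dx * math.sin(a) + dy * math.cos(a)
    return rx, ry
```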
- Various scenes can be assumed as scenes suited to the method in which the position and orientation of the AR object with respect to the user's field of view depend on the roll angle (hereinafter also simply referred to as “roll angle dependent”).
- FIG. 19A is a diagram for explaining an example of a suitable scene with roll angle dependency.
- the display control unit 314 displays the AR object B3-1 that is a three-dimensional object in the visual field V3-1.
- When the roll angle changes, the display control unit 314 may rotate the AR object B3-1 in the field of view V3-1 by the same amount as the roll angle in the direction opposite to the roll angle and display it in the field of view V3-2.
- When displaying a three-dimensional object, the display control unit 314 preferably displays the AR object depending on the roll angle, giving priority to expressing the reality of the AR object (that is, to accurately representing the position and orientation of the AR object with respect to the field of view).
- The three-dimensional object can include, for example, a three-dimensional model with left-right parallax added, or a three-dimensional object defined by orthographic views (three views).
- FIG. 19B is a diagram for explaining another example of a scene suitable for roll angle dependency.
- The display control unit 314 displays, in the visual field V4-1, AR objects (AR objects B4-1 to B4-3) existing in an area where the density of displayed objects exceeds a threshold value.
- When the roll angle changes, the display control unit 314 may rotate the AR objects B4-1 to B4-3 in the field of view V4-1 by the same amount as the roll angle in the direction opposite to the roll angle and display them in the field of view V4-2.
- When displaying AR objects existing in an area where the density of displayed objects exceeds a threshold value (the AR objects B4-1 to B4-3), the display control unit 314 preferably displays the AR objects depending on the roll angle, giving priority to accurately expressing the positions of the AR objects with respect to the field of view. Other AR objects for which accurately expressing the position with respect to the field of view is important may also be displayed with roll angle dependency.
- As described above, the position and orientation of the AR object relative to the user's field of view may depend on the roll angle. In FIG. 18, however, a part of each of the AR objects A and B has deviated from the user's visual field V2-2, and the searchability of the AR objects A and B has deteriorated. Therefore, in the following, techniques that can improve the searchability of AR objects displayed in the user's field of view are mainly proposed.
- the display control unit 314 may not make the position and orientation of the AR object with respect to the user's field of view depend on the roll angle.
- FIG. 20 is a diagram for explaining an example in which the position and orientation of the AR object with respect to the user's field of view do not depend on the roll angle.
- the display control unit 314 displays the AR objects A and B in the user's visual field V6-1.
- the visual field V6-1 when the user does not tilt his / her neck (when the roll angle is zero), the left and right central axis Da of the HMD 100 is horizontal.
- Even when the roll angle RO changes, the display control unit 314 need not make the positions and orientations of the AR objects A and B with respect to the visual field V6-2 depend on the roll angle RO.
- the display control unit 314 may make the positions and orientations of the AR objects A and B with respect to the visual field V6-2 unchanged before and after the roll angle changes.
- Various scenes can be assumed as scenes suited to the method in which the position and orientation of the AR object with respect to the user's field of view do not depend on the roll angle (hereinafter also simply referred to as “roll angle independent”). Note that even in such scenes, the orientation of the AR object with respect to the user's field of view may still be made to depend on the roll angle.
- FIG. 21 is a diagram for explaining an example of a scene in which no dependency on the roll angle is suitable.
- the display control unit 314 displays the AR objects C and D in the visual field V7-1.
- the user wants to select the AR object B and the user rotates his / her head (assuming that the display unit 10 is rotated around the vertical axis).
- the AR objects A to C are displayed in the visual field V7-2.
- The display control unit 314 need not make the position of an AR object in the field of view depend on the roll angle when the displayed AR object can accept an operation by the user (for example, a button pressing operation). Since this reduces the possibility that the AR object goes out of the field of view, the operability of the AR object can be improved.
- the AR object that can accept an operation by the user is not particularly limited.
- For example, the AR object may be one that can accept a pressing operation by the user, or may constitute a part of a menu screen on which one or a plurality of AR objects are arranged.
- FIG. 22 is a diagram for explaining the searchability of an AR object when the AR object display method has roll angle dependency.
- FIG. 23 is a diagram for explaining the searchability of an AR object when the AR object display method has no roll angle dependency.
- the display control unit 314 cannot yet display the AR object A in the user's visual field V8-1. Subsequently, the display control unit 314 displays the AR object A in the visual field V8-2, but cannot display the AR object A in the subsequent visual field V8-3.
- the display control unit 314 can display the AR object A in the user's visual field V9-1.
- the display control unit 314 can also display the AR object A in the visual field V9-2, and can also display the AR object A in the subsequent visual field V9-3.
- Without roll angle dependency, the AR object A thus stays in the field of view for a longer time than when the display method has roll angle dependency, making it possible to improve the searchability of the AR object A.
- FIG. 24 is a diagram for explaining an example of a method for limiting the rotation of the AR object according to the situation when the display method of the AR object has roll angle dependency.
- the display control unit 314 displays the AR objects A and B in the user's visual field V10-1.
- In the field of view V10-1, when the user does not tilt his or her head (when the roll angle is zero), the left-right central axis Da of the HMD 100 is horizontal.
- When the user tilts the head by the roll angle RO, the display control unit 314 may rotate the AR objects A and B in the direction opposite to the roll angle RO by the same magnitude as RO before displaying them in the field of view V10-2.
- However, once the roll limit angle Rr is exceeded, rotating the AR objects A and B in the same way may lead to a situation in which they do not fit in the visual field V10-3.
- In such a case, the display control unit 314 may rotate the positions of the AR objects A and B by only the same amount as the roll limit angle Rr in the direction opposite to the roll angle RO. Doing so prevents a situation in which the AR objects A and B do not fit in the visual field V10-3, so a decrease in their searchability can be suppressed.
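Purely as an illustration (not part of the patent), the clamping of the counter-rotation at the roll limit angle Rr described above can be sketched in Python; the function names and the degree-based convention are assumptions.

```python
import math

def counter_roll_angle(roll_deg: float, roll_limit_deg: float) -> float:
    """Rotation applied to AR objects: opposite in sign to the head roll RO,
    and clamped in magnitude to the roll limit angle Rr."""
    sign = -1.0 if roll_deg >= 0 else 1.0
    return sign * min(abs(roll_deg), roll_limit_deg)

def rotate_point(x: float, y: float, angle_deg: float) -> tuple:
    """Rotate an AR object's position in the field of view about its center."""
    a = math.radians(angle_deg)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))
```

While the roll angle stays below Rr the objects are counter-rotated by the full roll angle; beyond Rr the rotation saturates, keeping the objects inside the field of view.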
- FIG. 25 is a diagram for explaining a detailed example of a method for limiting the rotation of the AR object depending on the situation when the display method of the AR object has roll angle dependency.
- the display control unit 314 displays the AR object B11 (AR object in which the station name is described) on the horizontal axis Ha of the user's visual field V11-1.
- In the visual field V11-1, when the user does not tilt his or her head (when the roll angle is zero), the left-right central axis Da of the HMD 100 is horizontal.
- When the user tilts the head, the left-right central axis Da of the HMD 100 is tilted with respect to the horizontal axis Ha; here, assume that the roll angle RO exceeds the roll limit angle Rr.
- If the AR object B11 is rotated by the same amount as the roll angle RO in the direction opposite to the roll angle RO, the AR object B11 moves from the position Wa to the position Pa, and a situation may occur in which the AR object B11 does not fit in the visual field.
- the display control unit 314 may rotate the position of the AR object B11 by the same amount as the roll limit angle Rr in the direction opposite to the roll angle RO. By doing so, it is possible to prevent a situation where the AR object B11 does not fit in the visual field V11-3, and it is possible to suppress a decrease in searchability of the AR object B11.
- FIG. 26 is a diagram for explaining a case where the function of limiting the visual field region is combined with no roll angle dependency.
- Assume that the user looks along the horizontal axis Ha while tilting the head, and then rotates the head to the right (field of view V12-1 to field of view V12-4).
- In this case, the AR object A moves from one end of the visual field to the other: without roll angle dependency, the AR object moves from one end of the visual field to the other while the user rotates the head 360 degrees around the vertical axis. The searchability of the AR object is therefore expected to improve further.
- Here, the height of the cylindrical coordinates is set to be the same as the height of the visual fields V12-1 to V12-4, and the three-dimensional coordinates corresponding to the AR object A are assumed to be set on the horizontal plane containing the horizontal axis Ha.
- the display positions of the AR object A with the roll angle dependency are shown as positions Pa-1 to Pa-4.
- FIG. 27 is a diagram for explaining the details when the function of limiting the visual field region is combined with no roll angle dependency.
- the display control unit 314 displays an AR object B13 (AR object in which a station name is described) on the horizontal axis Ha of the user's visual field V13-1.
- In the visual field V13-1, when the user does not tilt his or her head (when the roll angle is zero), the left-right central axis Da of the HMD 100 is horizontal.
- Assume that the left-right central axis Da of the HMD 100 then moves upward and that the magnitude of the movement MO of the left-right central axis Da exceeds the magnitude of the predetermined movement Mr.
- If the AR object B13 is moved in the direction opposite to the movement MO of the left-right central axis Da by the same amount as MO, the AR object B13 moves from the position Wa to the position Pa; therefore, a situation may occur in which the AR object B13 does not fit in the visual field V13-2.
- In such a case, the display control unit 314 may move the position of the AR object B13 by only the same amount as the predetermined movement Mr in the direction opposite to the movement MO of the left-right central axis Da. Doing so prevents a situation in which the AR object B13 does not fit in the visual field V13-2, so a decrease in its searchability can be suppressed.
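The height-direction counterpart of the clamping rule above can be sketched the same way; this is an illustrative fragment with an assumed name, not the patent's implementation.

```python
def counter_offset(movement_mo: float, movement_mr: float) -> float:
    """Vertical offset applied to the AR object: opposite in sign to the
    movement MO of the left-right central axis Da, and clamped in magnitude
    to the predetermined movement Mr."""
    sign = -1.0 if movement_mo >= 0 else 1.0
    return sign * min(abs(movement_mo), movement_mr)
```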
- In FIG. 27, the AR object B13 lies on the line Ln, which has been moved in the direction opposite to the movement MO of the left-right central axis Da by the same magnitude as the predetermined movement Mr.
- Note that the display control unit 314 may gradually move the position of the AR object in the field of view from the position rotated in the direction opposite to the roll angle by the same amount as the roll angle (the position on the horizontal axis) to the position rotated in the direction opposite to the roll angle by the same magnitude as the roll limit angle. In this way, it is possible to prompt the user to make a movement that brings the roll angle to zero (a movement that makes the left-right central axis Da of the HMD 100 parallel to the horizontal axis Ha).
- FIG. 28 is a diagram for explaining a display example of the AR object when the roll angle exceeds the roll limit angle with the roll angle dependency.
- In FIG. 28, the AR object is modeled as an object Xa. The first spring SP connects the object Xa to the horizontal axis Ha (the position Pa of the AR object with roll angle dependency), and the second spring SP connects the object Xa to the left-right central axis Da (the position Wa of the AR object without roll angle dependency). In such a case, the position at which the object Xa is statically balanced may be calculated as the position where the AR object converges.
- At this time, the spring constants of the two springs SP may be the same or different.
- Alternatively, a spring and a damper may be used instead of the two springs SP. In that case, the position where the AR object converges may be controlled by PD control using the spring and the damper.
- Alternatively, the result of the AR object being pulled (or pushed) in the roll direction by the left-right central axis Da, according to a dynamic model in which the AR object is connected to the left-right central axis Da of the HMD 100 via a spring, may be calculated as the position where the AR object converges.
- Alternatively, an interior division point, determined by a predetermined ratio, of the line segment connecting the AR object position Pa with roll angle dependency and the AR object position Wa without roll angle dependency may be calculated as the position where the AR object converges.
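The three convergence rules described above (static balance of two springs, PD control with a spring and a damper, and an interior division point of the segment Pa-Wa) can be sketched in one dimension; the names and gains are illustrative assumptions, not the patent's parameters.

```python
def spring_equilibrium(pa: float, wa: float, k1: float, k2: float) -> float:
    """Static balance of object Xa pulled by spring k1 toward Pa (position
    with roll angle dependency) and spring k2 toward Wa (position without)."""
    return (k1 * pa + k2 * wa) / (k1 + k2)

def interior_division(pa: float, wa: float, ratio: float) -> float:
    """Interior division point of the segment Pa-Wa at a predetermined ratio."""
    return pa + (wa - pa) * ratio

def pd_step(x: float, v: float, target: float,
            kp: float, kd: float, dt: float) -> tuple:
    """One step of PD control (spring + damper) moving x toward the target."""
    a = kp * (target - x) - kd * v  # spring pull minus damping
    v += a * dt
    x += v * dt
    return x, v
```

Iterating `pd_step` makes the displayed position settle gradually on the target instead of jumping, which matches the gradual-convergence behavior described above.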
- Note that when the AR object is outside the field of view, the position of the AR object may be moved immediately into the field of view rather than converged gradually.
- Similarly, the display control unit 314 may gradually converge the position of the AR object in the field of view from the position moved in the direction opposite to the movement of the left-right central axis by the same amount as that movement (the position on the horizontal axis) to the position moved in the direction opposite to the movement of the left-right central axis by the same amount as the predetermined movement. In this way, it is possible to prompt the user to make a movement that brings the pitch angle to zero (a movement that makes the left-right central axis Da of the HMD 100 coincide with the horizontal axis Ha).
- Also in the height direction, assume that the first spring connects the object to the horizontal axis Ha (the position of the AR object when the visual field area in the height direction is limited) and that the second spring connects the object to the left-right central axis Da (the position of the AR object when the visual field area in the height direction is not limited). In such a case, the position at which the object is statically balanced may be calculated as the position where the AR object converges. At this time, the spring constants of the two springs may be the same or different.
- the position where the AR object converges may be controlled by PD control using a spring and a damper.
- Alternatively, the result of the AR object being pulled (or pushed) in the height direction by the left-right central axis of the HMD 100, according to a dynamic model in which the AR object is connected to the left-right central axis of the HMD 100 via a spring, may be calculated as the position where the AR object converges.
- Alternatively, an interior division point, determined by a predetermined ratio, of the line segment connecting the position of the AR object when the visual field area in the height direction is limited and the position of the AR object when the visual field area in the height direction is not limited may be calculated as the position where the AR object converges.
- Note that also in this case, when the AR object is outside the field of view, the position of the AR object may be moved immediately into the field of view rather than converged gradually.
- Whether the position of the AR object converges gradually as described above may be set as a dependent attribute of the roll angle dependency. The dependent attribute is referred to by the application's functions, so drawing processing of the AR object according to the dependent attribute can be performed.
- the dependent attribute may be set in any way, but may be set for each AR object by, for example, an application developer.
- FIG. 29 is a diagram for explaining an example in which the direction of the AR object depends on the roll angle when the display method of the AR object has no roll angle dependency.
- the display control unit 314 displays the AR objects A and B in the user's visual field V15-1.
- the visual field V15-1 when the user does not tilt his / her neck (when the roll angle is zero), the left and right central axis Da of the HMD 100 is horizontal.
- In such a case, the display control unit 314 may make the direction of the AR objects A and B with respect to the visual field V15-2 depend on the roll angle RO while keeping their positions with respect to the visual field V15-2 independent of the roll angle RO.
- Specifically, as shown in FIG. 29, the display control unit 314 may leave the positions of the AR objects A and B with respect to the visual field V15-2 unchanged before and after the roll angle changes, while rotating their directions with respect to the visual field V15-2 by the same amount as the roll angle in the direction opposite to the roll angle before displaying them in the visual field V15-2.
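The position-fixed, orientation-counter-rotated placement described above can be sketched as follows; the function name and return structure are illustrative assumptions.

```python
def place_in_view(position: tuple, roll_deg: float) -> dict:
    """Without roll angle dependency for position: the AR object keeps the
    same position in the field of view, while its direction is rotated
    opposite to the roll angle RO by the same amount."""
    return {"position": position, "orientation_deg": -roll_deg}
```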
- FIG. 30 is a diagram for describing a detailed example in which the orientation of the AR object with respect to the user's field of view does not depend on the roll angle when the display method of the AR object has no roll angle dependency.
- In FIG. 30, the display control unit 314 displays, in the field of view V16-1, an AR object that is matched against a real object (for example, an AR object B16-1 illustrating the work procedure for the work target T16).
- When the user tilts the head, the display control unit 314 may rotate the AR object B16-1 in the field of view V16-1 by the same amount as the roll angle in the direction opposite to the roll angle and display it in the field of view V16-2.
- Whether the direction of the AR object depends on the roll angle as described above may be set as a dependent attribute of the absence of roll angle dependency.
- The dependent attribute is referred to by the application's functions, so drawing processing of the AR object according to the dependent attribute can be performed.
- the dependent attribute may be set in any way, but may be set for each AR object by, for example, an application developer.
- Next, an example will be described in which the display control unit 314 can select, as its operation mode, either a first mode that uses no roll angle dependency as the AR object display method (hereinafter also referred to as the “roll angle independent mode”) or a second mode that uses roll angle dependency (hereinafter also referred to as the “roll angle dependent mode”).
- the display control unit 314 may select either the roll angle-independent mode or the roll angle-dependent mode as an operation mode based on an operation by the user. For example, when the user inputs an operation for selecting the mode without roll angle dependency, the display control unit 314 may select the mode without roll angle dependency as the operation mode. On the other hand, for example, when the user inputs an operation for selecting the roll angle dependent mode, the display control unit 314 may select the roll angle dependent mode as the operation mode.
- the display control unit 314 may select the mode without the roll angle dependency as the operation mode when the AR object can accept the operation by the user. This is because, when an AR object that can accept an operation by the user is displayed, it is considered that it is more desirable to improve the searchability of the AR object than to express the sense of reality of the AR object.
- the AR object that can accept an operation by the user is not particularly limited.
- For example, the AR object may be one that can accept a pressing operation by the user, or may constitute part of a menu screen on which one or more AR objects are arranged.
- the display control unit 314 may select an operation mode based on information associated with the AR object.
- The information associated with the AR object may be information indicating whether to select the mode without roll angle dependency (information indicating whether there is roll angle dependency), or other information (for example, information indicating whether or not the AR object is a three-dimensional object).
- For example, when the AR object is a three-dimensional object, the display control unit 314 may select the roll angle dependent mode as the operation mode. This is because, when the AR object is a three-dimensional object, it is considered more desirable to express the sense of reality of the AR object than to improve its searchability.
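As a hedged sketch of selecting an operation mode from information associated with the AR object, the preferences described above could look like this; the dictionary keys (`accepts_operation`, `is_3d`, `roll_dependent`) are hypothetical names, not terms from the patent.

```python
ROLL_ANGLE_INDEPENDENT = "roll angle independent mode"
ROLL_ANGLE_DEPENDENT = "roll angle dependent mode"

def select_mode(ar_object: dict) -> str:
    """Choose an operation mode from information associated with the object."""
    # Operable objects favor searchability over realism.
    if ar_object.get("accepts_operation", False):
        return ROLL_ANGLE_INDEPENDENT
    # Three-dimensional objects favor realism over searchability.
    if ar_object.get("is_3d", False):
        return ROLL_ANGLE_DEPENDENT
    # Otherwise honor an explicitly associated dependency flag.
    if ar_object.get("roll_dependent", False):
        return ROLL_ANGLE_DEPENDENT
    return ROLL_ANGLE_INDEPENDENT
```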
- Further, the display control unit 314 may select either the roll angle independent mode or the roll angle dependent mode as the operation mode based on the capability of the HMD 100. For example, when the capability of the HMD 100 is lower than a threshold value, the display control unit 314 may select the roll angle independent mode as the operation mode. This is because, when the capability of the HMD 100 is lower than the threshold value, it is desirable to reduce the processing load on the display control unit 314 by selecting the roll angle independent mode, which does not perform the process of rotating the AR object.
- On the other hand, when the capability of the HMD 100 is higher than the threshold value, the display control unit 314 may select the roll angle dependent mode as the operation mode. This is because, in that case, the processing load on the display control unit 314 need not be reduced even if the roll angle dependent mode, which performs the process of rotating the AR object, is selected.
- the capability of the HMD 100 may be the capability of performing a drawing process of the AR object by the display control unit 314.
- The capability of performing the AR object drawing process may be the capability of the computing device itself, or the capability of the computing device excluding the capability used by operations other than the AR object drawing process.
- Alternatively, the display control unit 314 may select an operation mode based on the remaining battery capacity of the HMD 100. For example, when the remaining battery level of the HMD 100 is lower than a threshold value, the display control unit 314 may select the roll angle independent mode as the operation mode. This is because, when the remaining battery level of the HMD 100 is lower than the threshold value, it is desirable to reduce the power consumption of the display control unit 314 by selecting the roll angle independent mode, which does not perform the process of rotating the AR object.
- On the other hand, when the remaining battery level of the HMD 100 is higher than the threshold value, the display control unit 314 may select the roll angle dependent mode as the operation mode. This is because, in that case, the power consumption of the display control unit 314 need not be reduced even if the roll angle dependent mode, which performs the process of rotating the AR object, is selected.
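The device-based fallback policy described above (capability and battery thresholds both steering toward the cheaper mode) can be sketched as follows; the function name, parameter scale, and thresholds are illustrative assumptions.

```python
def select_mode_by_device(capability: float, battery_level: float,
                          capability_threshold: float,
                          battery_threshold: float) -> str:
    """Fall back to the roll angle independent mode, which skips the
    rotation processing, when either the drawing capability or the
    remaining battery level is below its threshold."""
    if capability < capability_threshold or battery_level < battery_threshold:
        return "roll angle independent mode"
    return "roll angle dependent mode"
```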
- When the operation mode is switched, the position of the AR object may be changed gradually (the AR object may be changed with an animated expression).
- The switching of the operation mode may be realized by an application function. For example, since the drawing process of each AR object is performed based on the result of referring to the operation mode, a switch of the operation mode made during the drawing process of one AR object can be reflected in the drawing process of the next AR object.
- At this time, a processing block that gradually changes the position and orientation of the AR object may be prepared separately from the application, or the position and orientation of the AR object may be changed gradually by the application function.
- FIG. 31 is a diagram for explaining an example of updating the roll limit angle.
- The AR objects A to E may constitute a menu screen. In this case, until the roll angle exceeds the roll limit angle Rr, the display control unit 314 may rotate the AR objects A to E in the direction opposite to the roll angle by the same amount as the roll angle and display them in the field of view V17-1.
- Once the roll angle exceeds the roll limit angle Rr, the display control unit 314 may rotate the positions of the AR objects A to E by only the same magnitude as the roll limit angle Rr in the direction opposite to the roll angle. However, when the roll angle has exceeded a threshold value (for example, 60 degrees) for a predetermined time (for example, 10 seconds), it is highly likely that the user's posture itself has changed, so updating the roll limit angle Rr is considered to improve the searchability of the AR objects A to E.
- the display control unit 314 may update the roll limit angle Rr when the time during which the roll angle exceeds the threshold value r1 continues for a predetermined time.
- For example, the display control unit 314 may reduce the roll limit angle Rr and display the AR objects A to E in the visual field V17-3 accordingly.
- When the roll limit angle Rr is updated, the positions of the AR objects A to E may be changed gradually (the AR objects A to E may be changed with an animated expression).
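The timed update rule described above (shrink Rr once the roll angle has stayed above a threshold for a set duration) can be sketched as a small stateful helper; the class name, default values, and the choice of reducing Rr to a fixed value are illustrative assumptions.

```python
class RollLimitUpdater:
    """Shrinks the roll limit angle Rr once the roll angle has stayed above
    a threshold (e.g. 60 degrees) for a set duration (e.g. 10 seconds)."""

    def __init__(self, limit_deg: float, threshold_deg: float,
                 hold_s: float, reduced_limit_deg: float):
        self.limit_deg = limit_deg
        self.threshold_deg = threshold_deg
        self.hold_s = hold_s
        self.reduced_limit_deg = reduced_limit_deg
        self._elapsed = 0.0

    def update(self, roll_deg: float, dt: float) -> float:
        if abs(roll_deg) > self.threshold_deg:
            self._elapsed += dt
            if self._elapsed >= self.hold_s:
                # Sustained excess: the user's posture itself has likely changed.
                self.limit_deg = self.reduced_limit_deg
        else:
            self._elapsed = 0.0  # roll returned below threshold; reset timer
        return self.limit_deg
```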
- the update of the roll limit angle may be realized by an application function.
- For example, since the drawing process of each AR object is performed based on the result of referring to the roll limit angle, an update of the roll limit angle made during the drawing process of one AR object can be reflected in the drawing process of the next AR object.
- At this time, a processing block for gradually changing the roll limit angle may be prepared separately from the application, or the roll limit angle may be changed gradually by the application function.
- The trigger for updating the roll limit angle is not limited to this example.
- For example, the display control unit 314 may update the roll limit angle when the form of the field of view satisfies a predetermined condition. More specifically, when the field of view is horizontally long (for example, when the pitch angle corresponding to the vertical length of the visual field is smaller than 20° and the horizontal length of the visual field is longer than the vertical length), the display control unit 314 may make the roll limit angle smaller in order to prevent the AR object from leaving the field of view.
- In this way, the display control unit 314 may make the characters written on the AR object visible to the user by preventing the AR object from protruding from the field of view (that is, by narrowing the roll limit angle).
- Alternatively, the display control unit 314 may widen the roll limit angle so that part of the object may protrude from the field of view while at least an end of the object remains in the field of view, letting the user recognize only the presence of the object.
- the display control unit 314 may update the roll limit angle when the capability of the HMD 100 satisfies a predetermined condition. More specifically, when the capability of the HMD 100 is lower than the threshold value, the display control unit 314 may decrease the roll limit angle (for example, the roll limit angle may be zero). This is because when the capability of the HMD 100 is lower than the threshold value, it is considered desirable to reduce the processing load on the display control unit 314 by not performing the process of rotating the AR object.
- the capability of the HMD 100 may be the capability of performing a drawing process of the AR object by the display control unit 314.
- The capability of performing the AR object drawing process may be the capability of the computing device itself, or the capability of the computing device excluding the capability used by operations other than the AR object drawing process.
- FIG. 32 is a flowchart illustrating an example of the drawing operation of the AR object.
- In the operation example of FIG. 32, it is assumed that information indicating whether or not the mode without roll angle dependency is selected (information indicating whether or not there is roll angle dependency) and the roll limit angle are associated with the AR object. This information and the roll limit angle may be associated in advance with the AR object by the application developer according to the use case, or may be determined as values independent of the AR object.
- First, the display control unit 314 determines whether or not there is an AR object that has not yet been drawn and that enters the field of view (step 601). If there is no such AR object (“No” in step 601), the drawing operation is terminated.
- If there is such an AR object (“Yes” in step 601), the display control unit 314 determines whether the AR object has roll angle dependency (step 602). If the AR object has roll angle dependency (“Yes” in step 602), the display control unit 314 selects the mode with roll angle dependency, performs the drawing process of the AR object depending on the roll angle (S603), and returns to S601. On the other hand, if the AR object has no roll angle dependency (“No” in step 602), the display control unit 314 selects the mode without roll angle dependency, performs the drawing process of the AR object without depending on the roll angle (S604), and returns to S601.
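The loop of FIG. 32 can be sketched as follows; the callback-based structure and dictionary keys are illustrative assumptions, not the patent's implementation.

```python
def drawing_operation(ar_objects, in_view, draw_with_roll, draw_without_roll):
    """Drawing operation following FIG. 32: for each undrawn AR object that
    enters the field of view, branch on its roll angle dependency."""
    for obj in ar_objects:
        if obj["drawn"] or not in_view(obj):  # step 601
            continue
        if obj["roll_dependent"]:             # step 602
            draw_with_roll(obj)               # S603: depends on the roll angle
        else:
            draw_without_roll(obj)            # S604: independent of the roll angle
        obj["drawn"] = True
```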
- FIG. 33 is a flowchart illustrating another example of the drawing operation of the AR object.
- In the operation example of FIG. 33, it is assumed that, in addition to the information indicating whether or not there is roll angle dependency and the roll limit angle, information indicating whether or not to limit the height of the visual field area (information indicating whether or not there is a height restriction attribute) is associated with the AR object. This information may also be associated in advance with the AR object by the application developer according to the use case, or may be determined as a value independent of the AR object.
- the display control unit 314 determines whether there is an AR object that is not yet drawn and enters the field of view (step 701).
- If there is such an AR object (“Yes” in step 701), the display control unit 314 determines whether the AR object has a height restriction attribute (step 702). If the AR object has a height restriction attribute (“Yes” in step 702), the process proceeds to step 703; if it does not (“No” in step 702), the process proceeds to step 704.
- In step 703, the display control unit 314 determines whether or not the AR object has roll angle dependency. If the AR object has roll angle dependency (“Yes” in step 703), the display control unit 314 performs the drawing process of the AR object in consideration of both the height restriction and the roll angle (S704) and returns to S701. On the other hand, if the AR object has no roll angle dependency (“No” in step 703), the display control unit 314 performs the drawing process of the AR object in consideration of the height restriction only (S705) and returns to S701.
- In step 704, the display control unit 314 likewise determines whether or not the AR object has roll angle dependency. If the AR object has roll angle dependency (“Yes” in step 704), the display control unit 314 performs the drawing process of the AR object in consideration of the roll angle (S707) and returns to S701. On the other hand, if the AR object has no roll angle dependency (“No” in step 704), the display control unit 314 performs the drawing process of the AR object in consideration of neither the height restriction nor the roll angle (S708) and returns to S701.
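The two-level dispatch of FIG. 33 can be sketched as a pure function; the dictionary keys are hypothetical, and the final branch follows the reading that an object with neither attribute is drawn considering neither constraint.

```python
def draw_constraints(obj: dict) -> tuple:
    """Dispatch following FIG. 33: branch on the height restriction
    attribute (step 702), then on roll angle dependency (steps 703/704).
    Returns which constraints the drawing process should consider."""
    if obj.get("height_restricted", False):   # step 702: Yes -> step 703
        if obj.get("roll_dependent", False):
            return ("height", "roll")         # height restriction + roll angle
        return ("height",)                    # height restriction only
    if obj.get("roll_dependent", False):      # step 702: No -> step 704
        return ("roll",)                      # roll angle only
    return ()                                 # neither constraint
```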
- FIG. 34 is a diagram illustrating a display example of the AR object when the display method of the AR object has roll angle dependency.
- FIG. 35 is a diagram illustrating a display example of the AR object when the display method of the AR object has no roll angle dependency.
- When the display angle of view is wider than a certain degree (for example, when the pitch angles corresponding to the vertical and horizontal lengths of the visual field are each greater than 50°), the AR object is unlikely to leave the visual field, and the blur of the AR object A appears small, so roll angle dependency of the position may be useful.
- the visual field changes between the visual field V18-1 and the visual field V18-2 each time the user tilts his / her head.
- In FIG. 35, since the AR object A is fixed with respect to the field of view, the AR object A is unlikely to leave the field of view, and the visibility of the AR object A can be maintained.
- On the other hand, when the display angle of view is narrower than a certain level and the field of view is horizontally long (for example, when the pitch angle corresponding to the vertical length of the field of view is smaller than 20° and the horizontal length of the field of view is longer than the vertical length), it is likely that the absence of roll angle dependency is useful.
- the display method of the AR object A does not depend on the roll angle, it is not necessary to perform image processing according to the roll angle on the AR object A, and thus it is possible to reduce the load on the arithmetic device. Therefore, the absence of roll angle dependency can also be applied to a system without a graphic engine. Further, since the load on the arithmetic device is reduced, the power consumption by the arithmetic device is reduced, the battery duration time can be increased, and the battery can be reduced in weight. Furthermore, since the system can be simplified without dependency on the roll angle, the cost required for system construction can be reduced.
- the AR object may be a two-dimensional image or a three-dimensional object.
- the display of the AR object may be changed depending on whether the AR object is a two-dimensional image or a three-dimensional object. This will be specifically described with reference to FIG.
- FIG. 36 is a diagram illustrating a display example of an AR object when the AR object is a two-dimensional image and when the AR object is a three-dimensional object.
- When the AR object A is a three-dimensional object, the display control unit 314 may perform predetermined image processing on the AR object A and display the processed AR object A in the field of view V19-1. This makes it possible to convey the three-dimensional feel of the AR object A.
- As the predetermined image processing, image processing corresponding to the positional relationship between the three-dimensional AR object A and the HMD 100 (for example, image processing that makes the object visible from an angle corresponding to the yaw angle and pitch angle) may be applied.
- Alternatively, the display control unit 314 may perform image processing that simply reduces the three-dimensional AR object A in the vertical direction of the visual field V19-1 and display the reduced three-dimensional AR object A in the visual field V19-1. Doing so reduces the processing load required for the drawing process.
- Depending on the capability of the HMD 100, the display control unit 314 may omit, even when the AR object A is a three-dimensional object, the image processing corresponding to the positional relationship between the three-dimensional AR object A and the HMD 100 (for example, image processing for making the object visible from an angle corresponding to the corresponding yaw angle or pitch angle).
- the ability of the HMD 100 may be an ability to perform a drawing process of the three-dimensional AR object A by the display control unit 314.
- The capability of performing the drawing process of the three-dimensional AR object A may be the capability of the computing device itself, or the capability of the computing device excluding the capability used by operations other than the drawing process of the three-dimensional AR object A.
- On the other hand, when the AR object A is a two-dimensional image, the display control unit 314 may display the AR object A without subjecting it to image processing. In this way, the user can grasp that the AR object A spreads on a plane. In the example shown in FIG. 36, the AR object A that is not subjected to image processing is displayed in the visual field V19-2.
- FIG. 37 is a diagram showing a detailed display example when the AR object is a three-dimensional object.
- the AR object B20-1 viewed from the front is displayed in the field of view V20-1.
- an AR object B20-2 viewed obliquely from above is displayed in the field of view V20-1.
- In this manner, when the AR object is a three-dimensional object, the display control unit 314 may perform image processing so that the AR object is seen from different angles and display the processed AR object.
- FIG. 38 is a diagram showing a detailed display example when the AR object is a two-dimensional image.
- the AR object B21-1 viewed from the front is displayed in the field of view V21.
- the AR object B21-2 viewed from diagonally above is displayed in the field of view V21.
- The display control unit 314 may display the AR object without performing image processing on it.
- The display control unit 314 may determine whether the AR object is a three-dimensional object or a two-dimensional image based on information associated with the AR object by an application developer. For example, for a three-dimensional AR object, information indicating that the object is a three-dimensional object may be associated by the application developer. Similarly, for an AR object containing predetermined information (for example, character information or an icon), information indicating that the object is a two-dimensional image may be associated by the application developer.
- A three-dimensional object whose positional relationship with the HMD 100 has changed differs from a two-dimensional image in that it needs to be updated.
- The positional relationship between the HMD 100 and the three-dimensional object may change when the three-dimensional object moves or when the user moves.
- An example of the update operation of the AR object (hereinafter also simply referred to as the "AR object update operation") when the AR object is a three-dimensional object will now be described.
- FIG. 39 is a flowchart illustrating an example of the update operation of the AR object.
- FIG. 40 is a flowchart showing a basic example of the drawing operation of the AR object.
- the cycle in which the AR object update operation is performed is not limited.
- the AR object update operation may be performed once every 1/30 seconds.
- the cycle in which the AR object drawing operation is performed is not limited.
- the AR object drawing operation may have a shorter cycle than the AR object update operation.
- the drawing operation of the AR object may be performed once every 1/60 seconds.
- The display control unit 314 determines whether there is an AR object that has not yet been drawn and enters the field of view (step 901). If the display control unit 314 determines that there is no such AR object ("No" in step 901), the drawing operation ends. On the other hand, if the display control unit 314 determines that there is an AR object that has not yet been drawn and enters the field of view ("Yes" in step 901), it performs the AR object drawing process (step 902) and returns to step 901.
- The display control unit 314 determines whether there is an AR object whose positional relationship with the HMD 100 has changed (step 801). If the display control unit 314 determines that there is no such AR object ("No" in step 801), it proceeds to the update process for the next AR object.
- If the display control unit 314 determines that there is an AR object whose positional relationship with the HMD 100 has changed ("Yes" in step 801), it updates the coordinate position on the cylindrical coordinates CO of that AR object (step 802).
- The display control unit 314 then determines whether there is a three-dimensional object among the AR objects whose coordinate positions on the cylindrical coordinates CO have been updated (step 803). If it determines that there is none ("No" in step 803), it proceeds to the update process for the next AR object.
- If the display control unit 314 determines that there is a three-dimensional object among the AR objects whose coordinate positions on the cylindrical coordinates CO have been updated ("Yes" in step 803), it overwrites the corresponding AR object with a re-rendered 3D version (step 804) and proceeds to the update process for the next AR object.
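The update flow just described (steps 801 to 804 of FIG. 39) can be sketched as a loop over the AR objects. This is a minimal illustration, not the patent's implementation; the object dictionary fields and the `project_to_cylinder` and `render_3d` placeholders are assumptions made for the sketch:

```python
def project_to_cylinder(obj, hmd_pose):
    # Placeholder for step 802: shift the stored azimuth by the pose's yaw.
    return (obj['theta'] - hmd_pose['yaw'], obj['height'])

def render_3d(obj, hmd_pose):
    # Placeholder for the 3D re-rendering pass of step 804.
    return f"render({obj['name']}@yaw={hmd_pose['yaw']})"

def update_ar_objects(ar_objects, hmd_pose):
    """One update pass over all AR objects (steps 801-804 of FIG. 39)."""
    for obj in ar_objects:
        # Step 801: skip objects whose positional relationship with
        # the HMD has not changed.
        if obj['last_pose'] == hmd_pose:
            continue
        # Step 802: update the coordinate position on the cylindrical
        # coordinates CO.
        obj['position'] = project_to_cylinder(obj, hmd_pose)
        # Steps 803-804: only three-dimensional objects additionally
        # need their image re-rendered from the new viewpoint.
        if obj['is_3d']:
            obj['image'] = render_3d(obj, hmd_pose)
        obj['last_pose'] = hmd_pose
```

The distinction of step 803 is visible in the sketch: a two-dimensional object gets its coordinate updated but is never re-rendered, which is why the drawing operation can run on a shorter cycle than the update operation.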
- FIG. 41 is a diagram illustrating an example in which both an AR object and a non-AR object are provided to the user's field of view.
- For example, the display control unit 314 may display, in the field of view V22, the AR object B22 corresponding to the yaw angle and the pitch angle. The display control unit 314 may also display, in the field of view V22, the non-AR object G22 corresponding to neither the yaw angle nor the pitch angle.
- The AR object B22 indicates the direction of the destination as viewed from the user; if either the yaw angle or the pitch angle changes, the position of the AR object B22 in the user's field of view V22 can change accordingly. On the other hand, the non-AR object G22 indicates the distance to the destination, and its position in the user's field of view V22 can remain fixed even if the yaw angle or pitch angle changes.
- FIG. 42 is a diagram illustrating another example in which both the AR object and the non-AR object are provided to the user's field of view.
- For example, the display control unit 314 may display, in the field of view V23, the AR objects A to F corresponding to the yaw angle and the pitch angle. The display control unit 314 may also display, in the field of view V23, the non-AR object G23 corresponding to neither the yaw angle nor the pitch angle.
- The AR objects A to F are included in a menu screen; if either the yaw angle or the pitch angle changes, their positions in the user's field of view V23 can change accordingly.
- The non-AR object G23 indicates the selection target position (the AR object at the selection target position is selected when a predetermined selection operation is performed), and its position in the user's field of view V23 can remain fixed even if the yaw angle or pitch angle changes.
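The distinction drawn in FIGS. 41 and 42, in which an AR object follows changes in yaw and pitch while a non-AR object stays fixed, can be sketched as follows; the object fields are hypothetical names chosen for the sketch:

```python
def view_position(obj, yaw, pitch):
    """Position of an object in the field of view.

    An AR object shifts with yaw and pitch (FIGS. 41-42);
    a non-AR object keeps a fixed display position.
    """
    if obj['is_ar']:
        # AR object: its display position follows the head pose.
        return (obj['theta'] - yaw, obj['height'] - pitch)
    # Non-AR object: fixed position regardless of yaw and pitch.
    return obj['fixed_position']
```

With this split, a destination arrow (B22) would be stored with world-anchored coordinates, while a distance readout (G22) or a selection cursor (G23) would be stored with a fixed screen position.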
- The above description applies the present technology to an HMD, but the present technology may also be applied to an image display device other than an HMD, for example, a head-up display (HUD) mounted at the driver's seat of a vehicle or in the cockpit of an aircraft.
- The present technology can also be applied to a contact-lens-type display device, to single-eye eyewear, and to a terminal such as a smartphone.
- In the above, a see-through (transmissive) HMD has been described.
- The present technology can also be applied to a non-transmissive HMD.
- In that case, a predetermined object according to the present technology may be displayed in an external field of view captured by a camera attached to the display unit.
- In the above, the HMD 100 is configured to display, in the field of view V, an object including information related to a predetermined object existing in real space.
- However, the present technology is not limited to this; for example, a destination guidance display or the like may be displayed in the field of view V based on the current position and traveling direction of the user U.
- (1) A display control apparatus including a display control unit that displays, in a user's field of view, an object corresponding to at least one of a yaw angle and a pitch angle of a display unit, wherein the display control unit is operable in a first mode in which the position of the object in the field of view does not depend on a roll angle of the display unit.
- (2) The display control apparatus according to (1), wherein the display control unit is operable in a second mode in which the position of the object in the field of view depends on the roll angle.
- (3) The display control apparatus according to (2), wherein the display control unit selects one of the first mode and the second mode as an operation mode.
- (4) The display control apparatus according to (3), wherein the display control unit selects one of the first mode and the second mode as the operation mode based on an operation by a user.
- (5) The display control apparatus according to (3), wherein the display control unit selects the first mode as the operation mode when the object can accept an operation by a user.
- (6) The display control apparatus according to (3), wherein the display control unit selects the second mode as the operation mode when the object is a three-dimensional object.
- (7) The display control apparatus according to (3), wherein the display control unit selects one of the first mode and the second mode as the operation mode based on a capability of the display control apparatus.
- (8) The display control apparatus according to any one of (1) to (7), wherein, when operating in the first mode, the display control unit does not make the orientation of the object in the field of view depend on the roll angle.
- (9) The display control apparatus according to any one of (1) to (8), wherein, when operating in the first mode, the display control unit makes the orientation of the object in the field of view depend on the roll angle.
- (10) The display control apparatus according to any one of (2) to (9), wherein, when operating in the second mode, the display control unit rotates the object in the field of view in the direction opposite to the roll angle by the same magnitude as the roll angle.
- (11) The display control apparatus according to (10), wherein, when operating in the second mode, if the roll angle exceeds a predetermined angle, the display control unit rotates the position of the object in the field of view in the direction opposite to the roll angle by the same magnitude as the predetermined angle.
- (16) The display control apparatus according to any one of (1) to (15), wherein the display control unit displays at least the object corresponding to the pitch angle in the field of view, and the pitch angle corresponding to the object is limited within a predetermined range.
- (17) The display control apparatus according to any one of (1) to (16), wherein the display control unit displays an object that has been subjected to image processing in the field of view.
- (18) The display control apparatus according to any one of (1) to (17), wherein the display control unit displays an object that has not been subjected to image processing in the field of view.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
FIG. 1 is a schematic diagram explaining the functions of a head-mounted display (hereinafter referred to as "HMD") according to an embodiment of the present technology. First, with reference to FIG. 1, an overview of the basic functions of the HMD according to this embodiment will be described.
The HMD 100 of this embodiment is worn on the head of a user U and is configured to be able to display a virtual image in the user U's real-space field of view V (display field of view). The image displayed in the field of view V includes information related to predetermined objects A1, A2, A3, A4 present in the field of view V. The predetermined objects include, for example, landscapes, stores, and goods around the user U.
The HMD 100 includes a display unit 10, a detection unit 20 that detects the attitude of the display unit 10, and a control unit 30 that controls driving of the display unit 10. In this embodiment, the HMD 100 is a see-through HMD capable of providing the user with a real-space field of view V.
The display unit 10 is configured to be wearable on the head of the user U. The display unit 10 includes first and second display surfaces 11R, 11L, first and second image generation units 12R, 12L, and a support 13.
The detection unit 20 is configured to be able to detect a change in orientation or attitude of the display unit 10 around at least one axis. In this embodiment, the detection unit 20 is configured to detect changes in orientation or attitude of the display unit 10 around the X, Y, and Z axes, respectively.
The control unit 30 (first control unit) generates a control signal for controlling driving of the display unit 10 (image generation units 12R, 12L) based on the output of the detection unit 20. In this embodiment, the control unit 30 is electrically connected to the display unit 10 via a connection cable 30a. Of course, this is not limiting, and the control unit 30 may instead be connected to the display unit 10 via a wireless communication line.
The portable information terminal 200 (second control unit) is configured to communicate with the control unit 30 via a wireless communication line. The portable information terminal 200 has a function of acquiring images to be displayed on the display unit 10 and a function of transmitting the acquired images to the control unit 30. By being organically combined with the HMD 100, the portable information terminal 200 builds an HMD system.
Next, details of the control unit 30 will be described.
x0=(θ0-θv)・Wv/αv …(1)
y0=(h0-hv)・Hv/100 …(2)
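Formulas (1) and (2) above can be read as a mapping from an object's cylindrical coordinates (θ0, h0) to display coordinates (x0, y0) in the field of view V. A minimal sketch under that reading, with parameter names following the symbols in the formulas (their exact semantics are an assumption, since the surrounding definitions are not in this excerpt):

```python
def object_position_in_view(theta0, h0, theta_v, h_v, alpha_v, W_v, H_v):
    """Map an object at cylindrical coordinates (theta0, h0) to display
    coordinates (x0, y0) in the field of view V.

    theta_v, h_v : azimuth and height of the field of view's reference point
    alpha_v      : horizontal angular extent of the field of view
    W_v, H_v     : width and height of the field of view in display units
    """
    x0 = (theta0 - theta_v) * W_v / alpha_v   # formula (1)
    y0 = (h0 - h_v) * H_v / 100               # formula (2)
    return x0, y0
```

An object whose azimuth equals the view azimuth and whose height equals the view height lands at the view origin, as the formulas require.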
(1) Introduction of a Non-Strict attribute
The display control unit 314 is configured to be able to execute processing that, when the orientation or attitude of the display unit 10 changes by a predetermined angle or more, moves the object within the field of view V according to the change in orientation or attitude, and that, when the change in orientation or attitude is less than the predetermined angle, fixes the display position of the object in the field of view V.
The display control unit 314 is configured to be able to execute processing that moves the object B to a predetermined position in the field of view V when the change in the output of the detection unit 20 remains at or below a predetermined level for a predetermined time.
The display control unit 314 is configured to be able to execute processing that moves the object to a predetermined position in the field of view V when it detects input of a predetermined signal generated by an operation of the user U. With this configuration as well, the visibility of the object can be improved as described above, and display of the image can be controlled in line with the user's intention.
The display control unit 314 is configured to be able to execute processing that, in a state where the object is displayed at a predetermined position in the field of view V, invalidates frequency components at or above a predetermined frequency in the output of the detection unit 20 when the change in the output of the detection unit 20 is at or above that predetermined frequency.
Control) in combination so as to converge on a set value. In FIGS. 10A and 10B, of the spring (p) and damper (d) each connected between the field of view V and the field of view V', the spring (p) corresponds to the P component of the PD control and the damper (d) corresponds to the D component of the PD control.
Δx(t)=x'(t)-x(t) …(3)
Δy(t)=y'(t)-y(t) …(4)
and, letting (Δvx(t), Δvy(t)) denote the velocity difference of the corresponding points,
Δvx(t)={Δx'(t)-Δx'(t-Δt)}-{Δx(t)-Δx(t-Δt)} …(5)
Δvy(t)={Δy'(t)-Δy'(t-Δt)}-{Δy(t)-Δy(t-Δt)} …(6)
At that time, the amounts (Δp(t), Δq(t)) by which the face-shake correction coordinate system V1' should follow the local coordinate system V1 are expressed as
Δp(t)=Px×Δx(t)+Dx×Δvx(t) …(7)
Δq(t)=Py×Δy(t)+Dy×Δvy(t) …(8)
Here, Px and Py are difference gain constants for x and y, and Dx and Dy are velocity gain constants for x and y.
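Formulas (3) through (8) transcribe directly into code. The sketch below is one reading of them; the interpretation of the primed coordinates as the corresponding point in the local coordinate system V1 is an assumption, since the opening of this passage is cut off in the excerpt:

```python
def follow_amount(x, y, x_prev, y_prev, xl, yl, xl_prev, yl_prev,
                  Px, Py, Dx, Dy):
    """Amount (dp, dq) by which the shake-corrected coordinate system V1'
    should follow the local coordinate system V1 (formulas (3)-(8)).

    (x, y)   : corresponding point in V1' at time t
    (xl, yl) : corresponding point in V1 at time t
    *_prev   : the same points at time t - dt
    Px, Py   : difference gain constants (P component, the "spring")
    Dx, Dy   : velocity gain constants (D component, the "damper")
    """
    dx = xl - x                          # formula (3)
    dy = yl - y                          # formula (4)
    dvx = (xl - xl_prev) - (x - x_prev)  # formula (5)
    dvy = (yl - yl_prev) - (y - y_prev)  # formula (6)
    dp = Px * dx + Dx * dvx              # formula (7)
    dq = Py * dy + Dy * dvy              # formula (8)
    return dp, dq
```

The P term pulls the corrected view toward the head's current position like the spring (p), while the D term damps the motion like the damper (d), so the view converges without oscillating.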
Next, the area limiting function of the HMD 100 will be described.
Next, the image management function of the HMD 100 will be described.
First, the portable information terminal 200 confirms with the control unit 30 whether transmission of a frame F3 for arranging an object B3 is necessary (step 101), and in response, the control unit 30 requests the portable information terminal 200 to transmit the frame F3 (step 102). The control unit 30 stores the received frame F3 in the memory 302, thereby arranging the frame F3 at the corresponding position on the cylindrical coordinates C1.
Triggered by the transmission permission notification for the object data, the control unit 30 shifts to a data acquisition phase. Specifically, for example, based on the output of the detection unit 20, the control unit 30 determines the frame closest to the current orientation of the field of view V (display unit 10) (frame F4 in this example) and requests transmission of the image data of the object belonging to that frame (object B4 in this example) (step 106). In response to this request, the portable information terminal 200 transmits the image data of the object B4 to the control unit 30 (step 107). The control unit 30 stores the received image data of the object B4 in the memory 302, thereby arranging the object B4 within the frame F4 on the cylindrical coordinates C1.
Next, an example of the operation of the HMD system including the HMD 100 according to this embodiment configured as described above will be described.
Hereinafter, application examples of the HMD 100 of this embodiment will be described.
FIG. 17 is a diagram showing examples of the yaw angle, pitch angle, and roll angle. As described above, in this embodiment, an object (hereinafter also referred to as an "AR object") corresponding to at least one of the azimuth RY of the display unit 10 around the vertical axis (Z axis) (hereinafter also referred to as the "yaw angle") and the depression (or elevation) angle RP of the display unit 10 around the left-right axis (Y axis) (hereinafter also referred to as the "pitch angle") can be provided to the user's field of view. The object may be associated with real-world coordinates (that is, with objects around the display unit 10). At this time, the display control unit 314 may also make the position and orientation of the AR object relative to the user's field of view depend on the angle RO of the display unit 10 around the front-rear axis (X axis) (hereinafter also referred to as the "roll angle"). Such an example will be described.
As described above, the position and orientation of the AR object relative to the user's field of view may depend on the roll angle. However, referring to FIG. 18, for example, parts of the AR objects A and B fall outside the user's field of view V2-2, reducing the searchability of the AR objects A and B. Therefore, the following mainly proposes techniques capable of improving the searchability of AR objects displayed in the user's field of view. Specifically, the display control unit 314 may make the position and orientation of the AR object relative to the user's field of view independent of the roll angle.
Next, the difference in AR object searchability between the roll-angle-dependent and roll-angle-independent display methods will be further described. FIG. 22 is a diagram for explaining AR object searchability when the AR object display method is roll-angle dependent. FIG. 23 is a diagram for explaining AR object searchability when the AR object display method is roll-angle independent.
In the above, an example was described in which, when the AR object display method is roll-angle dependent, the display control unit 314 rotates the AR object in the field of view in the direction opposite to the roll angle by the same magnitude as the roll angle and displays it in the field of view. However, if the AR object is rotated in the direction opposite to the roll angle by the same magnitude as the roll angle, a situation may arise in which the AR object does not fit in the field of view, and in such a situation the AR object may not be visible to the user. Therefore, when the AR object display method is roll-angle dependent, the display control unit 314 may limit the rotation of the AR object depending on the situation.
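The rotation limiting described here amounts to clamping the compensating rotation at the roll limit angle. A minimal sketch, not the patent's implementation (angle units and sign convention are assumptions):

```python
def compensating_rotation(roll_angle, roll_limit):
    """Rotation applied to the object in the roll-angle-dependent mode.

    The object is rotated opposite in direction and equal in magnitude
    to the roll angle, but the magnitude is clamped to the roll limit
    angle once the roll angle exceeds it, so the object stays within
    the field of view. Angles are in degrees.
    """
    clamped = max(-roll_limit, min(roll_limit, roll_angle))
    return -clamped
```

Below the limit, the compensation exactly cancels the roll; beyond it, the compensation saturates at the limit, trading exact cancellation for keeping the object inside the field of view.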
The above has described an example of limiting the rotation of the AR object depending on the situation when the AR object display method is roll-angle dependent. Incidentally, the above also described an example in which, when the display control unit 314 displays an object corresponding to the pitch angle in the field of view, the coordinate setting unit 311 limits the pitch angle corresponding to the object within a predetermined range. More specifically, the coordinate setting unit 311 can limit the field-of-view area (Hv) in the height direction of the field of view V on the cylindrical coordinates C0. Combining this function of limiting the height-direction field-of-view area (Hv) of the field of view V with roll-angle independence is expected to further improve AR object searchability.
The above has described the case where the function of limiting the field-of-view area is combined with roll-angle independence. In the above, an example was also described in which, with roll-angle dependence, when the roll angle exceeds the roll limit angle, the display control unit 314 rotates the position of the AR object in the direction opposite to the roll angle by the same magnitude as the roll limit angle. At this time, with roll-angle dependence, when the roll angle exceeds the roll limit angle, the display control unit 314 may gradually converge the position of the AR object to the position rotated in the direction opposite to the roll angle by the same magnitude as the roll limit angle.
In the above, an example was also described in which, when the AR object display method is roll-angle independent, the orientation of the AR object relative to the user's field of view does not depend on the roll angle. However, when the AR object display method is roll-angle independent, the display control unit 314 may still make the orientation of the AR object relative to the user's field of view depend on the roll angle. This makes it possible both to express the sense of presence of the AR object and to improve its searchability. Such an example will be described.
The above has described roll-angle dependence and roll-angle independence. Either one may be used permanently, or one may be selected as appropriate. The following describes an example in which the display control unit 314 can select, as the operation mode of the AR object display method, either a first mode in which roll-angle independence is used (hereinafter also referred to as the "roll-angle-independent mode") or a second mode in which roll-angle dependence is used (hereinafter also referred to as the "roll-angle-dependent mode").
The above has described an example of selecting either the roll-angle-independent mode or the roll-angle-dependent mode as the operation mode. In the roll-angle-dependent mode, the above roll limit angle may be a fixed value, or it may be updated depending on the situation. Updating of the roll limit angle will now be described. FIG. 31 is a diagram for explaining an example of updating the roll limit angle. For example, in FIG. 31, the AR objects A to E may constitute a menu screen.
The above has described updating of the roll limit angle. Next, an example of the AR object drawing operation will be described. FIG. 32 is a flowchart showing an example of the AR object drawing operation. Here, an example is assumed in which information indicating whether to select the roll-angle-independent mode (information indicating whether there is roll-angle dependence) and the roll limit angle are associated with the AR object.
The above has described another example of the AR object drawing operation. The following describes in more detail the effects achieved when the AR object display method is roll-angle dependent and when it is roll-angle independent. FIG. 34 is a diagram showing a display example of an AR object when the display method is roll-angle dependent. FIG. 35 is a diagram showing a display example of an AR object when the display method is roll-angle independent.
The above has described the effects achieved by the roll-angle-dependent and roll-angle-independent display methods. Incidentally, an AR object may be a two-dimensional image or a three-dimensional object. The display of the AR object may be changed depending on which of the two it is. This will be described specifically with reference to FIG. 36.
As described above, a three-dimensional object whose positional relationship with the HMD 100 has changed differs from a two-dimensional image in that it needs to be updated. For example, the positional relationship between the HMD 100 and a three-dimensional object may change when the three-dimensional object moves or when the user moves. The following describes an example of the update operation of the AR object (hereinafter also simply referred to as the "AR object update operation") when the AR object is a three-dimensional object.
The above has mainly described examples of providing the user's field of view with an AR object, that is, an object corresponding to at least one of the yaw angle and the pitch angle. On the other hand, an object corresponding to neither the yaw angle nor the pitch angle (hereinafter also referred to as a "non-AR object") may also be provided to the user's field of view. The following describes an example of providing both an AR object and a non-AR object to the user's field of view.
(1)
A display control apparatus including a display control unit configured to display, in a user's field of view, an object corresponding to at least one of a yaw angle and a pitch angle of a display unit, wherein the display control unit is operable in a first mode in which the position of the object in the field of view does not depend on a roll angle of the display unit.
(2)
The display control apparatus according to (1), wherein the display control unit is operable in a second mode in which the position of the object in the field of view depends on the roll angle.
(3)
The display control apparatus according to (2), wherein the display control unit selects one of the first mode and the second mode as an operation mode.
(4)
The display control apparatus according to (3), wherein the display control unit selects one of the first mode and the second mode as the operation mode based on an operation by a user.
(5)
The display control apparatus according to (3), wherein the display control unit selects the first mode as the operation mode when the object can accept an operation by a user.
(6)
The display control apparatus according to (3), wherein the display control unit selects the second mode as the operation mode when the object is a three-dimensional object.
(7)
The display control apparatus according to (3), wherein the display control unit selects one of the first mode and the second mode as the operation mode based on a capability of the display control apparatus.
(8)
The display control apparatus according to any one of (1) to (7), wherein, when operating in the first mode, the display control unit does not make the orientation of the object in the field of view depend on the roll angle.
(9)
The display control apparatus according to any one of (1) to (8), wherein, when operating in the first mode, the display control unit makes the orientation of the object in the field of view depend on the roll angle.
(10)
The display control apparatus according to any one of (2) to (9), wherein, when operating in the second mode, the display control unit rotates the object in the field of view in the direction opposite to the roll angle by the same magnitude as the roll angle.
(11)
The display control apparatus according to (10), wherein, when operating in the second mode, if the roll angle exceeds a predetermined angle, the display control unit rotates the position of the object in the field of view in the direction opposite to the roll angle by the same magnitude as the predetermined angle.
(12)
The display control apparatus according to (11), wherein, when operating in the second mode, if the roll angle exceeds the predetermined angle, the display control unit gradually moves the position of the object in the field of view from a position rotated in the direction opposite to the roll angle by the same magnitude as the roll angle to a position rotated in the direction opposite to the roll angle by the same magnitude as the predetermined angle.
(13)
The display control apparatus according to (11) or (12), wherein the display control unit updates the predetermined angle when the magnitude of the roll angle has exceeded a threshold continuously for a predetermined time.
(14)
The display control apparatus according to (11) or (12), wherein the display control unit updates the predetermined angle when the form of the field of view satisfies a predetermined condition.
(15)
The display control apparatus according to any one of (2) to (14), wherein, when operating in the second mode, the display control unit makes the orientation of the object in the field of view depend on the roll angle.
(16)
The display control apparatus according to any one of (1) to (15), wherein the display control unit displays at least the object corresponding to the pitch angle in the field of view, and the pitch angle corresponding to the object is limited within a predetermined range.
(17)
The display control apparatus according to any one of (1) to (16), wherein, when the object is a three-dimensional object, the display control unit displays an object that has been subjected to image processing in the field of view.
(18)
The display control apparatus according to any one of (1) to (17), wherein, when the object is a two-dimensional image, the display control unit displays an object that has not been subjected to image processing in the field of view.
(19)
A display control method including displaying, in a user's field of view, an object corresponding to at least one of a yaw angle and a pitch angle of a display unit, the method being operable in a first mode in which the position of the object in the field of view does not depend on a roll angle of the display unit.
(20)
A program for causing a computer to function as a display control apparatus including a display control unit configured to display, in a user's field of view, an object corresponding to at least one of a yaw angle and a pitch angle of a display unit, wherein the display control unit is operable in a first mode in which the position of the object in the field of view does not depend on a roll angle of the display unit.
11R, 11L…display surfaces
12R, 12L…image generation units
20…detection unit
30…control unit
100…head-mounted display (HMD)
200…portable information terminal
311…coordinate setting unit
312…image management unit
313…coordinate determination unit
314…display control unit
A1-A4…target objects
B, B1-B4…objects
C0, C1…cylindrical coordinates (world coordinates)
V…field of view
U…user
Claims (20)
- A display control apparatus comprising a display control unit configured to display, in a user's field of view, an object corresponding to at least one of a yaw angle and a pitch angle of a display unit, wherein the display control unit is operable in a first mode in which the position of the object in the field of view does not depend on a roll angle of the display unit.
- The display control apparatus according to claim 1, wherein the display control unit is operable in a second mode in which the position of the object in the field of view depends on the roll angle.
- The display control apparatus according to claim 2, wherein the display control unit selects one of the first mode and the second mode as an operation mode.
- The display control apparatus according to claim 3, wherein the display control unit selects one of the first mode and the second mode as the operation mode based on an operation by a user.
- The display control apparatus according to claim 3, wherein the display control unit selects the first mode as the operation mode when the object can accept an operation by a user.
- The display control apparatus according to claim 3, wherein the display control unit selects the second mode as the operation mode when the object is a three-dimensional object.
- The display control apparatus according to claim 3, wherein the display control unit selects one of the first mode and the second mode as the operation mode based on a capability of the display control apparatus.
- The display control apparatus according to claim 1, wherein, when operating in the first mode, the display control unit does not make the orientation of the object in the field of view depend on the roll angle.
- The display control apparatus according to claim 1, wherein, when operating in the first mode, the display control unit makes the orientation of the object in the field of view depend on the roll angle.
- The display control apparatus according to claim 2, wherein, when operating in the second mode, the display control unit rotates the object in the field of view in the direction opposite to the roll angle by the same magnitude as the roll angle.
- The display control apparatus according to claim 10, wherein, when operating in the second mode, if the roll angle exceeds a predetermined angle, the display control unit rotates the position of the object in the field of view in the direction opposite to the roll angle by the same magnitude as the predetermined angle.
- The display control apparatus according to claim 11, wherein, when operating in the second mode, if the roll angle exceeds the predetermined angle, the display control unit gradually moves the position of the object in the field of view from a position rotated in the direction opposite to the roll angle by the same magnitude as the roll angle to a position rotated in the direction opposite to the roll angle by the same magnitude as the predetermined angle.
- The display control apparatus according to claim 11, wherein the display control unit updates the predetermined angle when the magnitude of the roll angle has exceeded a threshold continuously for a predetermined time.
- The display control apparatus according to claim 11, wherein the display control unit updates the predetermined angle when the form of the field of view satisfies a predetermined condition.
- The display control apparatus according to claim 2, wherein, when operating in the second mode, the display control unit makes the orientation of the object in the field of view depend on the roll angle.
- The display control apparatus according to claim 1, wherein the display control unit displays at least the object corresponding to the pitch angle in the field of view, and the pitch angle corresponding to the object is limited within a predetermined range.
- The display control apparatus according to claim 1, wherein, when the object is a three-dimensional object, the display control unit displays an object that has been subjected to image processing in the field of view.
- The display control apparatus according to claim 1, wherein, when the object is a two-dimensional image, the display control unit displays an object that has not been subjected to image processing in the field of view.
- A display control method comprising displaying, in a user's field of view, an object corresponding to at least one of a yaw angle and a pitch angle of a display unit, the method being operable in a first mode in which the position of the object in the field of view does not depend on a roll angle of the display unit.
- A program for causing a computer to function as a display control apparatus including a display control unit configured to display, in a user's field of view, an object corresponding to at least one of a yaw angle and a pitch angle of a display unit, wherein the display control unit is operable in a first mode in which the position of the object in the field of view does not depend on a roll angle of the display unit.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/529,660 US20170329480A1 (en) | 2014-12-04 | 2015-09-08 | Display control apparatus, display control method, and program |
EP15865988.8A EP3229104A4 (en) | 2014-12-04 | 2015-09-08 | Display control device, display control method, and program |
KR1020177014085A KR20170089854A (ko) | 2014-12-04 | 2015-09-08 | 표시 제어 장치, 표시 제어 방법 및 프로그램 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014245935A JP2016110319A (ja) | 2014-12-04 | 2014-12-04 | 表示制御装置、表示制御方法およびプログラム |
JP2014-245935 | 2014-12-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016088420A1 true WO2016088420A1 (ja) | 2016-06-09 |
Family
ID=56091378
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/075492 WO2016088420A1 (ja) | 2014-12-04 | 2015-09-08 | 表示制御装置、表示制御方法およびプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20170329480A1 (ja) |
EP (1) | EP3229104A4 (ja) |
JP (1) | JP2016110319A (ja) |
KR (1) | KR20170089854A (ja) |
WO (1) | WO2016088420A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108363486A (zh) * | 2017-01-27 | 2018-08-03 | 佳能株式会社 | 图像显示装置及方法、图像处理装置及方法和存储介质 |
WO2021131938A1 (ja) * | 2019-12-26 | 2021-07-01 | 株式会社コロプラ | プログラム、方法および情報処理装置 |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6867104B2 (ja) * | 2016-01-20 | 2021-04-28 | 株式会社コロプラ | フローティング・グラフィカルユーザインターフェース |
JP2018004756A (ja) * | 2016-06-28 | 2018-01-11 | 株式会社リコー | 情報表示システム |
US9971157B2 (en) | 2016-07-25 | 2018-05-15 | Colopl, Inc. | Display control method and system for executing the display control method |
JP6152997B1 (ja) * | 2016-07-25 | 2017-06-28 | 株式会社コロプラ | 表示制御方法および当該表示制御方法をコンピュータに実行させるためのプログラム |
WO2018156809A1 (en) * | 2017-02-24 | 2018-08-30 | Masimo Corporation | Augmented reality system for displaying patient data |
KR102567007B1 (ko) | 2017-02-24 | 2023-08-16 | 마시모 코오퍼레이션 | 의료 모니터링 데이터 표시 시스템 |
JP7265312B2 (ja) * | 2017-03-16 | 2023-04-26 | 株式会社デンソーウェーブ | 情報表示システム |
WO2018208616A1 (en) | 2017-05-08 | 2018-11-15 | Masimo Corporation | System for pairing a medical system to a network controller by use of a dongle |
US10445997B2 (en) * | 2017-06-20 | 2019-10-15 | International Business Machines Corporation | Facilitating a search of individuals in a building during an emergency event |
KR102374404B1 (ko) * | 2017-07-25 | 2022-03-15 | 삼성전자주식회사 | 콘텐트를 제공하기 위한 디바이스 및 방법 |
KR102373510B1 (ko) * | 2017-08-11 | 2022-03-11 | 삼성전자주식회사 | 디스플레이를 회전함에 따라 컨텐츠를 시각화 하는 디스플레이 장치 및 이의 제어 방법 |
US10026209B1 (en) | 2017-12-21 | 2018-07-17 | Capital One Services, Llc | Ground plane detection for placement of augmented reality objects |
US10002442B1 (en) | 2017-12-21 | 2018-06-19 | Capital One Services, Llc | Placement of augmented reality objects using a guide marker |
JP7210153B2 (ja) * | 2018-04-04 | 2023-01-23 | キヤノン株式会社 | 電子機器、電子機器の制御方法、プログラム、及び、記憶媒体 |
JP7121523B2 (ja) * | 2018-04-10 | 2022-08-18 | キヤノン株式会社 | 画像表示装置、画像表示方法 |
JP7300287B2 (ja) | 2019-03-20 | 2023-06-29 | 任天堂株式会社 | 画像表示システム、画像表示プログラム、表示制御装置、および画像表示方法 |
US11393197B2 (en) * | 2019-05-03 | 2022-07-19 | Cvent, Inc. | System and method for quantifying augmented reality interaction |
JP7443100B2 (ja) * | 2020-03-10 | 2024-03-05 | キヤノン株式会社 | 電子機器、電子機器の制御方法、プログラムおよび記憶媒体 |
EP4020398A1 (en) * | 2020-08-18 | 2022-06-29 | Unity IPR APS | Method and system for displaying a large 3d model on a remote device |
JP7476128B2 (ja) | 2021-03-11 | 2024-04-30 | 株式会社日立製作所 | 表示システムおよび表示装置 |
JPWO2023047865A1 (ja) * | 2021-09-27 | 2023-03-30 | ||
US20230343028A1 (en) * | 2022-04-20 | 2023-10-26 | Apple Inc. | Method and Device for Improving Comfortability of Virtual Content |
WO2023238264A1 (ja) * | 2022-06-08 | 2023-12-14 | マクセル株式会社 | 情報表示装置およびその表示方法 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014098564A (ja) * | 2012-11-13 | 2014-05-29 | Panasonic Corp | 情報表示装置 |
WO2014129204A1 (ja) * | 2013-02-22 | 2014-08-28 | ソニー株式会社 | ヘッドマウントディスプレイ |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6061064A (en) * | 1993-08-31 | 2000-05-09 | Sun Microsystems, Inc. | System and method for providing and using a computer user interface with a view space having discrete portions |
US5742264A (en) * | 1995-01-24 | 1998-04-21 | Matsushita Electric Industrial Co., Ltd. | Head-mounted display |
DE69904759T2 (de) * | 1998-12-17 | 2003-09-25 | Nec Tokin Corp | Orientierungswinkeldetektor |
JP4810295B2 (ja) * | 2006-05-02 | 2011-11-09 | キヤノン株式会社 | 情報処理装置及びその制御方法、画像処理装置、プログラム、記憶媒体 |
US8780014B2 (en) * | 2010-08-25 | 2014-07-15 | Eastman Kodak Company | Switchable head-mounted display |
WO2012048252A1 (en) * | 2010-10-07 | 2012-04-12 | Aria Glassworks, Inc. | System and method for transitioning between interface modes in virtual and augmented reality applications |
US20130342572A1 (en) * | 2012-06-26 | 2013-12-26 | Adam G. Poulos | Control of displayed content in virtual environments |
-
2014
- 2014-12-04 JP JP2014245935A patent/JP2016110319A/ja active Pending
-
2015
- 2015-09-08 WO PCT/JP2015/075492 patent/WO2016088420A1/ja active Application Filing
- 2015-09-08 US US15/529,660 patent/US20170329480A1/en not_active Abandoned
- 2015-09-08 KR KR1020177014085A patent/KR20170089854A/ko not_active Application Discontinuation
- 2015-09-08 EP EP15865988.8A patent/EP3229104A4/en not_active Withdrawn
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014098564A (ja) * | 2012-11-13 | 2014-05-29 | Panasonic Corp | 情報表示装置 |
WO2014129204A1 (ja) * | 2013-02-22 | 2014-08-28 | ソニー株式会社 | ヘッドマウントディスプレイ |
Non-Patent Citations (1)
Title |
---|
See also references of EP3229104A4 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108363486A (zh) * | 2017-01-27 | 2018-08-03 | 佳能株式会社 | 图像显示装置及方法、图像处理装置及方法和存储介质 |
US10726814B2 (en) | 2017-01-27 | 2020-07-28 | Canon Kabushiki Kaisha | Image display apparatus, image processing apparatus, image display method, image processing method, and storage medium |
WO2021131938A1 (ja) * | 2019-12-26 | 2021-07-01 | 株式会社コロプラ | プログラム、方法および情報処理装置 |
JP7458779B2 (ja) | 2019-12-26 | 2024-04-01 | 株式会社コロプラ | プログラム、方法および情報処理装置 |
Also Published As
Publication number | Publication date |
---|---|
EP3229104A4 (en) | 2018-08-08 |
JP2016110319A (ja) | 2016-06-20 |
EP3229104A1 (en) | 2017-10-11 |
US20170329480A1 (en) | 2017-11-16 |
KR20170089854A (ko) | 2017-08-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016088420A1 (ja) | 表示制御装置、表示制御方法およびプログラム | |
JP7268692B2 (ja) | 情報処理装置、制御方法及びプログラム | |
US10796669B2 (en) | Method and apparatus to control an augmented reality head-mounted display | |
US11151773B2 (en) | Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium | |
US20190094850A1 (en) | Techniques for image recognition-based aerial vehicle navigation | |
KR20180043609A (ko) | 디스플레이 장치 및 디스플레이 장치의 영상 처리 방법 | |
CN108351736B (zh) | 可穿戴显示器、图像显示装置和图像显示系统 | |
US20220253198A1 (en) | Image processing device, image processing method, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15865988 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2015865988 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20177014085 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15529660 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |