JP2017004491A - Floating graphical user interface

Floating graphical user interface

Info

Publication number: JP2017004491A
Application number: JP2016008456A
Authority: JP (Japan)
Prior art keywords: widget, field, view, virtual space, moving
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Japanese (ja)
Other versions: JP2017004491A5
Inventor: Shuhei Terahata (寺畑 集平)
Original assignee: Colopl Inc. (株式会社コロプラ)
Application filed by Colopl Inc.
Priority to JP2016008456A
Publication of JP2017004491A
Publication of JP2017004491A5
Family has litigation


Abstract

PROBLEM TO BE SOLVED: To provide an input device that displays widgets in a virtual space while obstructing the view as little as possible.
SOLUTION: An input device for performing input using widgets placed in a virtual space includes: a gazing point moving unit 1620 for moving a gazing point within the virtual space; a widget selection unit 1641 configured to determine whether the gazing point overlaps a widget and, if so, to select the widget; and an input unit 1642 for performing the input corresponding to the selected widget.
SELECTED DRAWING: Figure 16

Description

  The present invention relates to an apparatus, a method, a computer program, and a recording medium on which the program is recorded, for displaying a graphical user interface suitable for an operator performing operations in a virtual space such as a virtual reality (VR) space or an augmented reality (AR) space, and is particularly suited to an immersive virtual space presented through a head mounted display (HMD) worn on the user's head.

  A head mounted display (HMD) is known that is worn on the user's head and can present images of a virtual space to the user on a display placed in front of the user's eyes. Patent Document 1 displays a large number of icons, menus, and other widgets in a region corresponding to the periphery of the visual field using a perspective projection method, thereby keeping the central visual field clear while still displaying many widgets.

Japanese Patent Laid-Open No. 2015-032085

  In the method described in Patent Document 1, the effective visual field in the virtual space is limited to the central part of the visual field. An object of the present invention is to display widgets in a virtual space so as to obstruct the visual field as little as possible.

  According to the present invention, there is provided an input method for performing input using a widget placed in a virtual space, the method comprising: a gazing point moving step of moving a gazing point in the virtual space; a widget selection step of determining whether the widget and the gazing point overlap and, if they overlap, selecting the widget; and an input step of performing the input corresponding to the selected widget.

  According to the present invention, there is also provided an input device that performs input using a widget placed in a virtual space, the device comprising: a gazing point moving unit that moves a gazing point in the virtual space; a widget selection unit that determines whether the widget and the gazing point overlap and, if they overlap, selects the widget; and an input unit that performs the input corresponding to the selected widget.

According to the present invention, the effective visual field in the virtual space can be widened.
Other features and advantages of the invention will be apparent from the following description of the embodiments, the accompanying drawings, and the claims.

FIG. 1 shows an example of a miniaturized widget displayed so as to float, according to the present invention.
FIG. 2 illustrates the elements displayed in FIG. 1 and an example of the initial positions of the widget 201 and the cursor 202.
FIG. 3 shows the state after the visual field has been moved upward (+Y direction) and slightly to the right (+X direction) from the state of FIG. 2.
FIG. 4 shows the state after the visual field has been moved to the right (+X direction) from the state of FIG. 2.
FIG. 5 shows the state after the visual field has been moved further to the right (+X direction) from FIG. 4.
FIG. 6 shows the widget 201 having returned to its initial position in the field of view after FIG. 5.
FIG. 7 shows the state after the visual field has been moved downward (−Y direction) from the state of FIG. 2.
FIG. 8 shows the state after the visual field has been moved further downward (−Y direction) from FIG. 7.
FIG. 9 shows the widget 201 completely out of the field of view and no longer visible as a result of moving the field of view further downward (−Y direction) from FIG. 8.
FIG. 10 shows a virtual sphere 1001 centered on the operator.
FIG. 11 shows the processing for movement of the widget in the left-right direction (X direction).
FIG. 12 shows the processing performed when the widget 201 is selected.
FIG. 13 gives an overview of a system 1300 for displaying the widget 201.
FIG. 14 illustrates the angle information data that can be detected by the tilt sensor of the head mounted display (HMD) 1310.
FIG. 15 shows the infrared-emitting detection points provided on the head mounted display (HMD) 1310 for the position tracking camera (position sensor) 1330.
FIG. 16 illustrates the configuration of the main functional components for displaying the widget 201.

[Description of Embodiments of the Present Invention]
First, embodiments of the present invention will be listed and described. A program according to an embodiment of the present invention has the following configuration.

(Item 1)
An input method for performing input using a widget placed in a virtual space, comprising: a gazing point moving step of moving a gazing point in the virtual space; a widget selection step of determining whether the widget and the gazing point overlap and, if they overlap, selecting the widget; and an input step of performing the input corresponding to the selected widget.

  According to the input method of this item, the effective visual field in the virtual space can be widened. By using the input method of this item, the field of view can be kept clear by means of a floating graphical user interface, that is, a miniaturized widget displayed so as to float in the virtual space. Moreover, since input is performed using the ever-present operation of moving the line of sight in the virtual space, no special operation is needed to perform input. The effect is particularly remarkable when the line of sight in the virtual space is moved using the posture data of the head mounted display (HMD) itself.

(Item 2)
The input method according to item 1, wherein the gazing point moving step is part of a visual field moving step of moving the gazing point by moving the visual field with respect to the virtual space.

(Item 3)
The input method according to item 2, wherein the visual field moving step moves the visual field with respect to the virtual space using the measured movement of the operator's head.

(Item 4)
The input method according to item 2 or 3, further comprising: an initialization step of placing the widget at a position (Xvs0, Yvs0, Zvs0) in the virtual space corresponding to an initial position (Xfv0, Yfv0) in the field of view, and initializing a moving velocity vector V of the widget in the virtual space to 0; a widget moving step of moving the widget in the virtual space at the moving velocity vector V; a widget moving velocity reinitialization step of reinitializing the moving velocity vector V to 0 when the position of the widget in the field of view matches its initial position in the field of view; a widget out-of-field determination step of determining that the widget is positioned outside the field of view; and a widget moving velocity setting step of giving the moving velocity vector V, when the widget out-of-field determination step determines that the widget is positioned outside the field of view, a value that returns the widget to its initial position within the moved field of view.
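  As an illustration only, the following Python sketch shows how the steps of this item fit together as a per-frame cycle, simplified to one dimension (the X direction). The names, the gain, and the numeric values are assumptions of the example, not code from the patent.

```python
# Per-frame cycle of item 4, simplified to the X direction.
# All names and numeric values are illustrative assumptions.
X_FV0 = 0.0        # widget's initial position in the field of view
FOV_HALF = 1.0     # the field of view spans [view_x - 1, view_x + 1]
MAX_SPEED = 2.0    # finite cap on |V|

widget_x = X_FV0   # initialization step: widget placed at Xvs0
view_x = 0.0       # center of the field of view in virtual space
v = 0.0            # moving velocity vector V, initialized to 0

def frame(dt, view_dx):
    global widget_x, view_x, v
    widget_x += v * dt                          # widget moving step
    if v != 0.0 and abs((widget_x - view_x) - X_FV0) < 1e-3:
        v, widget_x = 0.0, view_x + X_FV0       # velocity reinitialization step
    view_x += view_dx                           # visual field moving step
    out = abs(widget_x - view_x) > FOV_HALF     # out-of-field determination step
    if v != 0.0 or out:                         # velocity setting step
        target = view_x + X_FV0                 # virtual-space point under Xfv0
        v = max(-MAX_SPEED, min(MAX_SPEED, 6.0 * (target - widget_x)))

for _ in range(60):
    frame(1 / 60, view_dx=0.03)  # pan right until the widget falls out of view
for _ in range(60):
    frame(1 / 60, view_dx=0.0)   # hold still; the widget floats back
print(round(widget_x - view_x, 2))  # ~0.0: back at Xfv0 in the field of view
```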

(Item 5)
The input method according to item 4, wherein: the position of the widget in the field of view matching its initial position, as determined in the widget moving velocity reinitialization step, means that the position of the widget in the left-right direction (X direction) in the field of view coincides with the initial position Xfv0 in the field of view; the widget being positioned outside the field of view, as determined in the widget out-of-field determination step, means that the widget has come to be positioned outside the field of view in the left-right direction (X direction); and in the widget moving velocity setting step, giving the moving velocity vector V a value that returns the widget to its initial position within the moved field of view means giving the moving velocity vector V a value that moves the widget in the virtual space so as to return to the initial position Xfv0 only in the left-right direction (X direction) in the field of view.

(Item 6)
The input method according to any one of items 1 to 5, wherein the input step performs the input corresponding to the selected widget when it determines that the widget has been gazed at continuously for a predetermined time or longer.

(Item 7)
A program for executing the method according to any one of items 1 to 6.
(Item 8)
A recording medium on which a program for executing the method according to any one of items 1 to 6 is recorded.

(Item 9)
An input device that performs input using a widget placed in a virtual space, comprising: a gazing point moving unit that moves a gazing point in the virtual space; a widget selection unit that determines whether the widget and the gazing point overlap and, if they overlap, selects the widget; and an input unit that performs the input corresponding to the selected widget.

  According to the input device of this item, the effective visual field in the virtual space can be widened. By using the input device of this item, the field of view can be kept clear by means of a floating graphical user interface, that is, a miniaturized widget displayed so as to float in the virtual space. Moreover, since input is performed using the ever-present operation of moving the line of sight in the virtual space, no special operation is needed to perform input. The effect is particularly remarkable when the line of sight in the virtual space is moved using the posture data of the head mounted display (HMD) itself.

(Item 10)
The input device according to item 9, wherein the gazing point moving unit is part of a visual field moving unit that moves the gazing point by moving the visual field with respect to the virtual space.

(Item 11)
The input device according to item 10, wherein the visual field moving unit moves the visual field with respect to the virtual space using the measured movement of the operator's head.

(Item 12)
The input device according to item 10 or 11, further comprising: an initialization unit that places the widget at a position (Xvs0, Yvs0, Zvs0) in the virtual space corresponding to an initial position (Xfv0, Yfv0) in the field of view, and initializes a moving velocity vector V of the widget in the virtual space to 0; a widget moving unit that moves the widget in the virtual space at the moving velocity vector V; a widget moving velocity reinitialization unit that reinitializes the moving velocity vector V to 0 when the position of the widget in the field of view matches its initial position in the field of view; a widget out-of-field determination unit that determines that the widget is positioned outside the field of view; and a widget moving velocity setting unit that gives the moving velocity vector V, when the widget out-of-field determination unit determines that the widget is positioned outside the field of view, a value that returns the widget to its initial position within the moved field of view.

(Item 13)
The input device according to item 12, wherein: the position of the widget in the field of view matching its initial position, as determined by the widget moving velocity reinitialization unit, means that the position of the widget in the left-right direction (X direction) in the field of view coincides with the initial position Xfv0 in the field of view; the widget being positioned outside the field of view, as determined by the widget out-of-field determination unit, means that the widget has come to be positioned outside the field of view in the left-right direction (X direction); and the widget moving velocity setting unit giving the moving velocity vector V a value that returns the widget to its initial position within the moved field of view means giving the moving velocity vector V a value that moves the widget in the virtual space so as to return to the initial position Xfv0 only in the left-right direction (X direction) in the field of view.

(Item 14)
The input device according to any one of items 9 to 13, wherein the input unit performs the input corresponding to the selected widget when it is determined that the widget has been gazed at continuously for a predetermined time or longer.

[Details of Embodiments of the Present Invention]
Embodiments of the present invention will be described below with reference to the drawings. This embodiment assumes an immersive virtual space in which a head mounted display (HMD) equipped with various sensors (for example, an acceleration sensor and an angular velocity sensor) and capable of measuring its own posture data is worn, and the image displayed in the virtual space is scrolled in accordance with the posture data to realize movement of the line of sight within the virtual space. However, the invention can also be applied to a configuration in which the virtual space is displayed on an ordinary display and the line of sight in the virtual space is moved by input from a keyboard, mouse, joystick, or the like.

In the drawings, the same reference numerals denote the same components.
FIG. 1 shows an example of a miniaturized widget displayed so as to float, according to the present invention, and FIG. 2 illustrates the elements displayed in FIG. 1 and an example of the initial positions of the widget 201 and the cursor 202.

The widget 201 is displayed so as to float in the virtual space.
In FIG. 1, the widget 201 takes the form of a drone carrying buttons; however, any design that would not look out of place floating in the target virtual space can be used, such as a UFO, a robot, an insect, a bird, a fairy, or a ghost.

  In FIG. 1, the widget 201 has two buttons to be operated, but the present invention is not limited to two. An appropriate number of buttons may be provided according to the size of the field of view to be kept clear and the kinds of operations required. Furthermore, the buttons need not always be displayed: as with the drop-down lists, pop-up menus, and context menus of an ordinary graphical user interface, the buttons to be operated may be expanded and displayed once it is detected that the widget 201 has been selected.

The cursor 202 indicates the place the user is gazing at in the virtual space, that is, the gazing point.
Even when a head mounted display (HMD) capable of measuring posture data is used, the posture data usually captures the movement of the head of the operator wearing the HMD. In this case, the visual field in the virtual space is controlled to move based on the posture data, and the gazing point itself is normally fixed at the center of the visual field, that is, the center of the screen. In this case, omitting the display of the cursor 202 does not make the operation feel unnatural.

  Note that this gazing point is the gazing point used to operate the widget 201. It therefore does not necessarily have to be placed at the center of the field of view, that is, at the center of the screen, and can be placed at a position offset from the center. In that case, the operation is easier if the cursor 202 is displayed. Likewise, when the gazing point can be moved independently of the movement of the visual field, by detecting eye movement or by using some auxiliary input, displaying the cursor 202 makes operation easier.

  Further, even when the cursor 202 is displayed, it may be displayed only when the widget 201 and the gazing point indicated by the cursor 202 are close to each other, in consideration of the cursor 202 obstructing the visual field.

  The window 203 is a window that exists at a fixed position in the virtual space and is drawn for convenience to show the relative positional relationship between the virtual space and the visual field 204. As shown in FIG. 2, the horizontal direction of the visual field 204 is referred to as the X direction, the vertical direction as the Y direction, and the depth direction as the Z direction.

  Hereinafter, the movement of the widget 201 in the virtual space will be described with reference to FIGS. 2 to 9. The description assumes that the gazing point is fixed at the center of the visual field and that the cursor 202 is displayed fixed at the center of the screen. However, it applies equally when the gazing point is fixed at a position offset from the center of the visual field, when the gazing point can be moved while the visual field is held fixed, when the display of the cursor 202 is omitted, or when the cursor 202 is displayed only when it is close to the widget 201. Accordingly, where the cursor 202 is not displayed, references to the cursor 202 may be read as references to the gazing point.

As described above, FIG. 2 shows an example of the initial positions of the widget 201 and the cursor 202.
In the example shown in FIG. 2, in the initial state the widget 201 is positioned approximately at the center of the visual field in the left-right direction (X direction) but above the center in the vertical direction (Y direction). Its initial position with respect to the visual field is (Xfv0, Yfv0), and its position in the coordinate system of the virtual space is (Xvs0, Yvs0, Zvs0).

  The initial position in the visual field at which the widget 201 is displayed is not limited to this. Considering only that the important part of the field of view should not be blocked by the widget 201, it is better for the widget 201 to be displayed away from the center of the field of view; there is no problem even if it is displayed in a corner of the field of view. However, as shown below, the widget 201 is operated by moving it so that it overlaps the cursor 202 at the center of the field of view, so the farther the widget 201 is from the center of the field of view, the longer the operation takes.

  FIG. 3 shows the state after the visual field has been moved upward by ΔY (+Y direction) and slightly to the right by ΔX (+X direction) from the state of FIG. 2. The widget 201 basically floats at a fixed position in the virtual space: in FIG. 3 it remains at (Xvs0, Yvs0, Zvs0) in the virtual-space coordinate system, but because the visual field has moved upward and slightly to the right, the widget 201 has moved to (Xfv0 − ΔX, Yfv0 − ΔY) in the coordinate system of the visual field. Since the window 203 does not move in the virtual space, comparing FIG. 2 and FIG. 3 shows that the positional relationship between the widget 201 and the window 203 has not changed.
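  The coordinate relationship described here is a simple translation between the virtual-space frame and the field-of-view frame. A minimal sketch, assuming a 2D simplification and illustrative names:

```python
# The widget stays fixed in virtual-space coordinates, so when the view moves
# by (dX, dY) its field-of-view coordinates shift by (-dX, -dY), as in FIG. 3.
X_FV0, Y_FV0 = 0.0, 0.3           # initial widget position in the field of view
view_origin = [0.0, 0.0]          # field-of-view origin in virtual space
widget_vs = [X_FV0, Y_FV0]        # widget fixed at (Xvs0, Yvs0) in virtual space

def field_position(widget, origin):
    """Field-of-view coordinates of a point fixed in virtual space."""
    return (widget[0] - origin[0], widget[1] - origin[1])

dX, dY = 0.1, 0.2                 # move the view right by dX and up by dY
view_origin = [view_origin[0] + dX, view_origin[1] + dY]
print(field_position(widget_vs, view_origin))  # (Xfv0 - dX, Yfv0 - dY)
```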

  FIG. 3 also shows that at this point the cursor 202 actually overlaps the button on the right side of the widget 201, and that button has changed its display to indicate that it has been selected. To perform the operation corresponding to that button, this selection must be confirmed. In this example, a natural method is to hold the visual field still so that the cursor 202 remains over the right button of the widget 201 for a certain period of time, but various other methods can be used, such as an auxiliary input from a keyboard or the like, or a button on the widget 201 for confirming the selection.

  In FIG. 3, overlapping the cursor 202 with the button on the right side of the widget 201 simply selects that button. As with drop-down lists, pop-up menus, and context menus, it is also possible to first detect that the widget 201 itself has been selected and then expand and display the buttons to be operated.

  Further, as described below, considering that the widget 201 can move in the virtual space, the determination that the widget 201 has been selected may require, in addition to the condition that the widget 201 and the cursor 202 overlap, the condition that the moving velocity of the widget 201 in the virtual space is 0.

Next, how the widget 201 is displayed when the visual field is moved in the left-right direction (X direction) will be described with reference to FIGS. 4 to 6.
FIG. 4 shows the state after the visual field has been moved to the right (+X direction) from the state of FIG. 2; as a result, the widget 201 has moved to the left (−X direction) within the visual field. Since, as already described, the widget 201 basically floats at a fixed position in the virtual space, comparing FIG. 2 and FIG. 4 shows that the relative positional relationship between the widget 201 and the window 203 has not changed.

  FIG. 5 shows the state after the field of view has been moved further to the right (+X direction) from FIG. 4. At this point, the widget 201 has moved further to the left (−X direction) and part of it lies outside the field of view. Up to this point the widget 201 has floated at a fixed position in the virtual space, but once it is determined to be outside the field of view in the left-right direction (X direction), it starts moving in the virtual space so as to return to its initial position in the field of view. The condition for determining that the widget 201 is outside the field of view may be any of the following: part of the widget 201 lies outside the field of view, half of it lies outside, or the widget 201 lies completely outside the field of view.
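  The three alternative out-of-field conditions can all be expressed with the fraction of the widget's width that lies outside the field of view. A sketch under that assumption (1D, X direction, illustrative names):

```python
# Fraction of the widget's width lying outside the field of view, so any of
# the thresholds the text mentions (partly out, half out, completely out)
# can be applied.
def fraction_outside_x(widget_left, widget_width, fov_left, fov_width):
    """Fraction of the widget's width outside [fov_left, fov_left + fov_width]."""
    visible = max(0.0, min(widget_left + widget_width, fov_left + fov_width)
                  - max(widget_left, fov_left))
    return 1.0 - visible / widget_width

OUT_THRESHOLD = 0.5  # >0: partly out, 0.5: half out, 1.0: completely out

out = fraction_outside_x(-0.8, 0.4, -0.5, 1.0) >= OUT_THRESHOLD
print(out)  # True: three quarters of the widget lies left of the field
```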

  Specifically, the initial-state position in the field of view to which the widget returns at this time is the initial position Xfv0 in the left-right direction (X direction) in the field of view. Even if the position in the vertical direction (Y direction) differs from the initial-state position, the widget should move in the virtual space so as to return to the initial position Xfv0 in the left-right direction (X direction) in the field of view while keeping the difference in the vertical direction (Y direction).

  In order for the widget 201 to appear to actually float in the virtual space, the movement in the virtual space back to the initial position in the visual field is preferably performed at a finite velocity Vx, so that the way the widget catches up visibly changes depending on whether the visual field moves slowly or quickly; however, the movement may also be instantaneous.
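  One simple way to realize such a finite-velocity return (an assumption for illustration; the patent only requires that the velocity be finite) is per-frame proportional control with a speed cap, so the lag visibly grows when the field of view moves fast:

```python
# Move toward the target at a speed proportional to the remaining distance,
# capped at max_speed -- one of many ways to get the "catch-up" look.
def step_toward(x, target, max_speed, dt, gain=4.0):
    v = max(-max_speed, min(max_speed, gain * (target - x)))
    return x + v * dt

x = -0.8                 # widget X in the field of view, outside the field
for _ in range(60):      # one second at 60 frames per second
    x = step_toward(x, 0.0, max_speed=2.0, dt=1 / 60)
print(round(x, 3))       # close to the initial position Xfv0 = 0.0
```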

  FIG. 6 shows the widget 201 having returned to its initial position in the field of view, in this case the initial position Xfv0 in the left-right direction (X direction). Since the widget 201 has moved in the virtual space, at this point it is located at a place different from its initial virtual-space position (Xvs0, Yvs0, Zvs0). Comparing the positional relationship between the widget 201 and the window 203 in FIG. 2 with that in FIG. 6 shows that the positional relationship between the widget 201 and the virtual space has changed by the time of FIG. 6.

Next, how the widget 201 is displayed when the visual field is moved in the vertical direction (Y direction) will be described with reference to FIGS. 7 to 9.
FIG. 7 shows the state after the visual field has been moved downward (−Y direction) from the state of FIG. 2; as a result, the widget 201 has moved upward (+Y direction) within the visual field. As in FIG. 4, comparing FIG. 2 and FIG. 7 shows that the relative positional relationship between the widget 201 and the window 203 has not changed.

  FIG. 8 shows the state after the field of view has been moved further downward (−Y direction) from FIG. 7. At this point, part of the widget 201 lies outside the field of view. When the widget 201 was positioned outside the field of view in the left-right direction (X direction), as in FIG. 5, it started moving in the virtual space so as to return to its initial position in the field of view; in this embodiment, no such movement is performed in the vertical direction (Y direction).

FIG. 9 shows that the widget 201 has gone completely out of the field of view and is no longer visible as a result of moving the field of view further downward (−Y direction) from FIG. 8.
The processing performed when the widget 201 is positioned outside the field of view differs between movement of the visual field in the left-right direction (X direction), described with FIGS. 4 to 6, and movement in the vertical direction (Y direction), described with FIGS. 7 to 9, for the following reason.

  Movement of the human visual field in the left-right direction (X direction), that is, movement of the yaw angle rotating about the Y axis in FIG. 14, can be produced not only by twisting the neck left and right but also by turning the whole body left and right. After the visual field has moved in the left-right direction (X direction) by turning the whole body, a person can keep turning further left or right about the newly faced direction. In contrast, movement of the visual field in the vertical direction (Y direction), that is, movement of the pitch angle rotating about the X axis in FIG. 14, is produced by bending the neck up and down; this is hard to sustain for long, the field of view cannot be moved still further in the vertical direction (Y direction) about a newly faced position, and the movement itself stops at a temporary posture. That is, because the human motions differ between moving the visual field in the vertical direction (Y direction) and in the left-right direction (X direction), it is natural to make the movement of the widget 201 differ between the two cases.

  Therefore, for example, if the virtual space to be displayed is a zero-gravity space and, as movement of the pitch angle rotating about the X axis in FIG. 14, the whole body is allowed to somersault forward and backward, then even if the visual field is moved in the vertical direction (Y direction) and the widget 201 goes outside the visual field, it is more natural to configure the widget 201 to return to its initial position in the visual field, just as when the visual field moves in the left-right direction (X direction) as described with FIGS. 4 to 6.

In addition to the left-right direction (X direction) and the vertical direction (Y direction), the visual field in the virtual space can also move by a roll angle rotating about the direction of the line of sight, that is, the depth direction (Z direction) in FIG. 14.

  The movement of the roll angle rotating about the depth direction (Z direction) corresponds in practice to tilting the neck left and right. Like bending the neck back and forth to move the field of view in the vertical direction (Y direction), it is subject to the same restriction as pitch-angle movement, and the movement itself remains temporary. Therefore, just as when the field of view moves in the vertical direction (Y direction), even if the widget 201 comes to be positioned outside the field of view as a result of roll-angle rotation about the Z axis, it is not necessary to move the widget 201 in the virtual space.

  On the other hand, as discussed for movement of the field of view in the vertical direction (Y direction), if the virtual space to be displayed is a zero-gravity space and, as roll-angle movement about the depth direction (Z direction), the whole body is allowed to cartwheel left and right, then the position of the operator in the virtual space and the rotation of the widget 201 about the Z axis may be held in a fixed relative relationship, with the widget 201 moved in the virtual space so as to follow the operator's movement. Moving the widget 201 in the virtual space in this way also means that the widget 201 is displayed at a fixed position in the field of view, so that the widget 201 does not move out of the field of view except transiently, and there is no need to make that determination and take special measures.

  So far, for ease of understanding, the movement of the visual field has been described as being due to movement of the yaw angle, pitch angle, and roll angle in FIG. 14. In reality, however, both the operator and the widget 201 have positions in the virtual space, so the case where the operator moves around in the virtual space must also be considered.

First, consider the case where the operator moves in the Z-axis direction shown in FIG. 14.
Since the widget 201 is an operation target, it is desirable that its apparent size does not change. To achieve this, as shown in FIG. 10, the widget 201 must lie on a virtual sphere 1001 centered on the operator in the virtual space, so that the distance between the operator and the widget 201 in the virtual space is maintained.

  Hereinafter, to simplify the description, it is assumed that the widget 201, the cursor 202, and the operator are in the positional relationship illustrated in FIG. 10, and that the operator then has a field of view like that of FIG. 2.

  When the operator moves in the virtual space along the Z axis connecting the operator and the cursor 202, the virtual sphere 1001 also moves along that Z axis. To keep the widget 201 on the virtual sphere 1001, it is preferable to have it follow the operator's movement in the Z-axis direction. In this case as well, it is desirable to perform the movement at a finite velocity, so that the widget 201 appears to actually float in the virtual space and the way it follows changes depending on whether the operator moves slowly or quickly; however, the movement may also be instantaneous.

  In addition, since it looks fussy for the widget 201 to keep making small movements, this following may be performed only after the operator has moved in the Z-axis direction by more than a certain amount, and any unfollowed movement of the operator that remains may be dealt with collectively the next time the widget 201 needs to move.
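  This "follow only after a certain amount of movement" rule is a dead zone with collective catch-up. A sketch under that assumption (illustrative names and threshold, not from the patent):

```python
# The widget only follows once the accumulated operator movement along Z
# exceeds a threshold, and then catches up on all of it at once.
DEAD_ZONE = 0.25  # operator movement ignored before following (assumed units)

class ZFollower:
    def __init__(self, widget_z):
        self.widget_z = widget_z
        self.pending = 0.0   # operator movement not yet followed

    def operator_moved(self, dz):
        self.pending += dz
        if abs(self.pending) > DEAD_ZONE:
            self.widget_z += self.pending  # catch up collectively
            self.pending = 0.0
        return self.widget_z

f = ZFollower(widget_z=2.0)
print(f.operator_moved(0.1))   # 2.0  (within the dead zone, no follow)
print(f.operator_moved(0.2))   # ~2.3 (threshold exceeded, catches up fully)
```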

  Moving the widget 201 in the virtual space in this way means that the widget 201 is effectively displayed at a fixed position in the field of view. The widget 201 therefore does not go out of the field of view except transiently, when the field of view moves toward the widget 201 in the Z-axis direction too fast for the widget 201 to follow in the virtual space and it temporarily leaves the field of view. Even in this transient case, the widget 201 returns into the field of view once the movement of the field of view stops, so there is no need to determine whether the widget 201 is inside or outside the field of view and perform special processing.

  Next, consider the case where the operator moves in the X-axis and Y-axis directions shown in FIG. 14. If the widget 201 lies on a virtual sphere 1001 centered on the operator, as shown in FIG. 10, then when the operator moves in the left-right direction (X direction) or the vertical direction (Y direction), the virtual sphere 1001 itself moves, and as a result the widget 201 comes off the virtual sphere 1001.

  As described above, when the widget 201 goes out of the field of view as a result of the field of view moving in the left-right direction (X direction), the widget 201 moves so as to return to its initial position in the field of view. When the field of view moves in the left-right direction (X direction) or the vertical direction (Y direction) because the operator has moved in the X-axis and Y-axis directions, then in addition to the widget movement already described, it is necessary to find the straight line connecting the widget 201 and the operator's new position, and to move the widget 201 to the position in the virtual space where that straight line intersects the moved virtual sphere 1001.

  In this case as well, it is desirable to perform the movement at a finite velocity, so that the widget 201 appears to actually float in the virtual space and the way it follows changes depending on whether the operator moves slowly or quickly; however, the movement may also be instantaneous.

  In this case as well, as with the operator's movement in the Z-axis direction, it looks fussy for the widget 201 to keep making small movements, so the following may be performed only after the operator has moved by more than a certain amount, and any unfollowed movement of the operator that remains may be dealt with collectively the next time the widget 201 needs to move.

  FIGS. 11 and 12 are flowcharts showing the processing that realizes the behavior of the widget 201 described so far. Note that these flowcharts show only the basic processing; for example, the processing for movement in the Z-axis direction and the processing that prevents the widget 201 from coming off the virtual sphere 1001 are omitted.

FIG. 11 shows the processing for moving the visual field 204 and moving the widget 201.
Steps S1101 and S1102 are initialization steps for displaying the widget 201: the widget 201 is placed at the virtual-space position (Xvs0, Yvs0, Zvs0) corresponding to its position (Xfv0, Yfv0) in the field of view, the entire visual field is drawn, and the moving velocity vector V of the widget 201 in the virtual space is initialized to 0.

  Step S1103 is a widget moving step of moving the widget 201 in the virtual space at the moving velocity vector V. In this embodiment, the moving velocity vector V = 0 in most states, including the initial state, and in those states this step does nothing. However, when a value other than 0 has been set in the moving velocity vector V by steps S1107 to S1109 described later, the widget 201 is simply moved in the virtual space by VΔt so as to return to its initial position in the field of view.

  Steps S1104 and S1105 are widget moving velocity reinitialization steps that detect that the widget 201 has returned to its initial position in the field of view and reinitialize the moving velocity vector V to 0.

  Step S1106 is a visual field moving step in which the visual field is moved in the virtual space in accordance with the operator's actions and the visual field is redrawn accordingly. In this visual field moving step, the gazing point is moved by controlling the visual field to move in the virtual space. When the gazing point can be moved independently of the movement of the visual field, by detecting eye movement or by using some auxiliary input, that processing may also be performed together in this visual field moving step. Whether based on movement of the visual field or on movement of the eyeball, the portion of this visual field moving step related to moving the gazing point can also be called a gazing point moving step.
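  As a small illustration of the gazing point handled in this step (assumed names, not from the patent): the gazing point is the center of the visual field unless an independent eye-tracking or auxiliary-input offset is available.

```python
# The gazing point defaults to the center of the visual field; an eye tracker
# or auxiliary input, when present, can move it independently of the view.
def gazing_point(view_center, eye_offset=None):
    if eye_offset is None:
        return view_center                     # fixed at the screen center
    return (view_center[0] + eye_offset[0],
            view_center[1] + eye_offset[1])    # moved independently

print(gazing_point((0.0, 0.0)))                # (0.0, 0.0)
print(gazing_point((0.0, 0.0), (0.1, -0.05)))  # offset by eye movement
```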

  This visual field moving step is desirably configured to detect the movement of the operator's head measured by the head mounted display (HMD) and to control the visual field in the virtual space to move based on the detected posture data.

  Step S1107 determines whether the widget 201 is moving. If the widget 201 is moving, the field of view may move during that movement, so the position in the virtual space corresponding to the initial position in the field of view changes; since a new moving velocity vector V must be set to deal with this, the process proceeds to step S1109.

  Step S1108 is a widget out-of-field determination step that determines whether the position of the widget 201 in the left-right direction (X direction) in the field of view is outside the field of view. As described above, the condition for determining that the widget 201 is outside the field of view may be any of the following: part of the widget 201 lies outside the field of view, half of it lies outside, or the widget 201 lies completely outside the field of view.

  Step S1109 is a widget moving velocity setting step in which a value is set in the moving velocity vector V in order to return the widget 201 to its initial position in the field of view. As already described, this moving velocity vector V is used in step S1103 to move the widget 201 in the virtual space so that it returns to its initial position in the field of view. The direction of the moving velocity vector V is therefore the direction from the widget 201's current position in the virtual space toward the virtual-space position corresponding to the initial position in the current field of view. In order for the widget 201 to appear to actually float in the virtual space, the value set in the moving velocity vector V should be determined so that the magnitude of the vector does not exceed a certain finite value.
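  A sketch of step S1109 under these constraints (illustrative names and gain, not from the patent): the velocity points from the widget's current virtual-space position toward the virtual-space position corresponding to the initial field-of-view position, with its magnitude capped at a finite maximum.

```python
# Velocity toward the target point, with a finite, capped magnitude.
import math

def set_return_velocity(widget_pos, target_pos, max_speed, gain=3.0):
    d = [t - w for t, w in zip(target_pos, widget_pos)]
    dist = math.sqrt(sum(c * c for c in d))
    if dist == 0.0:
        return [0.0, 0.0, 0.0]
    speed = min(gain * dist, max_speed)   # finite, capped magnitude
    return [c / dist * speed for c in d]  # direction toward the target

V = set_return_velocity([1.5, 0.3, 2.0], [0.2, 0.3, 2.0], max_speed=2.0)
print(V)  # points in -X toward the initial position, |V| <= 2.0
```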

The processing up to step S1109 concerns the movement of the visual field 204 and the movement of the widget 201.
Next, the processing when the widget 201 is selected will be described with reference to FIG. 12.

  In step S1201, it is determined whether the widget 201 and the cursor 202 overlap. Since the cursor 202 is at the gazing point, it is determined whether or not the widget 201 is at the position of the gazing point.

  If it is determined in step S1201 that the widget 201 and the cursor 202 overlap, the display of the widget 201 is changed to a selected state in step S1202. Various displays of the selected state are conceivable, such as highlighting, a change of color, or a change to another display mode. At this time, as with the drop-down lists, pop-up menus, and context menus of an ordinary graphical user interface, the buttons or other elements to be operated may also be expanded and displayed.

Steps S1201 and S1202 constitute a widget selection step.
In step S1203, it is determined whether the widget 201 has been selected continuously for a certain time or longer. If it has, step S1205 is an input step that executes the operation corresponding to the selected widget 201. If it has not been selected for the certain time, the process returns directly to step S1103.

  Step S1204 is reached both when it is determined in step S1201 that the widget 201 and the cursor 202 do not overlap and after the operation corresponding to the widget 201 has been executed in step S1205; in it, the widget 201 is returned to the non-selected display state, and the process returns to step S1103.
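  Putting steps S1201 to S1205 together, the selection logic is a small dwell-time state machine. The sketch below also includes, as an option, the velocity-0 condition suggested earlier; the dwell length and all names are assumptions of the example.

```python
# Dwell-based gaze selection following FIG. 12 (S1201-S1205).
DWELL_FRAMES = 60  # frames of continuous gaze required (~1 s at 60 fps)

class GazeSelector:
    def __init__(self):
        self.frames = 0

    def update(self, overlaps, widget_speed):
        if overlaps and widget_speed == 0.0:   # S1201 (+ velocity-0 condition)
            self.frames += 1                   # S1202: selected display state
            if self.frames >= DWELL_FRAMES:    # S1203: gazed long enough?
                self.frames = 0
                return "input"                 # S1205: execute the input
            return "selected"
        self.frames = 0                        # S1204: back to non-selected
        return "idle"

s = GazeSelector()
result = "idle"
for _ in range(60):
    result = s.update(overlaps=True, widget_speed=0.0)
print(result)  # 'input': one second of continuous gaze completed
```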

  Next, a system for displaying the widget 201 will be described with reference to FIGS. 13 to 16. As already noted, FIGS. 13 to 15 are described on the assumption that a head mounted display (HMD) including various sensors (for example, an acceleration sensor and an angular velocity sensor) and capable of measuring its own posture data is used; however, the present invention can also be applied to a device that displays the virtual space on an ordinary display and moves the line of sight in the virtual space by input from a keyboard, a mouse, a joystick, or the like.

FIG. 13 is a diagram illustrating an overview of a system 1300 for displaying the widget 201.
As shown in FIG. 13, the system 1300 includes a head mounted display (HMD) 1310, a control circuit unit 1320, a position tracking camera (position sensor) 1330, and an external controller 1340.

  The head mounted display (HMD) 1310 includes a display 1312 and a sensor 1314. The display 1312 is a non-transmissive display device configured to completely cover the user's visual field, so the user can observe only the screen displayed on the display 1312. Since the user wearing the non-transmissive head mounted display (HMD) 1310 loses the entire field of view of the outside world, the user is completely immersed in the virtual space displayed by the application executed in the control circuit unit 1320.

  The sensor 1314 included in the head mounted display (HMD) 1310 is fixed near the display 1312. The sensor 1314 includes a geomagnetic sensor, an acceleration sensor, and/or a tilt (angular velocity, gyro) sensor, and through one or more of these can detect various movements of the head mounted display (HMD) 1310 (and display 1312) worn on the user's head. In particular, in the case of an angular velocity sensor, as shown in FIG. 14, the angular velocities about the three axes of the head mounted display (HMD) 1310 are detected over time in accordance with its movement, and the time change of the angle (tilt) about each axis can be determined.

  The angle information data that can be detected by the tilt sensor will be described with reference to FIG. 14. As shown in the figure, XYZ coordinates are defined around the head of the user wearing the head mounted display (HMD). The vertical direction in which the user stands upright is the Y axis; the direction perpendicular to the Y axis connecting the center of the display 1312 and the user is the Z axis; and the axis perpendicular to both the Y and Z axes is the X axis. The tilt sensor detects the angle about each axis (that is, the yaw angle indicating rotation about the Y axis, the pitch angle indicating rotation about the X axis, and the roll angle indicating rotation about the Z axis), and the motion detection unit 1610 determines angle (tilt) information data as visual field information based on their change over time.
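  For illustration, yaw and pitch can be turned into a line-of-sight direction in the XYZ frame of FIG. 14 as follows. The sign conventions, and treating −Z as "forward" at rest, are assumptions of this sketch; roll, being a rotation about the line of sight itself, leaves the gaze direction unchanged.

```python
# Unit line-of-sight vector from yaw/pitch angles in radians
# (assumed convention: yaw about Y, pitch about X, -Z forward at rest).
import math

def gaze_direction(yaw, pitch):
    return (
        -math.sin(yaw) * math.cos(pitch),  # X: turn left/right
        math.sin(pitch),                   # Y: look up/down
        -math.cos(yaw) * math.cos(pitch),  # Z: forward at yaw = pitch = 0
    )

print(gaze_direction(0.0, 0.0))            # (0, 0, -1): straight ahead
print(gaze_direction(math.pi / 2, 0.0))    # approximately (-1, 0, 0)
```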

  Returning to FIG. 13, the control circuit unit 1320 included in the system 1300 functions as a control circuit unit for immersing the user wearing the head mounted display (HMD) in the three-dimensional virtual space and for performing operations based on that virtual space. As illustrated in FIG. 13, the control circuit unit 1320 may be configured as hardware separate from the head mounted display (HMD) 1310. The hardware can be a computer such as a personal computer, or a server computer reached via a network; that is, any computer including a CPU, a main memory, an auxiliary memory, a transmission/reception unit, a display unit, and an input unit connected to each other via a bus can be used.

  Alternatively, the control circuit unit 1320 may be mounted inside the head mounted display (HMD) 1310 as an object operation device. The control circuit unit 1320 can implement all or only some of the functions of the object operation device; when only some are implemented, the remaining functions may be implemented on the head mounted display (HMD) 1310 side or on the side of a server computer (not shown) reached through the network.

  The position tracking camera (position sensor) 1330 included in the system 1300 is communicably connected to the control circuit unit 1320 and has a function of tracking the position of the head mounted display (HMD) 1310. The position tracking camera (position sensor) 1330 is realized using an infrared sensor or a plurality of optical cameras. By including the position tracking camera (position sensor) 1330 and detecting the position of the head mounted display (HMD) on the user's head, the system 1300 can accurately associate and specify the positions of the virtual camera and the immersed user in the three-dimensional virtual space.

  More specifically, the position tracking camera (position sensor) 1330 detects, over time and in correspondence with the user's movement, the real-space positions of a plurality of infrared detection points virtually provided on the head mounted display (HMD) 1310, as shown in FIG. 15. Based on the temporal change of the real-space positions detected by the position tracking camera (position sensor) 1330, the time change of the angle about each axis according to the movement of the head mounted display (HMD) 1310 can be determined.

  Returning to FIG. 13 again, the system 1300 includes an external controller 1340. The external controller 1340 is a general user terminal, and can be a smartphone as illustrated, but is not limited thereto. For example, any device can be used as long as it is a portable device terminal having a touch display such as a PDA, a tablet computer, a game console, and a notebook PC. That is, the external controller 1340 can be any portable device terminal including a CPU, a main memory, an auxiliary memory, a transmission / reception unit, a display unit, and an input unit that are bus-connected to each other. The user can perform various touch operations including touch, swipe, and hold on the touch display of the external controller 1340.

  The block diagram of FIG. 16 shows the configuration of the main functions of the components centered on the control circuit unit 1320 for implementing object operations in the three-dimensional virtual space according to the embodiment of the present invention. The control circuit unit 1320 receives input from the sensor 1314, the position tracking camera (position sensor) 1330, and the external controller 1340, processes it, and outputs to the display 1312. The control circuit unit 1320 mainly includes a motion detection unit 1610, a visual field moving unit 1620 (including a gazing point moving unit 1621), a visual field image generation unit 1630, and a widget control unit 1640, and processes various types of information.

  The motion detection unit 1610 measures movement data of the head mounted display (HMD) 1310 worn on the user's head, based on the input of motion information from the sensor 1314 and the position tracking camera (position sensor) 1330. In the present invention, in particular, the angle information detected over time by the tilt sensor 1314 and the position information detected over time by the position tracking camera (position sensor) 1330 are determined.

  The visual field moving unit 1620 obtains visual field information based on the three-dimensional virtual space information stored in the spatial information storage unit 1650, the angle information detected by the tilt sensor 1314, and the position information detected by the position tracking camera (position sensor) 1330. The gazing point moving unit 1621 included in the visual field moving unit 1620 determines gazing point information in the three-dimensional virtual space based on the visual field information. When the gazing point can be moved independently of the movement of the visual field, by detecting eye movement or by using some auxiliary input, the gazing point moving unit 1621 also processes that input.

  The visual field moving unit 1620 is desirably configured to control the movement of the visual field in the virtual space using the movement of the operator's head measured by the sensor 1314 and the position tracking camera (position sensor) 1330 of the head mounted display (HMD).

The visual field image generation unit 1630 generates a visual field image based on the visual field information and the position of the widget 201 sent from the widget control unit 1640.
The widget control unit 1640 performs most of the control shown in FIGS. 11 and 12. Specifically, the widget control unit 1640 includes a widget selection unit 1641, an input unit 1642, a widget moving unit 1643, a widget moving velocity reinitialization unit 1644, a widget out-of-field determination unit 1645, and a widget velocity setting unit 1646. The widget selection unit 1641 determines whether the widget 201 and the cursor 202 overlap and, when they do, performs the processing that changes the widget 201 to the selected display state. The input unit 1642 determines whether the widget 201 has been selected continuously for a certain time or longer and, if so, executes the operation corresponding to the selected widget 201. Note that the widget selection unit 1641 and the input unit 1642 may receive input from the external controller 1340 and determine, based on that input, whether the widget 201 has been selected. The widget moving unit 1643 moves the widget 201 in the virtual space at the moving velocity vector V; the widget moving velocity reinitialization unit 1644 detects that the widget 201 has returned to its initial position in the field of view and reinitializes the moving velocity vector V to 0; the widget out-of-field determination unit 1645 determines whether the position of the widget 201 in the left-right direction (X direction) in the field of view is outside the field of view; and the widget velocity setting unit 1646 sets a value in the moving velocity vector V to return the widget 201 to its initial position in the field of view.

The initialization unit 1660 is responsible for initialization processing.
In FIG. 16, each element described as a functional block performing various processing can be configured, in hardware terms, by a CPU, a memory, and other integrated circuits, and is realized, in software terms, by various programs loaded into the memory. Accordingly, those skilled in the art will understand that these functional blocks can be realized by hardware, software, or a combination thereof.

  Although embodiments of the present invention have been described above, the present invention is not limited to these embodiments. Those skilled in the art will appreciate that various modifications can be made to the embodiments without departing from the spirit and scope of the invention as set forth in the appended claims.

201 widget
202 cursor
203 window
204 field of view
1001 virtual sphere
1300 system
1310 head mounted display (HMD)
1312 display
1314 sensor
1320 control circuit unit
1330 position tracking camera (position sensor)
1340 external controller
1610 motion detection unit
1620 visual field moving unit
1621 gazing point moving unit
1630 visual field image generation unit
1640 widget control unit
1641 widget selection unit
1642 input unit
1643 widget moving unit
1644 widget moving velocity reinitialization unit
1645 widget out-of-field determination unit
1646 widget velocity setting unit
1650 spatial information storage unit
1660 initialization unit

Claims (14)

  1. An input method for performing input using a widget placed in a virtual space, comprising:
    a gazing point moving step of moving a gazing point in the virtual space;
    a widget selection step of determining whether the widget and the gazing point overlap and, if they overlap, selecting the widget; and
    an input step of performing an input corresponding to the selected widget.
  2. The input method according to claim 1, wherein the gazing point moving step is part of a visual field moving step of moving the gazing point by moving the visual field with respect to the virtual space.
  3. The input method according to claim 2, wherein the visual field moving step moves the visual field with respect to the virtual space using the measured movement of the operator's head.
  4. The input method according to claim 2 or 3, further comprising:
    an initialization step of placing the widget at a position (Xvs0, Yvs0, Zvs0) in the virtual space corresponding to an initial position (Xfv0, Yfv0) in the field of view, and initializing a moving velocity vector V of the widget in the virtual space to 0;
    a widget moving step of moving the widget in the virtual space at the moving velocity vector V;
    a widget moving velocity reinitialization step of reinitializing the moving velocity vector V to 0 when the position of the widget in the field of view matches its initial position in the field of view;
    a widget out-of-field determination step of determining that the widget is positioned outside the field of view; and
    a widget moving velocity setting step of giving the moving velocity vector V, when the widget out-of-field determination step determines that the widget is positioned outside the field of view, a value that returns the widget to its initial position within the moved field of view.
  5. The widget movement speed re-initialization step determines that the position of the widget in the field of view matches the position of the initial state in the field of view. The position in the field of view (X direction) coincides with the initial position X fv0 in the field of view,
    In the widget out-of-view determination step, when the widget is positioned outside the field of view, it is when the widget is positioned outside the left-right direction (X direction) in the field of view,
    In the widget moving speed setting step, giving the moving speed vector V a value that returns to the initial position in the field of view in the field of view in which the widget has been moved means that the right and left in the field of view Providing the moving velocity vector V with a value that moves in the virtual space so as to return to the initial position X fv0 in the field of view only in the direction (X direction);
    The input method according to claim 4.
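
  [For illustration only; not part of the claims.] A sketch of the floating-widget motion of claims 4 and 5, restricted to the left-right (X) direction as claim 5 is. The helper view_to_fv (converting virtual-space X to field-of-view X) and the proportional factor gain are assumptions, as is a monotonic relation between the two coordinates; the claims only require that V start at 0, that the widget move at V, that V be given a returning value once the widget leaves the view, and that V be reinitialized to 0 on arrival at Xfv0.

    class FloatingWidget:
        def __init__(self, x_fv0: float, x_vs0: float):
            self.x_fv0 = x_fv0  # initial left-right position in the field of view (Xfv0)
            self.x_vs = x_vs0   # current left-right position in virtual space (starts at Xvs0)
            self.v = 0.0        # movement velocity V, initialized to 0 (initialization step)

        def step(self, view_to_fv, fov_half_width: float, gain: float, dt: float):
            # Widget moving step: advance the widget in virtual space at velocity V.
            self.x_vs += self.v * dt
            x_fv = view_to_fv(self.x_vs)  # widget position in field-of-view coordinates

            if abs(x_fv) > fov_half_width:
                # Widget out-of-view determination step (X direction only, claim 5):
                # the view has moved far enough that the widget left the field of
                # view, so the widget movement speed setting step gives V a value
                # that returns the widget to Xfv0 within the moved field of view.
                self.v = gain * (self.x_fv0 - x_fv)
            elif abs(x_fv - self.x_fv0) < 1e-3:
                # Widget movement speed reinitialization step: the widget is back
                # at its initial in-view position, so V is reinitialized to 0.
                self.v = 0.0

  The net effect is the "floating" behavior: the widget stays put in virtual space for small view movements, keeping the central field of view clear, and drifts back into view only after the view has moved far enough to lose it.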
  6.   The input method according to any one of claims 1 to 5, wherein the input step performs the input corresponding to the selected widget when it determines that the widget has been gazed at continuously for a predetermined time or longer.
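
  [For illustration only; not part of the claims.] A sketch of the dwell-time condition of claim 6 (mirrored by claim 14): the input fires only after the same widget has been gazed at continuously for a predetermined time. The 1.5-second threshold is an arbitrary placeholder; the claims leave the value open.

    import time

    DWELL_SECONDS = 1.5  # "predetermined time"; value is a placeholder

    class DwellSelector:
        def __init__(self):
            self.gazed_widget = None
            self.gaze_start = 0.0

        def update(self, widget_under_gaze):
            # Called once per frame with the widget the gazing point currently
            # overlaps (or None). Returns the widget whose input should fire.
            now = time.monotonic()
            if widget_under_gaze is not self.gazed_widget:
                # Gaze moved to a different widget (or off all widgets):
                # restart the dwell timer.
                self.gazed_widget = widget_under_gaze
                self.gaze_start = now
                return None
            if (widget_under_gaze is not None
                    and now - self.gaze_start >= DWELL_SECONDS):
                self.gazed_widget = None  # restart the dwell instead of re-firing
                return widget_under_gaze  # caller performs the corresponding input
            return None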
  7.   A program for performing the method according to any one of claims 1 to 6.
  8.   A recording medium on which a program for performing the method according to any one of claims 1 to 6 is recorded.
  9. An input device that performs input using a widget placed in a virtual space, comprising:
    a gazing point moving unit that moves a gazing point in the virtual space;
    a widget selection unit that determines whether or not the widget and the gazing point overlap and, if they overlap, selects the widget; and
    an input unit that performs input corresponding to the selected widget.
  10.   The input device according to claim 9, wherein the gazing point moving unit is a part of a visual field moving unit that moves the gazing point by moving a visual field with respect to the virtual space.
  11.   The input device according to claim 10, wherein the visual field moving unit moves the visual field with respect to the virtual space using a measured movement of the operator's head.
  12. The input device according to claim 10 or 11, further comprising:
    an initialization unit that places the widget at a position (Xvs0, Yvs0, Zvs0) in the virtual space corresponding to an initial position (Xfv0, Yfv0) in the field of view and initializes a movement velocity vector V of the widget in the virtual space to 0;
    a widget moving unit that moves the widget in the virtual space at the movement velocity vector V;
    a widget movement speed reinitialization unit that reinitializes the movement velocity vector V to 0 when the position of the widget in the field of view matches its initial position in the field of view;
    a widget out-of-view determination unit that determines whether the widget is positioned outside the field of view; and
    a widget movement speed setting unit that, when the widget out-of-view determination unit determines that the widget is positioned outside the field of view, gives the movement velocity vector V a value that returns the widget to the initial position within the moved field of view.
  13. The input device according to claim 12, wherein:
    in the widget movement speed reinitialization unit, the position of the widget in the field of view matching its initial position means that the position of the widget in the left-right direction (X direction) of the field of view coincides with the initial position Xfv0 in the field of view;
    in the widget out-of-view determination unit, the widget being positioned outside the field of view means that the widget is positioned outside the field of view in the left-right direction (X direction); and
    in the widget movement speed setting unit, giving the movement velocity vector V a value that returns the widget to the initial position within the moved field of view means giving the movement velocity vector V a value that moves the widget in the virtual space so as to return it to the initial position Xfv0 only in the left-right direction (X direction).
  14.   The input device according to any one of claims 9 to 13, wherein the input unit performs the input corresponding to the selected widget when it determines that the widget has been gazed at continuously for a predetermined time or longer.

JP2016008456A 2016-01-20 2016-01-20 Floating graphical user interface Pending JP2017004491A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016008456A JP2017004491A (en) 2016-01-20 2016-01-20 Floating graphical user interface

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2015119252 Division 2015-06-12

Publications (2)

Publication Number Publication Date
JP2017004491A true JP2017004491A (en) 2017-01-05
JP2017004491A5 JP2017004491A5 (en) 2018-07-05

Family

ID=57754672

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2016008456A Pending JP2017004491A (en) 2016-01-20 2016-01-20 Floating graphical user interface

Country Status (1)

Country Link
JP (1) JP2017004491A (en)

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07244556A (en) * 1994-03-04 1995-09-19 Hitachi Ltd Information terminal
JPH07271546A (en) * 1994-03-31 1995-10-20 Olympus Optical Co Ltd Image display control method
JPH1069356A (en) * 1996-06-20 1998-03-10 Sharp Corp Mouse cursor controller and recording medium where mouse cursor control program
JP2001070632A (en) * 1999-09-08 2001-03-21 Namco Ltd Sporting game device
JP2001337645A (en) * 2000-05-26 2001-12-07 Fujitsu Ltd Display system and storage medium
JP2003296757A (en) * 2002-03-29 2003-10-17 Canon Inc Information processing method and device
US20030184602A1 (en) * 2002-03-29 2003-10-02 Canon Kabushiki Kaisha Information processing method and apparatus
JP2008125617A (en) * 2006-11-17 2008-06-05 Sega Corp Game device and game program
US20130117707A1 (en) * 2011-11-08 2013-05-09 Google Inc. Velocity-Based Triggering
JP2014023719A (en) * 2012-07-26 2014-02-06 Nintendo Co Ltd Information processing program, information processing apparatus, information processing method and information processing system
US20140028544A1 (en) * 2012-07-26 2014-01-30 Nintendo Co., Ltd. Storage medium and information processing apparatus, method and system
US20140225920A1 (en) * 2013-02-13 2014-08-14 Seiko Epson Corporation Image display device and display control method for image display device
JP2014153645A (en) * 2013-02-13 2014-08-25 Seiko Epson Corp Image display device and display control method of image display device
US20150123997A1 (en) * 2013-11-07 2015-05-07 Konica Minolta, Inc. Information Display System Including Transmission Type HMD, Non-Transitory Computer-Readable Storage Medium and Display Control Method
JP2015090635A (en) * 2013-11-07 2015-05-11 コニカミノルタ株式会社 Information display system having transmission type hmd, and display control program
JP2016110319A (en) * 2014-12-04 2016-06-20 ソニー株式会社 Display control device, display control method, and program
US20170329480A1 (en) * 2014-12-04 2017-11-16 Sony Corporation Display control apparatus, display control method, and program
US20160232879A1 (en) * 2015-02-05 2016-08-11 Samsung Electronics Co., Ltd. Method and electronic device for displaying screen

Similar Documents

Publication Publication Date Title
US10261594B2 (en) Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments
EP3164785B1 (en) Wearable device user interface control
US20180292648A1 (en) External user interface for head worn computing
US20200042112A1 (en) External user interface for head worn computing
US9829998B2 (en) Information processing apparatus, input apparatus, information processing system, information processing method, and program
EP2976690B1 (en) Head-mounted device for user interactions in an amplified reality environment
EP2862042B1 (en) User interface interaction for transparent head-mounted displays
US10096167B2 (en) Method for executing functions in a VR environment
US20170242479A1 (en) Menu navigation in a head-mounted display
US8482527B1 (en) Apparatus and method of providing user interface on head mounted display and head mounted display thereof
US8854325B2 (en) Two-factor rotation input on a touchscreen device
EP2972727B1 (en) Non-occluded display for hover interactions
JP6682623B2 (en) Teleportation in augmented and / or virtual reality environments
JP6271960B2 (en) Information processing system
US20160026239A1 (en) External user interface for head worn computing
US20160062118A1 (en) External user interface for head worn computing
CN102681664B (en) Electronic installation, information processing method, program and electronic apparatus system
EP2972669B1 (en) Depth-based user interface gesture control
US20160027218A1 (en) Multi-user gaze projection using head mounted display devices
US9904055B2 (en) Smart placement of virtual objects to stay in the field of view of a head mounted display
KR20160027097A (en) Web-like hierarchical menu display configuration for a near-eye display
US20180300953A1 (en) Method And System For Receiving Gesture Input Via Virtual Control Objects
WO2017047367A1 (en) Computer program for line-of-sight guidance
KR20180081162A (en) Perception based predictive tracking for head mounted displays
JP5869177B1 (en) Virtual reality space video display method and program

Legal Events

Date Code Title Description
2018-05-17 RD03 Notification of appointment of power of attorney (JAPANESE INTERMEDIATE CODE: A7423)
2018-05-25 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2018-05-25 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2019-01-31 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2019-02-25 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2019-04-24 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2019-09-27 A02 Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)
2019-12-03 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2019-12-10 A911 Transfer of reconsideration by examiner before appeal (zenchi) (JAPANESE INTERMEDIATE CODE: A911)
2019-12-27 A912 Removal of reconsideration by examiner before appeal (zenchi) (JAPANESE INTERMEDIATE CODE: A912)