JP2013258614A - Image generation device and image generation method - Google Patents

Image generation device and image generation method

Info

Publication number
JP2013258614A
Authority
JP
Japan
Prior art keywords
head
unit
display
image
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2012134264A
Other languages
Japanese (ja)
Inventor
Yoichi Nishimaki
洋一 西牧
Hiroshi Osawa
洋 大澤
Ken Yamagishi
建 山岸
Original Assignee
Sony Computer Entertainment Inc
株式会社ソニー・コンピュータエンタテインメント
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc, 株式会社ソニー・コンピュータエンタテインメント filed Critical Sony Computer Entertainment Inc
Priority to JP2012134264A priority Critical patent/JP2013258614A/en
Publication of JP2013258614A publication Critical patent/JP2013258614A/en
Pending legal-status Critical Current

Abstract

PROBLEM TO BE SOLVED: A user wearing a head-mounted display cannot see his or her hands, which makes it difficult to operate a controller or the like. SOLUTION: An angle information acquisition unit 730 acquires information on the rotation angle of the head of a user wearing a head-mounted display. A visual line direction changing unit 740 changes the visual line direction in which a panoramic image is viewed based on the information on the rotation angle of the head. A panoramic image processing unit 750 generates a panoramic image viewed from the changed visual line direction. An image providing unit 770 supplies the generated panoramic image data to the head-mounted display 100.

Description

  The present invention relates to an apparatus and a method for generating an image to be displayed on a head mounted display.

  A head-mounted display connected to a game machine is worn on the head, and the user plays a game by operating a controller or the like while viewing the screen displayed on the head-mounted display. With an ordinary stationary display connected to a game machine, the user's field of view extends beyond the display screen, which can prevent the user from concentrating on the screen and can reduce the sense of immersion in the game. When the head-mounted display is worn, on the other hand, the user sees no video other than that displayed on the head-mounted display, so the sense of immersion in the video world increases and the entertainment value of the game is further enhanced.

  In addition, if a panoramic image is displayed on the head-mounted display and the displayed view follows the rotation of the head of the user wearing it, an entire 360-degree panoramic image or virtual space can be presented, further improving the operability of applications such as games.

  However, since the user must perform various operations while viewing the image displayed on the head-mounted display with the display attached, the user cannot see his or her hands, and the controller is therefore difficult to operate.

  The present invention has been made in view of these problems, and an object of the present invention is to provide an image generation apparatus and an image generation method capable of detecting a user's movement and displaying an image corresponding to that movement on a head-mounted display.

  In order to solve the above-described problem, an image generation apparatus according to one aspect of the present invention includes a motion acquisition unit that acquires information about the motion of a user wearing a head-mounted display, an image generation unit that generates an image according to the user's motion, and an image providing unit that provides the generated image to the head-mounted display.

  Another aspect of the present invention is an image generation method. This method includes a motion acquisition step of acquiring information related to the motion of a user wearing a head-mounted display, an image generation step of generating an image according to the user's motion, and an image providing step of providing the generated image to the head-mounted display.

  It should be noted that any combination of the above-described constituent elements, and expressions of the present invention converted between a method, an apparatus, a system, a computer program, a data structure, a recording medium, and the like, are also effective as aspects of the present invention.

  According to the present invention, an image corresponding to the motion of the user wearing the head-mounted display can be displayed on the head-mounted display.

FIG. 1 is an external view of a head-mounted display.
FIG. 2 is a functional configuration diagram of the head-mounted display.
FIG. 3 is a configuration diagram of the panoramic image generation system according to the present embodiment.
FIG. 4 is a functional configuration diagram of the panoramic image generation apparatus according to the present embodiment.
FIGS. 5A and 5B are diagrams illustrating the relationship between the absolute angle of the user's head and the camera angle of the panoramic image.
FIGS. 6A to 6D are diagrams explaining a method of correcting the drift of the posture sensor by detecting markers provided on the head-mounted display.
FIG. 7 is a flowchart explaining the panoramic image generation procedure performed by the panoramic image generation apparatus of FIG. 4.
FIG. 8 is a diagram explaining a panoramic image displayed on the head-mounted display.
FIG. 9 is a diagram explaining a marker attached to the controller held by a user wearing the head-mounted display.
FIG. 10 is a configuration diagram of the virtual space rendering processing system according to the present embodiment.
FIG. 11 is a functional configuration diagram of the virtual space rendering processing apparatus according to the present embodiment.
FIG. 12 is a diagram explaining how a user wearing the head-mounted display draws objects in virtual space by moving a marker.
FIG. 13A is a diagram explaining the relationship between objects in the three-dimensional virtual space and the viewer camera, and FIG. 13B is a diagram explaining an image of the three-dimensional space viewed from the viewer camera.
FIG. 14 is a functional configuration diagram of the user interface processing apparatus according to the present embodiment.
FIGS. 15A to 15D are diagrams explaining examples of menu screens displayed on the head-mounted display.

  FIG. 1 is an external view of the head mounted display 100. The head mounted display 100 includes a main body part 110, a forehead contact part 120, a temporal contact part 130, and a camera 140.

  The head-mounted display 100 is a display device worn on the user's head for viewing still images and moving pictures shown on its display and listening to sound and music output from its headphones.

  The position information of the user can be measured by a position sensor such as GPS (Global Positioning System) incorporated in or externally attached to the head mounted display 100. Further, posture information such as the orientation and inclination of the head of the user wearing the head mounted display 100 can be measured by a posture sensor built in or externally attached to the head mounted display 100.

  The main body 110 includes a display, a position information acquisition sensor, a posture sensor, a communication device, and the like. The forehead contact unit 120 and the temporal contact unit 130 include a biological information acquisition sensor that can measure biological information such as a user's body temperature, pulse, blood component, sweating, brain waves, and cerebral blood flow.

  The head mounted display 100 may further be provided with a camera that captures the eyes of the user. The camera mounted on the head mounted display 100 can detect the user's line of sight, pupil movement, blinking, and the like.

  The camera 140 is mounted on the front portion of the head mounted display 100, and can capture the outside world while the user is wearing the head mounted display 100.

  Here, a method for generating an image to be displayed on the head-mounted display 100 will be described. However, the image generation method according to the present embodiment is not limited to the head-mounted display 100 in a narrow sense; it can also be applied when the user wears glasses, a glasses-type display, a glasses-type camera, headphones, a headset (headphones with a microphone), earphones, earrings, an ear-mounted camera, a hat, a hat with a camera, a hair band, or the like.

  FIG. 2 is a functional configuration diagram of the head mounted display 100.

  The control unit 10 is a main processor that processes and outputs signals such as image signals and sensor signals, as well as commands and data. The input interface 20 receives operation signals and setting signals from the touch panel and its controller and supplies them to the control unit 10. The output interface 30 receives the image signal from the control unit 10 and displays it on the display. The backlight 32 provides backlighting for the liquid crystal display.

  The communication control unit 40 transmits data input from the control unit 10 to the outside through wired or wireless communication via the network adapter 42 or the antenna 44. The communication control unit 40 also receives data from the outside via wired or wireless communication via the network adapter 42 or the antenna 44 and outputs the data to the control unit 10.

  The storage unit 50 temporarily stores data, parameters, operation signals, and the like processed by the control unit 10.

  The GPS unit 60 receives position information from a GPS satellite and supplies it to the control unit 10 in accordance with an operation signal from the control unit 10. The radio unit 62 receives position information from the radio base station and supplies it to the control unit 10 in accordance with an operation signal from the control unit 10.

  The attitude sensor 64 detects attitude information such as the orientation and inclination of the main body 110 of the head mounted display 100. The posture sensor 64 is realized by appropriately combining a gyro sensor, an acceleration sensor, an angular acceleration sensor, and the like.

  The external input / output terminal interface 70 is an interface for connecting peripheral devices such as a USB (Universal Serial Bus) controller. The external memory 72 is an external memory such as a flash memory.

  The clock unit 80 sets time information according to a setting signal from the control unit 10 and supplies time data to the control unit 10.

  The camera unit 90 includes components necessary for photographing, such as a lens, an image sensor, and a distance measuring sensor. The distance measuring sensor emits infrared light or the like, captures the reflected light, and measures the distance to the subject based on the principle of triangulation.

  The control unit 10 can supply the image and text data to the output interface 30 to be displayed on the display, or can supply the image and text data to the communication control unit 40 to be transmitted to the outside.

(First embodiment)
FIG. 3 is a configuration diagram of the panoramic image generation system according to the present embodiment. The head mounted display 100 is connected to the game machine 200 via an interface for connecting peripheral devices such as wireless communication or USB. The game machine 200 may be further connected to a server via a network. In that case, the server may provide the game machine 200 with an online application such as a game that allows a plurality of users to participate via a network. The head mounted display 100 may be connected to a computer or a portable terminal instead of the game machine 200.

  FIG. 4 is a functional configuration diagram of the panoramic image generation apparatus 700 according to the present embodiment. This figure depicts a block diagram focusing on functions, and these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.

  The panoramic image generation apparatus 700 is implemented in the game machine 200 to which the head-mounted display 100 is connected, but at least a part of its functions may instead be implemented in the control unit 10 on the head-mounted display 100 side. Alternatively, at least a part of the functions of the panoramic image generation apparatus 700 may be implemented in a server connected to the game machine 200 via a network.

  The angle information acquisition unit 730 and the sensitivity adjustment unit 720 are an example of a movement acquisition unit that acquires information regarding the movement of the user wearing the head mounted display. The line-of-sight direction changing unit 740 and the panoramic image processing unit 750 are an example of an image generation unit that generates an image according to a user's movement.

  The zoom instruction acquisition unit 710 acquires the zoom magnification indicated by the user via the input interface 20 of the head mounted display 100. The zoom magnification acquired by the zoom instruction acquisition unit 710 is supplied to the sensitivity adjustment unit 720 and the panorama image processing unit 750.

  The angle information acquisition unit 730 acquires the rotation angle of the head of the user wearing the head mounted display 100 based on the posture information detected by the posture sensor 64 of the head mounted display 100.

  The angle information acquisition unit 730 acquires the rotation angle of the user's head as an absolute angle at the sensitivity specified by the sensitivity adjustment unit 720. For example, when the user turns his or her head, the posture sensor 64 detects the change in the head's angle, but the sensitivity adjustment unit 720 instructs the angle information acquisition unit 730 to ignore the detected change until it exceeds a predetermined value.

  Further, the sensitivity adjustment unit 720 adjusts the sensitivity of head angle detection based on the zoom magnification acquired from the zoom instruction acquisition unit 710: as the zoom magnification increases, the sensitivity of head angle detection decreases. Since zooming in narrows the angle of view, reducing the sensitivity of head angle detection suppresses vibration of the displayed image caused by small shakes of the head.
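
  As an illustration, the dead-zone filtering and zoom-dependent sensitivity described above might be sketched as follows. This is a minimal Python sketch; the threshold value and the linear scaling law are assumptions for illustration, not values taken from the patent.

```python
def dead_zone_deg(zoom_magnification: float) -> float:
    """Angle changes smaller than this are ignored.

    Widening the dead zone as the zoom magnification grows is one way to
    "reduce the sensitivity of head angle detection"; the base value and
    the linear scaling are assumed for illustration.
    """
    BASE_DEAD_ZONE = 0.5  # degrees (assumed)
    return BASE_DEAD_ZONE * max(zoom_magnification, 1.0)


def acquire_absolute_angle(previous_angle: float, sensor_angle: float,
                           zoom_magnification: float) -> float:
    """Return the new absolute head angle, suppressing sub-threshold changes."""
    if abs(sensor_angle - previous_angle) <= dead_zone_deg(zoom_magnification):
        return previous_angle  # change too small: treat as head-shake noise
    return sensor_angle
```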

  In addition, the angle information acquisition unit 730 may acquire the absolute angle of the user's head by performing self-position estimation from a plurality of images captured from different viewpoint positions by the camera 140 mounted on the head-mounted display 100.

  SLAM (Simultaneous Localization and Mapping) is one self-position estimation method; it simultaneously obtains the three-dimensional position information of the subject and the viewpoint position from a plurality of parallax images taken from different viewpoint positions. SLAM performs self-position estimation and environmental map creation at the same time based on information acquired from sensors, and is applied to autonomous mobile robots and the like. Here, as an example, the three-dimensional positions of feature points of the subject and the three-dimensional position of the camera are estimated using the well-known SLAM technique, but other self-position estimation techniques may be used. SLAM is introduced, for example, in the following paper: Andrew J. Davison, "Real-Time Simultaneous Localization and Mapping with a Single Camera", ICCV 2003.

  Self-position estimation techniques such as SLAM may not by themselves provide sufficient detection accuracy, but the robustness of head movement detection can be improved by combining them with posture detection by a motion sensor. Combining the self-position estimation technique also has the effect of correcting the drift of the motion sensor.

  Further, the marker detection unit 780 may detect the position of a marker attached to the head-mounted display 100, and the angle information acquisition unit 730 may acquire the absolute angle of the user's head based on the marker position detected by the marker detection unit 780. When the marker enters a blind spot of the camera, its position cannot be detected; this case can be handled by combining posture detection by a motion sensor.

  As the motion sensor, the back-and-forth, left-right, and up-and-down movement of the user's head may be detected using at least one of a 3-axis geomagnetic sensor, a 3-axis acceleration sensor, and a 3-axis gyro (angular velocity) sensor, or a combination of them. The accuracy of head movement detection may be further improved by combining it with position acquisition by the marker.

  The line-of-sight direction changing unit 740 changes the camera angle of the panoramic image displayed on the head-mounted display 100, that is, the line-of-sight direction, according to the absolute angle of the user's head detected by the angle information acquisition unit 730, and gives the changed line-of-sight direction to the panoramic image processing unit 750.

  FIGS. 5A and 5B are diagrams illustrating the relationship between the absolute angle of the user's head and the camera angle of the panoramic image. When the user wearing the head-mounted display 100 turns his or her head and the absolute angle of the head acquired by the angle information acquisition unit 730 changes from 30 degrees to 75 degrees, as shown in the figures, the line-of-sight direction changing unit 740 changes the angle of the panoramic image viewer camera 400 from 30 degrees to 75 degrees.

  A change in the absolute angle of the user's head detected by the angle information acquisition unit 730 may be filtered by a low-pass filter to remove noise due to neck vibration or the like.
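
  A first-order low-pass filter is one simple possibility; the sketch below uses an exponential moving average with an assumed smoothing factor (angle wrap-around at 360 degrees is ignored for brevity).

```python
class LowPassFilter:
    """Exponential moving average used to smooth the absolute head angle."""

    def __init__(self, alpha: float = 0.2):  # alpha is an assumed value
        self.alpha = alpha
        self.value = None

    def update(self, sample: float) -> float:
        """Blend the new sample into the running value and return it."""
        if self.value is None:
            self.value = sample
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value
```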

  Returning to FIG. 4, the panoramic image processing unit 750 reads panoramic image data from the panoramic image storage unit 760, generates a panoramic image viewed from the line-of-sight direction specified by the line-of-sight direction changing unit 740 at the zoom magnification specified by the zoom instruction acquisition unit 710, and gives the image to the image providing unit 770.

  The image providing unit 770 supplies the panorama image data generated by the panorama image processing unit 750 to the head mounted display 100.

  The marker detection unit 780 detects a marker attached to the head mounted display 100 with a camera connected to the game machine 200 and provides the position information of the detected marker to the angle information acquisition unit 730 and the drift correction unit 790.

  The drift correction unit 790 determines whether or not to correct the drift of the posture sensor 64 of the head-mounted display 100 based on the change in marker position given from the marker detection unit 780, and gives the result to the head-mounted display 100. On the head-mounted display 100 side, the detection value of the posture sensor 64 is corrected based on the drift correction result.

  The drift correction unit 790 determines that the origin of the posture sensor 64 is drifting when the posture sensor 64 detects a change in the rotation angle of the head even though the marker position has not changed, and performs drift correction.
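
  The decision rule might be sketched as follows; the thresholds are assumptions for illustration.

```python
MARKER_EPS_PX = 1.0   # assumed: marker displacement below this counts as "not moved"
ANGLE_EPS_DEG = 0.5   # assumed: sensor angle change above this counts as "rotation"


def needs_drift_correction(marker_displacement_px: float,
                           sensor_angle_change_deg: float) -> bool:
    """Flag drift when the posture sensor reports head rotation even though
    the marker observed by the external camera has not moved."""
    return (marker_displacement_px < MARKER_EPS_PX
            and abs(sensor_angle_change_deg) > ANGLE_EPS_DEG)
```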

  FIGS. 6A to 6D are diagrams for explaining a method of correcting the drift of the posture sensor 64 by detecting a marker provided on the head-mounted display 100.

  The posture sensor 64, particularly its gyro sensor, has a problem called "yaw drift" in which the origin of the detected yaw angle drifts. As shown in FIG. 6A, a function is provided that automatically corrects the drift, as indicated by the solid arrow, when the detected yaw angle is affected by drift as indicated by the dotted arrow. However, when only one spherical marker 310 is provided at the front of the head-mounted display 100 as in FIG. 6A, even if the head of the user wearing the head-mounted display 100 rotates as shown in FIG. 6B, the rotation is erroneously determined to be yaw drift, drift correction is performed as indicated by the solid arrow, and the change in the absolute angle of the head may not be detected.

  Therefore, as shown in FIG. 6C, a non-spherical marker 330a, such as a square one, is provided at the front of the head-mounted display 100. In this case, when the absolute angle of the head of the user wearing the head-mounted display 100 changes, the orientation of the marker 330b detected by the camera differs from the orientation of the marker 330a before the change. Thus, unlike the case of the spherical marker, the change in the absolute angle of the head can be recognized as the change in the orientation of the markers 330a and 330b, which prevents the change in the absolute angle of the head from being erroneously determined to be yaw drift.

  Further, as shown in FIG. 6D, when a spherical marker 330 is provided at the front of the head-mounted display 100 and another spherical marker 340a is provided at the rear, a change in the absolute angle of the head of the user wearing the head-mounted display 100 changes the direction of the straight line connecting the front marker 330 and the rear spherical marker 340a (340b after the change), which likewise prevents the change in the absolute angle of the head from being erroneously determined to be yaw drift.
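
  With two markers, the head's yaw can be read off the line connecting them, which pure gyro drift cannot change. A minimal sketch, assuming the camera provides the two marker positions projected onto a horizontal plane:

```python
import math


def head_yaw_from_markers(front_xy: tuple, rear_xy: tuple) -> float:
    """Estimate the head's yaw angle in degrees from the direction of the
    straight line connecting the rear marker to the front marker."""
    dx = front_xy[0] - rear_xy[0]
    dy = front_xy[1] - rear_xy[1]
    return math.degrees(math.atan2(dy, dx))
```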

  FIG. 7 is a flowchart for explaining a panorama image generation procedure performed by the panorama image generation apparatus 700.

  The orientation of the user's head is acquired by the posture sensor 64 of the head mounted display 100 (S10). The head mounted display 100 gives information on the absolute angle of the head to the game machine 200 (S12).

  The line-of-sight direction changing unit 740 of the game machine 200 rotates the viewer camera 400 based on the information on the absolute angle of the head (S14). The panoramic image processing unit 750 generates and outputs a panoramic image viewed from the direction of the viewer camera 400, that is, the line-of-sight direction, within the range of the angle of view (S16). The image providing unit 770 gives the video data generated by the panoramic image processing unit 750 to the control unit 10 (S18).

  A panoramic image is displayed on the head mounted display 100 (S20). If the end condition is satisfied (Y in S22), the process ends. If the end condition is not satisfied (N in S22), the process returns to step S10 and the subsequent processing is repeated.
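
  One possible shape for this S10-S22 loop is sketched below. The `hmd` and `game_machine` objects and their methods are hypothetical stand-ins for the head-mounted display 100 and the game machine 200; only the flow mirrors the flowchart.

```python
def run_panorama_loop(hmd, game_machine):
    """Render panoramic frames until the end condition is satisfied."""
    while not game_machine.end_condition():              # S22
        orientation = hmd.read_posture_sensor()          # S10: head orientation
        angle = hmd.absolute_head_angle(orientation)     # S12: angle to game machine
        game_machine.rotate_viewer_camera(angle)         # S14: rotate viewer camera
        frame = game_machine.render_panorama()           # S16: render within angle of view
        game_machine.provide_image(frame)                # S18: video data to control unit
        hmd.display(frame)                               # S20: show on the HMD
```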

  FIG. 8 is a diagram for explaining a panoramic image displayed on the head-mounted display 100. When the user faces the left front with respect to the panoramic image, the image 510a in the range of the angle of view 150a in the direction of the head-mounted display 100a is displayed; when the user turns his or her head to face the right front, the image 510b in the range of the angle of view 150b in the direction of the head-mounted display 100b is displayed.

  As described above, the line-of-sight direction in which the panoramic image displayed on the head-mounted display 100 is viewed changes according to the movement of the head, which enhances the sense of immersion in the panoramic image.

(Second embodiment)
FIG. 9 is a diagram for explaining the marker 300 attached to the controller 350 held by a user wearing the head-mounted display 100. The user operates the controller 350 of the game machine 200 with the head-mounted display 100 attached. Depending on the game application, the user may move a hand or the whole body while holding the controller 350. A marker 300 is attached to the controller 350, and the position of the marker 300 can be detected by a camera connected to the game machine 200. When the camera connected to the game machine 200 is provided with a distance sensor, the three-dimensional coordinates of the marker 300 can also be determined.

  FIG. 10 is a configuration diagram of the virtual space rendering processing system according to the present embodiment. The controller 350 to which the head mounted display 100 and the marker 300 are attached is connected to the game machine 200 through an interface for connecting peripheral devices such as wireless communication or USB. The game machine 200 may be further connected to a server via a network. In that case, the server may provide the game machine 200 with a virtual space application in which a plurality of users can participate via a network. The head mounted display 100 may be connected to a computer or a portable terminal instead of the game machine 200.

  FIG. 11 is a functional configuration diagram of the virtual space rendering processing apparatus 800 according to the present embodiment. This figure depicts a block diagram focusing on functions, and these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.

  The virtual space rendering processing device 800 is implemented in the game machine 200 to which the head-mounted display 100 is connected, but at least a part of its functions may instead be implemented in the control unit 10 on the head-mounted display 100 side. Alternatively, at least a part of the functions of the virtual space rendering processing device 800 may be implemented in a server connected to the game machine 200 via a network.

  The position / posture information acquisition unit 810 is an example of a movement acquisition unit that acquires information regarding the movement of the user wearing the head mounted display. The drawing instruction acquisition unit 820, the object generation unit 830, the viewpoint position / gaze direction setting unit 850, and the three-dimensional rendering unit 860 are examples of an image generation unit that generates an image according to a user's movement.

  The position / posture information acquisition unit 810 acquires the position of the user from the GPS unit 60 or the wireless unit 62 of the head mounted display 100 and acquires information of the user's posture from the posture sensor 64 of the head mounted display 100. The acquired user position and orientation information is provided to the object generation unit 830 and the viewpoint position / gaze direction setting unit 850.

  The drawing instruction acquisition unit 820 acquires a user's drawing instruction from the controller 350 and gives a drawing command to the object generation unit 830.

  The object generation unit 830 refers to the user's position and posture information acquired by the position/posture information acquisition unit 810, and generates, in the virtual space, an object corresponding to the trajectory drawn by the marker 300 as the user, at that position and in that posture, moves the marker 300, storing the object in the object information storage unit 840. The trajectory drawn by the marker 300 is acquired by tracking the position of the marker 300 with the camera of the game machine 200.

  The viewpoint position/line-of-sight direction setting unit 850 sets the user's viewpoint position and line-of-sight direction in the virtual space based on the user's position and posture information acquired by the position/posture information acquisition unit 810. Based on the set viewpoint position and line-of-sight direction, the three-dimensional rendering unit 860 reads out information on the virtual space and on the objects drawn by the user in that space from the object information storage unit 840, and renders the objects in the three-dimensional virtual space.
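
  The relationship between the two units might be sketched as follows; the parameter and method names are hypothetical stand-ins for the units of FIG. 11.

```python
def render_user_view(position, posture, object_store, renderer):
    """Set the viewpoint and line of sight from the user's position and
    posture, then render the stored marker-drawn objects from that view."""
    viewpoint = position                 # viewpoint follows the user's position
    gaze_direction = posture.forward()   # line of sight follows the head posture
    objects = object_store.load_all()    # objects drawn with the marker 300
    return renderer.draw(objects, viewpoint, gaze_direction)
```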

  When the user wearing the head-mounted display 100 changes position or orientation, the direction in which the three-dimensional virtual space displayed on the head-mounted display 100 is viewed changes, and the relative position and orientation of the objects the user has drawn in the space change as well. As a result, the user wearing the head-mounted display 100 can observe the objects he or she has drawn stereoscopically while moving through the space.

  The image providing unit 870 gives the video data of the virtual space generated by the three-dimensional rendering unit 860 to the head mounted display 100.

  FIG. 12 is a diagram explaining how a user wearing the head-mounted display 100 draws objects 600a, 600b, 600c, and 600d in the virtual space by moving the marker 300. While the user presses a button on the controller 350, an object can be drawn in space using the marker 300, and the user can observe the drawn objects in the virtual space displayed on the head-mounted display 100. When the user changes position or head direction, the user's viewpoint position and line-of-sight direction in the virtual space change, and the user sees the objects he or she has drawn in the three-dimensional virtual space from the new line of sight at the new viewpoint position. Since the drawn objects also carry three-dimensional information, their apparent position and orientation change to reflect the user's viewpoint position and line-of-sight direction.

  FIG. 13A is a diagram explaining the relationship between the objects 610a, 610b, and 610c in the three-dimensional virtual space and the viewer camera 400, and FIG. 13B shows an image of the three-dimensional space viewed from the viewer camera 400.

  The objects 610a, 610b, and 610c shown in FIG. 13A were drawn by the user moving the marker 300 in space. When the viewpoint position/line-of-sight direction setting unit 850 changes the position and direction of the viewer camera 400 viewing the virtual space, that is, the user's viewpoint position and line-of-sight direction, the image of the virtual space generated by the three-dimensional rendering unit 860 changes accordingly. FIG. 13B shows the image of the virtual space viewed from the position and direction of the viewer camera 400 in FIG. 13A; parts of the objects 610a and 610c are visible to the user.

(Third embodiment)
FIG. 14 is a functional configuration diagram of the user interface processing apparatus 900 according to the present embodiment. This figure depicts a block diagram focusing on functions, and these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.

  The user interface processing device 900 is implemented in the game machine 200 to which the head-mounted display 100 is connected, but at least a part of its functions may instead be implemented in the control unit 10 on the head-mounted display 100 side. Alternatively, at least a part of the functions of the user interface processing device 900 may be implemented in a server connected to the game machine 200 via a network.

  The tilt / acceleration information acquisition unit 910 is an example of a movement acquisition unit that acquires information regarding the movement of the user wearing the head mounted display. The gesture determination unit 920, the event identification unit 930, and the user interface unit 960 are an example of an image generation unit that generates an image according to a user's movement.

  The tilt/acceleration information acquisition unit 910 acquires information on the tilt and acceleration of the head of the user wearing the head-mounted display 100 from the posture sensor 64 of the head-mounted display 100. The gesture determination unit 920 determines the user's gesture based on the tilt and acceleration of the user's head. For example, gestures such as nodding, tilting the head, and shaking the head are determined from the tilt and acceleration of the head's movement.

  The event identification unit 930 refers to a table, held in the correspondence table storage unit 940, that associates gestures with events, and identifies the event to be generated from the gesture determined by the gesture determination unit 920. For example, nodding corresponds to "confirm", shaking the head corresponds to "cancel", facing up, down, left, or right corresponds to "cursor movement" in that direction, and tilting the head sideways corresponds to "menu display".
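
  A plain dictionary is enough to realize such a correspondence table; the gesture and event names below follow the examples in the text and are otherwise assumptions.

```python
from typing import Optional

# Correspondence table standing in, for this sketch, for the contents of the
# correspondence table storage unit 940.
GESTURE_EVENT_TABLE = {
    "nod": "confirm",
    "shake_head": "cancel",
    "face_up": "cursor_up",
    "face_down": "cursor_down",
    "face_left": "cursor_left",
    "face_right": "cursor_right",
    "tilt_head": "menu_display",
}


def identify_event(gesture: str) -> Optional[str]:
    """Return the event associated with a determined gesture, if any."""
    return GESTURE_EVENT_TABLE.get(gesture)
```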

  The user interface unit 960 reads menu screen information from the menu information storage unit 970 and, based on the event identified by the event identification unit 930, displays the menu screen or selects an item from it, providing the resulting operation screen to the head-mounted display 100.

  FIG. 15A to FIG. 15D are diagrams illustrating examples of menu screens displayed on the head mounted display 100.

  FIG. 15A shows a screen displayed on the head-mounted display 100. FIG. 15B shows the menu 630 displayed when the user tilts his or her head to the left. When the tilt/acceleration information acquisition unit 910 detects that the user's head has tilted to the left, the gesture determination unit 920 associates the action with an event that makes the menu 630 appear on the screen, and the user interface unit 960 slides the menu 630 in from the right edge of the screen (referred to as a "slide-in" operation).

  When the user returns the head to its original position, the menu 630 remains displayed on the screen and an item can be selected from it, as shown in FIG. 15C. For example, turning the head up or down selects the items in the menu 630 in turn, turning to the right executes the selected item, and turning to the left cancels the selection.

  FIG. 15D shows the menu 630 being dismissed when the user tilts his or her head to the right. When the tilt/acceleration information acquisition unit 910 detects that the user's head has tilted to the right, the gesture determination unit 920 associates the action with an event that removes the menu 630 from the screen, and the user interface unit 960 slides the menu 630 out to the right edge of the screen to erase it (referred to as a "slide-out" operation).

  Sliding the menu 630 in or out and selecting an item from it require the user to tilt or turn the head. During these operations, a lock may be applied so that the line-of-sight direction of the panoramic image displayed on the head-mounted display 100 does not change in conjunction with the head movement. Fixing the line-of-sight direction of the panoramic image during operation of the menu 630 avoids confusing menu gestures with view control. Alternatively, the menu 630 may be made operable only while a controller button is pressed.
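
  The lock can be as simple as a flag consulted by the line-of-sight update; a minimal sketch, with illustrative names:

```python
class MenuGazeLock:
    """Freeze the panorama's line of sight while the menu 630 is shown so
    that head gestures drive the menu rather than the view."""

    def __init__(self):
        self.locked = False
        self.locked_angle = 0.0

    def on_menu_slide_in(self, current_angle: float):
        self.locked = True
        self.locked_angle = current_angle   # remember the gaze to hold

    def on_menu_slide_out(self):
        self.locked = False                 # resume head-tracked gaze

    def gaze_angle(self, head_angle: float) -> float:
        return self.locked_angle if self.locked else head_angle
```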

  The present invention has been described based on the embodiments. The embodiments are exemplary; those skilled in the art will understand that various modifications can be made to the combinations of their constituent elements and processing steps, and that such modifications are also within the scope of the present invention.

  10 control unit, 20 input interface, 30 output interface, 32 backlight, 40 communication control unit, 42 network adapter, 44 antenna, 50 storage unit, 60 GPS unit, 62 wireless unit, 64 posture sensor, 70 external input/output terminal interface, 72 external memory, 80 clock unit, 100 head-mounted display, 110 main body unit, 120 forehead contact unit, 130 temporal contact unit, 200 game machine, 700 panoramic image generation device, 710 zoom instruction acquisition unit, 720 sensitivity adjustment unit, 730 angle information acquisition unit, 740 line-of-sight direction changing unit, 750 panoramic image processing unit, 760 panoramic image storage unit, 770 image providing unit, 780 marker detection unit, 790 drift correction unit, 800 virtual space rendering processing device, 810 position/posture information acquisition unit, 820 drawing instruction acquisition unit, 830 object generation unit, 840 object information storage unit, 850 viewpoint position/line-of-sight direction setting unit, 860 three-dimensional rendering unit, 870 image providing unit, 900 user interface processing device, 910 tilt/acceleration information acquisition unit, 920 gesture determination unit, 930 event identification unit, 940 correspondence table storage unit, 960 user interface unit, 970 menu information storage unit.

Claims (9)

  1. An image generation apparatus comprising:
    a movement acquisition unit that acquires information on the movement of a user wearing a head-mounted display;
    an image generation unit that generates an image according to the user's movement acquired by the movement acquisition unit; and
    an image providing unit that provides the generated image to the head-mounted display.
  2. The image generation apparatus according to claim 1, wherein the movement acquisition unit includes an angle information acquisition unit that acquires information about the rotation angle of the head of the user wearing the head-mounted display, and
    the image generation unit includes:
    a line-of-sight direction changing unit that changes the line-of-sight direction in which a panoramic image is viewed based on the information about the rotation angle of the head; and
    a panoramic image processing unit that generates a panoramic image viewed from the changed line-of-sight direction.
  3.   The image generation apparatus according to claim 2, wherein the angle information acquisition unit acquires the information related to the rotation angle of the head from posture information detected by a posture sensor mounted on the head-mounted display.
  4.   The image generation apparatus according to claim 2, wherein the angle information acquisition unit acquires the information related to the rotation angle of the head from position information of a marker mounted on the head-mounted display.
  5. The image generation apparatus according to claim 3, further comprising:
    a marker detection unit that detects position information of a marker mounted on the head-mounted display using an image of the head-mounted display captured by a camera; and
    a drift correction unit that determines whether or not to correct the drift of the posture sensor based on the detected position information of the marker.
  6. The image generation apparatus according to claim 1, wherein the movement acquisition unit includes a position/posture information acquisition unit that acquires information about the position and posture of the user wearing the head-mounted display, and
    the image generation unit includes:
    an object generation unit that generates an object in the virtual space according to the trajectory drawn by a marker tracked by a camera;
    a viewpoint position/line-of-sight direction setting unit that sets a viewpoint position and a line-of-sight direction in which the virtual space is viewed based on the information on the position and posture; and
    a three-dimensional rendering unit that renders the virtual space including the object based on the set viewpoint position and line-of-sight direction.
  7. The image generation apparatus according to claim 1, wherein the movement acquisition unit includes a tilt information acquisition unit that acquires information about the tilt of the head of the user wearing the head-mounted display, and
    the image generation unit includes:
    a gesture determination unit that determines the user's gesture based on the tilt of the head;
    an event identification unit that identifies the event to be generated from the gesture determined by the gesture determination unit with reference to a table in which gestures and events are associated; and
    a user interface unit that provides an operation screen based on the event identified by the event identification unit to the head-mounted display.
  8. An image generation method comprising:
    a movement acquisition step of acquiring information about the movement of a user wearing a head-mounted display;
    an image generation step of generating an image according to the user's movement; and
    an image providing step of providing the generated image to the head-mounted display.
  9. A program for causing a computer to execute:
    a movement acquisition step of acquiring information about the movement of a user wearing a head-mounted display;
    an image generation step of generating an image according to the user's movement; and
    an image providing step of providing the generated image to the head-mounted display.
JP2012134264A 2012-06-13 2012-06-13 Image generation device and image generation method Pending JP2013258614A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012134264A JP2013258614A (en) 2012-06-13 2012-06-13 Image generation device and image generation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2012134264A JP2013258614A (en) 2012-06-13 2012-06-13 Image generation device and image generation method

Publications (1)

Publication Number Publication Date
JP2013258614A true JP2013258614A (en) 2013-12-26

Family

ID=49954686

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012134264A Pending JP2013258614A (en) 2012-06-13 2012-06-13 Image generation device and image generation method

Country Status (1)

Country Link
JP (1) JP2013258614A (en)

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015103621A1 (en) 2014-01-06 2015-07-09 Oculus Vr, Llc Calibration of virtual reality systems
WO2015112361A1 (en) 2014-01-25 2015-07-30 Sony Computer Entertainment America Llc Environmental interrupt in a head-mounted display and utilization of non field of view real estate
WO2015139005A1 (en) * 2014-03-14 2015-09-17 Sony Computer Entertainment Inc. Methods and systems tracking head mounted display (hmd) and calibrations for hmd headband adjustments
JP2015231445A (en) * 2014-06-09 2015-12-24 株式会社バンダイナムコエンターテインメント Program and image generating device
JP2015232783A (en) * 2014-06-09 2015-12-24 株式会社バンダイナムコエンターテインメント Program and image creating device
JP5898756B1 (en) * 2014-11-28 2016-04-06 株式会社コロプラ System, program, and method for operating screen by linking display and multiple controllers connected by network
JP2016092567A (en) * 2014-11-04 2016-05-23 セイコーエプソン株式会社 Head-mounted display device, control method for head-mounted display device, and computer program
WO2016099630A1 (en) * 2014-12-18 2016-06-23 Intel Corporation Head mounted display update buffer
JP2016115329A (en) * 2015-05-22 2016-06-23 株式会社コロプラ Head-mounted display system, method of displaying on head-mounted display, and program
WO2016098688A1 (en) * 2014-12-15 2016-06-23 株式会社コロプラ Head mounted display system, method for performing display in head mounted display and program
JP5944600B1 (en) * 2015-08-28 2016-07-05 株式会社タカラトミー Information processing device with head-mounted display
JP2016127981A (en) * 2016-02-25 2016-07-14 株式会社コロプラ System, program, and method for performing screen operation by interlocking display and plural controllers connected through network
WO2016136794A1 (en) * 2015-02-27 2016-09-01 株式会社ソニー・インタラクティブエンタテインメント Display control program, dislay control device, and display control method
JP2016158794A (en) * 2015-02-27 2016-09-05 株式会社ソニー・インタラクティブエンタテインメント Display control program, display control apparatus, and display control method
JP6002286B1 (en) * 2015-07-14 2016-10-05 株式会社コロプラ Head mounted display control method and head mounted display control program
DE102016107202A1 (en) 2015-04-20 2016-10-20 Fanuc Corporation Display system
WO2017037962A1 (en) 2015-08-28 2017-03-09 株式会社タカラトミー Information processing device provided with head-mounted display
JP2017055173A (en) * 2015-09-07 2017-03-16 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus and image generation method
JP6159455B1 (en) * 2016-09-02 2017-07-05 株式会社コロプラ Method, program, and recording medium for providing virtual space
US9721396B2 (en) 2015-03-17 2017-08-01 Colopl, Inc. Computer and computer system for controlling object manipulation in immersive virtual space
CN107211195A (en) * 2015-02-12 2017-09-26 日商可乐普拉股份有限公司 Use the device and system of the content audiovisual of head-mounted display
JP2017211912A (en) * 2016-05-27 2017-11-30 株式会社コロプラ Display control method and program for causing computer to execute the method
JP6242452B1 (en) * 2016-08-22 2017-12-06 株式会社コロプラ Method for providing virtual space, method for providing virtual experience, program, and recording medium
JP2017224003A (en) * 2016-05-17 2017-12-21 株式会社コロプラ Method, program, and storage medium for providing virtual space
US20180028915A1 (en) * 2015-02-27 2018-02-01 Sony Interactive Entertainment Inc. Display control program, dislay control apparatus and display control method
JP6277567B1 (en) * 2016-11-21 2018-02-14 株式会社コナミデジタルエンタテインメント Terminal device and program
JP2018061667A (en) * 2016-10-12 2018-04-19 株式会社カプコン Game program and game device
US9986207B2 (en) 2013-03-15 2018-05-29 Sony Interactive Entertainment America Llc Real time virtual reality leveraging web cams and IP cams and web cam and IP cam networks
JP2018085125A (en) * 2017-12-28 2018-05-31 株式会社コナミデジタルエンタテインメント Terminal device and program
JP2018094086A (en) * 2016-12-13 2018-06-21 株式会社ソニー・インタラクティブエンタテインメント Information processing device and image formation method
US10216738B1 (en) 2013-03-15 2019-02-26 Sony Interactive Entertainment America Llc Virtual reality interaction with 3D printing
US10297062B2 (en) 2014-03-18 2019-05-21 Seiko Epson Corporation Head-mounted display device, control method for head-mounted display device, and computer program
US10320946B2 (en) 2013-03-15 2019-06-11 Sony Interactive Entertainment America Llc Virtual reality universe representation changes viewing based upon client side parameters
US10356215B1 (en) 2013-03-15 2019-07-16 Sony Interactive Entertainment America Llc Crowd and cloud enabled virtual reality distributed location network
US10474711B1 (en) 2013-03-15 2019-11-12 Sony Interactive Entertainment America Llc System and methods for effective virtual reality visitor interface
US10539797B2 (en) 2016-05-06 2020-01-21 Colopl, Inc. Method of providing virtual space, program therefor, and recording medium
US10565249B1 (en) 2013-03-15 2020-02-18 Sony Interactive Entertainment America Llc Real time unified communications interaction of a predefined location in a virtual reality location
US10579109B2 (en) 2014-09-19 2020-03-03 Sony Corporation Control device and control method
US10587763B2 (en) 2017-12-11 2020-03-10 Fuji Xerox Co., Ltd. Information processing apparatus and non-transitory computer readable medium for combining video contents and user image
US10599707B1 (en) 2013-03-15 2020-03-24 Sony Interactive Entertainment America Llc Virtual reality enhanced through browser connections
WO2020059327A1 (en) * 2018-09-18 2020-03-26 ソニー株式会社 Information processing device, information processing method, and program
US10620699B2 (en) 2014-10-22 2020-04-14 Sony Interactive Entertainment Inc. Head mounted display, mobile information terminal, image processing apparatus, display control program, display control method, and display system
US10728401B2 (en) 2017-12-11 2020-07-28 Fuji Xerox Co., Ltd. Information processing apparatus, information processing method, and non-transitory computer readable medium capable of outputting image
US10809798B2 (en) 2014-01-25 2020-10-20 Sony Interactive Entertainment LLC Menu navigation in a head-mounted display
US10832547B2 (en) 2017-12-12 2020-11-10 Fuji Xerox Co., Ltd. Information processing apparatus and non-transitory computer readable medium

Cited By (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10320946B2 (en) 2013-03-15 2019-06-11 Sony Interactive Entertainment America Llc Virtual reality universe representation changes viewing based upon client side parameters
US10356215B1 (en) 2013-03-15 2019-07-16 Sony Interactive Entertainment America Llc Crowd and cloud enabled virtual reality distributed location network
US9986207B2 (en) 2013-03-15 2018-05-29 Sony Interactive Entertainment America Llc Real time virtual reality leveraging web cams and IP cams and web cam and IP cam networks
US10599707B1 (en) 2013-03-15 2020-03-24 Sony Interactive Entertainment America Llc Virtual reality enhanced through browser connections
US10216738B1 (en) 2013-03-15 2019-02-26 Sony Interactive Entertainment America Llc Virtual reality interaction with 3D printing
US10565249B1 (en) 2013-03-15 2020-02-18 Sony Interactive Entertainment America Llc Real time unified communications interaction of a predefined location in a virtual reality location
US10474711B1 (en) 2013-03-15 2019-11-12 Sony Interactive Entertainment America Llc System and methods for effective virtual reality visitor interface
US9779540B2 (en) 2014-01-06 2017-10-03 Oculus Vr, Llc Calibration of virtual reality systems
EP3017591A4 (en) * 2014-01-06 2017-09-27 Oculus VR, LLC Calibration of virtual reality systems
JP2017142813A (en) * 2014-01-06 2017-08-17 オキュラス ブイアール,エルエルシー Calibration of virtual reality system
US10001834B2 (en) 2014-01-06 2018-06-19 Oculus Vr, Llc Calibration of multiple rigid bodies in a virtual reality system
WO2015103621A1 (en) 2014-01-06 2015-07-09 Oculus Vr, Llc Calibration of virtual reality systems
CN106164993B (en) * 2014-01-25 2020-10-20 索尼互动娱乐美国有限责任公司 Utilization of environmental disruptions and non-visual real estate in head mounted displays
US10809798B2 (en) 2014-01-25 2020-10-20 Sony Interactive Entertainment LLC Menu navigation in a head-mounted display
US10096167B2 (en) 2014-01-25 2018-10-09 Sony Interactive Entertainment America Llc Method for executing functions in a VR environment
WO2015112361A1 (en) 2014-01-25 2015-07-30 Sony Computer Entertainment America Llc Environmental interrupt in a head-mounted display and utilization of non field of view real estate
CN106164993A (en) * 2014-01-25 2016-11-23 索尼互动娱乐美国有限责任公司 Environment in head mounted display interrupts and the praedial utilization in the non-visual field
EP3097552A4 (en) * 2014-01-25 2018-01-24 Sony Computer Entertainment America LLC Environmental interrupt in a head-mounted display and utilization of non field of view real estate
WO2015139005A1 (en) * 2014-03-14 2015-09-17 Sony Computer Entertainment Inc. Methods and systems tracking head mounted display (hmd) and calibrations for hmd headband adjustments
US10409364B2 (en) 2014-03-14 2019-09-10 Sony Interactive Entertainment Inc. Methods and systems tracking head mounted display (HMD) and calibrations for HMD headband adjustments
JP2017516187A (en) * 2014-03-14 2017-06-15 株式会社ソニー・インタラクティブエンタテインメント Calibration method and system for head mounted display (HMD) tracking and HMD headband adjustment
US9710057B2 (en) 2014-03-14 2017-07-18 Sony Interactive Entertainment Inc. Methods and systems including tracking a head mounted display (HMD) and calibrations for HMD headband adjustments
US10297062B2 (en) 2014-03-18 2019-05-21 Seiko Epson Corporation Head-mounted display device, control method for head-mounted display device, and computer program
JP2015231445A (en) * 2014-06-09 2015-12-24 株式会社バンダイナムコエンターテインメント Program and image generating device
JP2015232783A (en) * 2014-06-09 2015-12-24 株式会社バンダイナムコエンターテインメント Program and image creating device
US10579109B2 (en) 2014-09-19 2020-03-03 Sony Corporation Control device and control method
US10620699B2 (en) 2014-10-22 2020-04-14 Sony Interactive Entertainment Inc. Head mounted display, mobile information terminal, image processing apparatus, display control program, display control method, and display system
JP2016092567A (en) * 2014-11-04 2016-05-23 セイコーエプソン株式会社 Head-mounted display device, control method for head-mounted display device, and computer program
CN106999772A (en) * 2014-11-28 2017-08-01 日商可乐普拉股份有限公司 Display is set to carry out the system, program and method of screen operation in linkage with multiple controllers via network connection
WO2016084941A1 (en) * 2014-11-28 2016-06-02 株式会社コロプラ System, program, and method for operating screen by linking display and plurality of controllers connected via network
JP5898756B1 (en) * 2014-11-28 2016-04-06 株式会社コロプラ System, program, and method for operating screen by linking display and multiple controllers connected by network
US9940754B2 (en) 2014-12-15 2018-04-10 Colopl, Inc. Head-mounted display system and method for presenting display on head-mounted display
WO2016098688A1 (en) * 2014-12-15 2016-06-23 株式会社コロプラ Head mounted display system, method for performing display in head mounted display and program
US10553033B2 (en) 2014-12-15 2020-02-04 Colopl, Inc. Head-mounted display system and method for presenting display on head-mounted display
US9542718B2 (en) 2014-12-18 2017-01-10 Intel Corporation Head mounted display update buffer
WO2016099630A1 (en) * 2014-12-18 2016-06-23 Intel Corporation Head mounted display update buffer
CN107211195A (en) * 2015-02-12 2017-09-26 日商可乐普拉股份有限公司 Use the device and system of the content audiovisual of head-mounted display
US9958937B2 (en) 2015-02-12 2018-05-01 Colopl, Inc. Device and system for viewing content using head-mounted display
CN107211195B (en) * 2015-02-12 2020-04-24 日商可乐普拉股份有限公司 Apparatus and system for viewing and listening to content using head mounted display
US20180028915A1 (en) * 2015-02-27 2018-02-01 Sony Interactive Entertainment Inc. Display control program, dislay control apparatus and display control method
US10596464B2 (en) * 2015-02-27 2020-03-24 Sony Interactive Entertainment Inc. Display control program, display control apparatus and display control method
US10712575B2 (en) 2015-02-27 2020-07-14 Sony Interactive Entertainment Inc. Display control apparatus, display control method, and recording medium for setting viewpoint and sightline in a virtual three- dimensional space
WO2016136794A1 (en) * 2015-02-27 2016-09-01 株式会社ソニー・インタラクティブエンタテインメント Display control program, dislay control device, and display control method
JP2016158794A (en) * 2015-02-27 2016-09-05 株式会社ソニー・インタラクティブエンタテインメント Display control program, display control apparatus, and display control method
US9721396B2 (en) 2015-03-17 2017-08-01 Colopl, Inc. Computer and computer system for controlling object manipulation in immersive virtual space
US10268433B2 (en) 2015-04-20 2019-04-23 Fanuc Corporation Display system
DE102016107202A1 (en) 2015-04-20 2016-10-20 Fanuc Corporation Display system
JP2016115329A (en) * 2015-05-22 2016-06-23 株式会社コロプラ Head-mounted display system, method of displaying on head-mounted display, and program
JP6002286B1 (en) * 2015-07-14 2016-10-05 株式会社コロプラ Head mounted display control method and head mounted display control program
US10115235B2 (en) 2015-07-14 2018-10-30 Colopl, Inc. Method for controlling head mounted display, and system for implemeting the method
WO2017037962A1 (en) 2015-08-28 2017-03-09 株式会社タカラトミー Information processing device provided with head-mounted display
US9703102B2 (en) 2015-08-28 2017-07-11 Tomy Company Ltd. Information processing device including head mounted display
JP5944600B1 (en) * 2015-08-28 2016-07-05 株式会社タカラトミー Information processing device with head-mounted display
WO2017043398A1 (en) * 2015-09-07 2017-03-16 株式会社ソニー・インタラクティブエンタテインメント Information processing device and image generation method
US10614589B2 (en) 2015-09-07 2020-04-07 Sony Interactive Entertainment Inc. Information processing apparatus and image generating method
JP2017055173A (en) * 2015-09-07 2017-03-16 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus and image generation method
JP2016127981A (en) * 2016-02-25 2016-07-14 株式会社コロプラ System, program, and method for performing screen operation by interlocking display and plural controllers connected through network
US10539797B2 (en) 2016-05-06 2020-01-21 Colopl, Inc. Method of providing virtual space, program therefor, and recording medium
JP2017224003A (en) * 2016-05-17 2017-12-21 株式会社コロプラ Method, program, and storage medium for providing virtual space
JP2017211912A (en) * 2016-05-27 2017-11-30 株式会社コロプラ Display control method and program for causing computer to execute the method
JP6242452B1 (en) * 2016-08-22 2017-12-06 株式会社コロプラ Method for providing virtual space, method for providing virtual experience, program, and recording medium
JP6159455B1 (en) * 2016-09-02 2017-07-05 株式会社コロプラ Method, program, and recording medium for providing virtual space
JP2018061667A (en) * 2016-10-12 2018-04-19 株式会社カプコン Game program and game device
JP6277567B1 (en) * 2016-11-21 2018-02-14 株式会社コナミデジタルエンタテインメント Terminal device and program
JP2018094086A (en) * 2016-12-13 2018-06-21 株式会社ソニー・インタラクティブエンタテインメント Information processing device and image formation method
US10369468B2 (en) 2016-12-13 2019-08-06 Sony Interactive Entertainment Inc. Information processing apparatus, image generating method, and program
US10587763B2 (en) 2017-12-11 2020-03-10 Fuji Xerox Co., Ltd. Information processing apparatus and non-transitory computer readable medium for combining video contents and user image
US10728401B2 (en) 2017-12-11 2020-07-28 Fuji Xerox Co., Ltd. Information processing apparatus, information processing method, and non-transitory computer readable medium capable of outputting image
US10832547B2 (en) 2017-12-12 2020-11-10 Fuji Xerox Co., Ltd. Information processing apparatus and non-transitory computer readable medium
JP2018085125A (en) * 2017-12-28 2018-05-31 株式会社コナミデジタルエンタテインメント Terminal device and program
WO2020059327A1 (en) * 2018-09-18 2020-03-26 ソニー株式会社 Information processing device, information processing method, and program

Similar Documents

Publication Publication Date Title
US10606380B2 (en) Display control apparatus, display control method, and display control program
US10198870B2 (en) Information processing apparatus, information processing system, and information processing method
US9684173B2 (en) Image processing device, image processing method, and image processing system
US9442567B2 (en) Gaze swipe selection
CN105319718B (en) Wearable glasses and method of displaying image via wearable glasses
EP3195105B1 (en) Management of content in a 3d holographic environment
RU2638776C1 (en) Image generating device and method
US10268276B2 (en) Autonomous computing and telecommunications head-up displays glasses
JP6359644B2 (en) Method for facilitating computer vision application initialization
US9857589B2 (en) Gesture registration device, gesture registration program, and gesture registration method
US9552060B2 (en) Radial selection by vestibulo-ocular reflex fixation
US9256987B2 (en) Tracking head movement when wearing mobile device
US9928650B2 (en) Computer program for directing line of sight
JP5580855B2 (en) Obstacle avoidance device and obstacle avoidance method
US10521026B2 (en) Passive optical and inertial tracking in slim form-factor
US9401050B2 (en) Recalibration of a flexible mixed reality device
JP5659304B2 (en) Image generating apparatus and image generating method
JP5659305B2 (en) Image generating apparatus and image generating method
JP6515813B2 (en) Information processing apparatus, information processing method, and program
US8350896B2 (en) Terminal apparatus, display control method, and display control program
US20160378176A1 (en) Hand And Body Tracking With Mobile Device-Based Virtual Reality Head-Mounted Display
US9602809B2 (en) Storage medium storing display control program for providing stereoscopic display desired by user with simpler operation, display control device, display control method, and display control system
CN103180893B (en) For providing the method and system of three-dimensional user interface
US9728010B2 (en) Virtual representations of real-world objects
JP4900741B2 (en) Image recognition apparatus, operation determination method, and program