WO2021192589A1 - Information processing device and information processing method, computer program, and augmented reality system - Google Patents
Information processing device and information processing method, computer program, and augmented reality system
- Publication number
- WO2021192589A1 (PCT/JP2021/002883; JP2021002883W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual
- user
- hand
- gripping
- virtual object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0308—Detection arrangements using opto-electronic means comprising a plurality of distinctive and separately oriented light emitters or reflectors associated to the pointing device, e.g. remote cursor controller with distinct and separately oriented LEDs at the tip whose radiations are captured by a photo-detector associated to the screen
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
Definitions
- this disclosure relates to an information processing device and an information processing method for processing information related to augmented reality, a computer program, and an augmented reality system.
- VR (virtual reality)
- AR (augmented reality)
- MR (mixed reality)
- VR is a technology that allows virtual space to be perceived as reality.
- AR is a technology that expands the real space seen by the user by adding, emphasizing, attenuating, or deleting information to the real environment surrounding the user.
- MR is a technology for displaying a virtual object (hereinafter also referred to as a “virtual object”) that replaces an object in real space, mixing the real and the virtual.
- AR and MR are realized by using, for example, a see-through type head-mounted display (hereinafter, also referred to as “AR glass”).
- In OSs (Operating Systems) such as Windows and Linux (registered trademark), operations are performed on running applications via input devices such as keyboards, mice, and touch panels.
- In fields such as VR and AR, an input form is known in which a user wearing a head-mounted display explores a virtual space while operating a controller held in his or her hand (see, for example, Patent Document 1).
- An object of the present disclosure is to provide an information processing device and an information processing method for processing information related to augmented reality, a computer program, and an augmented reality system.
- The first aspect of the present disclosure is an information processing device comprising: an acquisition unit that acquires the position of the user's hand and gestures of the fingers; and a control unit that controls the display operation of a display device that superimposes and displays virtual objects in real space, wherein the control unit sets a virtual gripping point at a position having a certain offset with respect to the hand, and controls the display device so as to display a gripping operation of a virtual object based on the positional relationship between the virtual gripping point and the virtual object and the contact state between the fingertips.
- the acquisition unit acquires the position of the hand and the gesture of the fingers based on the sensor information from the sensor attached to the back of the hand, or includes a sensor attached to the back of the hand.
- the acquisition unit further acquires the posture of the user's fingers.
- The control unit controls mode switching, based on the posture information of the fingers, between a gripping operation mode in which the virtual object is gripped by contact between the fingertips and a contact operation mode in which the virtual object is touched by the palm or the fingertips. The control unit further controls mode switching to a button operation mode in which a virtual button is pressed with a fingertip.
- The second aspect of the present disclosure is an information processing method having: an acquisition step of acquiring the position of the user's hand and gestures of the fingers; and a control step of controlling the display operation of a display device that superimposes and displays virtual objects in real space, wherein in the control step, a virtual gripping point is set at a position having a certain offset with respect to the hand, and the display device is controlled so as to display a gripping operation of a virtual object based on the positional relationship between the virtual gripping point and the virtual object and the contact state between the fingertips.
- The third aspect of the present disclosure is a computer program written in a computer-readable format so as to cause a computer to function as: an acquisition unit that acquires the position of the user's hand and gestures of the fingers; and a control unit that controls the display operation of a display device that superimposes and displays virtual objects in real space, wherein the control unit sets a virtual gripping point at a position having a certain offset with respect to the hand, and controls the display device so as to display a gripping operation of a virtual object based on the positional relationship between the virtual gripping point and the virtual object and the contact state between the fingertips.
- The computer program according to the third aspect of the present disclosure defines a computer program written in a computer-readable format so as to realize predetermined processing on a computer. A cooperative action is thereby exerted on the computer, and the same effects as those of the information processing device according to the first aspect of the present disclosure can be obtained.
- The fourth aspect of the present disclosure is an augmented reality system comprising: a display device that superimposes and displays virtual objects in real space; an acquisition unit that acquires the position of the user's hand and gestures of the fingers; and a control unit that controls the display operation of the display device, wherein the control unit sets a virtual gripping point at a position having a certain offset with respect to the hand, and controls the display device so as to display a gripping operation of a virtual object based on the positional relationship between the virtual gripping point and the virtual object and the contact state between the fingertips.
- The term “system” here means a logical assembly of a plurality of devices (or functional modules that realize specific functions), and it does not matter whether or not each device or functional module is in a single housing.
- According to the present disclosure, it is possible to provide an information processing device and an information processing method, a computer program, and an augmented reality system that realize interaction of a virtual object with the user's hand or fingers.
- FIG. 1 is a diagram showing the back of a user's hand on which the controller 10 is installed using the belt 11.
- FIG. 2 is a diagram showing the palm of a user on which the controller 10 is installed using the belt 11.
- FIG. 3 is a view showing a side surface of a user's hand on which the controller 10 is installed using the belt 11.
- FIG. 4 is a diagram showing a user wearing the AR glass 41 on the head and the controllers 42 and 43 on both hands, respectively.
- FIG. 5 is a diagram showing a functional configuration example of the AR system 100.
- FIG. 6 is a diagram showing a state in which AR glasses are attached to the user's head.
- FIG. 7 is a diagram showing a configuration example of an AR system 700 including an AR glass 701 and a controller 702.
- FIG. 8 is a diagram showing a configuration example of an AR system 800 including an AR glass 801, a controller 802, and an information terminal 803.
- FIG. 9 is a diagram showing a specific configuration example of the controller 110.
- FIG. 10 is a diagram showing a specific configuration example of the controller 110.
- FIG. 11 is a diagram showing a specific configuration example of the controller 110.
- FIG. 12 is a diagram showing an example of a functional configuration included in the control unit 140.
- FIG. 13 is a diagram showing how virtual objects are arranged around the user.
- FIG. 14 is a diagram for explaining a mechanism for displaying a virtual object so that the AR glass follows the movement of the user's head.
- FIG. 15 is a diagram showing a state according to the distance between the user's hand and the virtual object.
- FIG. 16 is a diagram showing how the user grips and operates the virtual object.
- FIG. 17 is a diagram showing how the holding of the virtual object is executed when the position of the fingertip is on the surface of the virtual object.
- FIG. 18 is a diagram showing how the virtual object is grasped when the thumb and the index finger come into contact with each other inside the virtual object.
- FIG. 19 is a diagram showing a state in which a virtual gripping point is set at a position having a constant offset with respect to the controller 110 main body.
- FIG. 20 is a diagram showing a specific example of the virtual gripping point.
- FIG. 21 is a diagram showing a gripping flow of a virtual object using a virtual gripping point.
- FIG. 22 is a diagram showing a gripping flow of a virtual object using a virtual gripping point.
- FIG. 23 is a diagram showing a gripping flow of a virtual object using a virtual gripping point.
- FIG. 24 is a diagram showing a gripping flow of a virtual object using a virtual gripping point.
- FIG. 25 is a diagram showing a gripping flow of a virtual object using a virtual gripping point.
- FIG. 26 is a diagram showing a gripping flow of a virtual object using a virtual gripping point.
- FIG. 27 is a diagram showing the mode transition of the AR system 100.
- FIG. 28 is a diagram showing the posture of the fingers in the gripping operation mode.
- FIG. 29 is a diagram showing the posture of the fingers in the contact operation mode.
- FIG. 30 is a diagram showing the behavior due to the contact between the hand and the virtual object.
- FIG. 31 is a diagram showing the behavior due to the contact between the hand and the virtual object.
- FIG. 32 is a diagram showing the posture of the fingers in the button operation mode.
- FIG. 33 is a diagram showing the mode transition of the AR system 100.
- FIG. 34 is a diagram showing a state in which a virtual pressing point is set on the fingertip in the button operation mode.
- FIG. 35 is a diagram showing how the virtual button is pressed in the button operation mode.
- FIG. 36 is a flowchart showing a processing procedure for determining the operation mode of the user.
- FIG. 37 is a diagram showing a method of calibrating the virtual gripping point.
- FIG. 38 is a diagram showing a method of calibrating the virtual gripping point.
- FIG. 39 is a diagram showing a method of performing running calibration of the virtual gripping point.
- FIG. 40 is a diagram showing a method of performing running calibration of a virtual gripping point.
- FIG. 41 is a diagram showing a method of performing running calibration of the virtual gripping point.
- FIG. 42 is a diagram showing a method of performing running calibration of the virtual gripping point.
- FIG. 43 is a diagram showing a display example of a virtual gripping point.
- FIG. 44 is a diagram showing a display example of a virtual gripping point.
- FIG. 45 is a diagram showing a display example of a virtual gripping point.
- A. System configuration In fields such as VR and AR, an input form is known in which a user wearing a head-mounted display explores a virtual space while operating a controller held in his or her hand (see, for example, Patent Document 1). However, it is preferable that the user can carry out everyday actions in the real space, such as walking and grasping objects (including both real objects and virtual objects), while looking over the real space through the AR glass. Therefore, it is preferable that the fingers are not restrained by gripping a controller and can be used freely.
- As an input method that does not restrain the user's fingers, there is a method of detecting the movement of the user's hand from an image taken by a camera.
- For example, the bones of the user's fingers are extracted using an RGB camera or a ToF (Time Of Flight) camera attached to the AR glass facing outward, and the position and posture of the fingers and finger gestures are recognized.
- However, the method of detecting the user's hand from a camera image has the problem of occlusion, and the problem that the hand cannot be detected outside the camera's angle of view.
- A controller used for hand position detection, finger posture recognition, finger gesture recognition, and the like is therefore installed on the user's hand, and the AR system is configured so that the fingers can be used freely. Further, in order to grasp a real object or a virtual object with the hand on which the controller is installed, or to place a virtual object on the palm, it is preferable to keep the palm free. Therefore, it is preferable to install the controller on the back of the hand.
- The finger gestures referred to here indicate, for example, whether the fingertips of the thumb and another finger (such as the index finger) are in contact with each other or separated from each other. Further, in the present embodiment, it is essential that the controller is equipped with a hand position detection function and a finger gesture recognition function, but it is not essential that the controller is equipped with a finger posture recognition function.
- Although FIGS. 1 to 3 show an example in which the controller 10 is attached to the user's left hand, a controller 10 having a symmetrical shape can also be attached to the right hand.
- FIG. 4 shows how the user wears the AR glass 41 on the head and the controllers 42 and 43 on both hands, respectively.
- each of the controllers 42 and 43 has the functions of hand position detection, finger posture recognition, and finger gesture recognition.
- the AR glass 41 has a function of superimposing and displaying a virtual object in the real space.
- the AR glass 41 can recognize the positions of the left and right hands, the postures of the fingers, and the gestures of the fingers through the controllers 42 and 43. Further, the AR glass 41 has a function of detecting the position and posture of the user's head. Therefore, the AR glass 41 can detect the relative positions of the user's head and the controllers 42 and 43, in other words, the relative positions of the user's left and right hands. Further, since the coordinate position of the virtual object displayed by the AR glass 41 in the real space is grasped, the relative position between the user's left and right hands and the virtual object can be detected.
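- To make the above concrete, the following is a minimal Python sketch (not taken from the patent; all function and variable names are illustrative assumptions) of how positions reported in the head frame by the head sensor unit and the hand-worn controllers could be brought into a common world frame so that the relative position between a hand and a displayed virtual object can be computed.

```python
# Minimal sketch (illustrative): bringing head-frame positions into the world frame
# so that the relative position of each hand and a virtual object can be computed.
import numpy as np

def to_world(head_position, head_rotation, local_position):
    """Convert a position expressed in the head (AR glass) frame to the world frame.
    head_rotation is a 3x3 rotation matrix; positions are 3-vectors."""
    return head_position + head_rotation @ local_position

def relative_position(world_a, world_b):
    """Vector from point a to point b, both expressed in the world frame."""
    return world_b - world_a

# Example: head pose from the head sensor unit, controller positions from the
# hand position detection units, virtual object placed at a known world coordinate.
head_pos = np.array([0.0, 1.6, 0.0])            # user's head in the world frame
head_rot = np.eye(3)                            # looking straight ahead
left_hand_in_head = np.array([-0.2, -0.4, 0.4])
right_hand_in_head = np.array([0.2, -0.4, 0.4])
virtual_object_world = np.array([0.1, 1.2, 0.5])

left_hand_world = to_world(head_pos, head_rot, left_hand_in_head)
right_hand_world = to_world(head_pos, head_rot, right_hand_in_head)

print(relative_position(left_hand_world, virtual_object_world))
print(relative_position(right_hand_world, virtual_object_world))
```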
- FIG. 5 shows an example of the functional configuration of the AR system 100 including the AR glass and the controller installed on the back of the user's hand.
- The illustrated AR system 100 includes a controller 110 installed on the back of the user's hand, a head sensor unit 120, a display unit 131 for displaying virtual objects on the AR glass, and a control unit 140 that comprehensively controls the operation of the entire AR system 100.
- the controller 110 includes a hand position detection unit 111, a finger posture recognition unit 112, a finger gesture recognition unit 113, and a tactile feedback unit 114.
- the head sensor unit 120 which is mounted on the AR glass, includes an outward camera 121, an inward camera 122, a microphone 123, a gyro sensor 124, an acceleration sensor 125, and a directional sensor 126.
- In FIG. 5, only one controller 110 is drawn for simplification of the drawing, but when a controller 110 is installed on each of the user's left and right hands, the AR system 100 includes two controllers 110.
- the AR system 100 may further include a speaker 132 that outputs an audio signal such as a voice related to a virtual object, and a communication unit 133 for the AR system 100 to communicate with the outside.
- the control unit 140 may be equipped with a large-scale storage unit 150 including an SSD (Solid State Drive) or the like.
- The AR glass main body is generally a spectacle-type or goggle-type device that is worn on the user's head and can superimpose digital information on the visual field of both eyes or one eye of the user, emphasize or attenuate a specific real object, or delete a specific real object so that it appears as if it does not exist.
- FIG. 6 shows a state in which AR glasses are attached to the user's head.
- a display unit 131 for the left eye and a display unit 131 for the right eye are arranged in front of the left and right eyes of the user, respectively.
- The display unit 131 is transparent or translucent, and superimposes and displays a virtual object at a predetermined position in the real space, emphasizes or attenuates a specific real object, or deletes a specific real object so that it appears as if it does not exist.
- the left and right display units 131 may be independently displayed and driven, for example, to display a parallax image, that is, a virtual object in 3D.
- an outward camera 121 directed toward the user's line of sight is arranged substantially in the center of the AR glass.
- the AR system 100 can be composed of two devices, for example, an AR glass worn on the head of the user and a controller worn on the back of the user's hand. However, when the controllers are installed on the backs of the left and right hands of the user, the AR system 100 is composed of three devices, an AR glass and two controllers.
- FIG. 7 shows a configuration example of an AR system 700 including an AR glass 701 and a controller 110.
- the AR glass 701 includes a control unit 140, a storage unit 150, a head sensor unit 120, a display unit 131, a speaker 132, and a communication unit 133.
- the controller 110 includes a hand position detection unit 111, a finger posture recognition unit 112, a finger gesture recognition unit 113, and a tactile feedback unit 114.
- the AR system 100 is composed of three devices: an AR glass worn by the user on the head, a controller worn on the back of the user's hand, and an information terminal such as a smartphone or tablet.
- FIG. 8 shows a configuration example of an AR system 800 including an AR glass 801, a controller 110, and an information terminal 803.
- the AR glass 801 includes a display unit 131, a speaker 132, and a head sensor unit 120.
- the controller 110 includes a hand position detection unit 111, a finger posture recognition unit 112, a finger gesture recognition unit 113, and a tactile feedback unit 114.
- the information terminal 803 includes a control unit 140, a storage unit 150, and a communication unit 133.
- the specific device configuration of the AR system 100 is not limited to FIGS. 7 and 8. Further, the AR system 100 may further include components other than those shown in FIG.
- the controller 110 includes a hand position detection unit 111, a finger posture recognition unit 112, a finger gesture recognition unit 113, and a tactile feedback unit 114.
- the hand position detection unit 111 detects the position of the user's hand.
- the finger posture recognition unit 112 recognizes the posture of the user's fingers. In this embodiment, the finger posture recognition unit 112 is not essential.
- The finger gesture recognition unit 113 recognizes finger gestures, for example, whether the fingertips of the thumb and another finger (such as the index finger) are in contact with each other or separated from each other.
- the tactile feedback unit 114 is configured by arranging, for example, electromagnetic type or piezoelectric type vibrators in an array, and provides tactile feedback by presenting vibration to the back of the user's hand.
- The tactile feedback unit 114 is provided in the controller 110 installed on the back of the user's hand, but it may also be configured such that vibration is presented by attaching tactile feedback units to one or more parts of the user's body other than the back of the hand.
- the head sensor unit 120 is mounted on the AR glass, and includes an outward camera 121, an inward camera 122, a microphone 123, a gyro sensor 124, an acceleration sensor 125, and a directional sensor 126.
- the outward-facing camera 121 is composed of, for example, an RGB camera, and is installed so as to photograph the outside of the AR glass, that is, the front direction of the user wearing the AR glass.
- The outward camera 121 can capture the movement of the user's fingers, but it cannot capture the movement of the fingers when the user's fingers are hidden behind an obstacle, when the fingertips are hidden by the back of the hand, or when the user's hand is turned behind the body.
- the outward-facing camera 121 may further include any one of an IR camera and a ToF camera including an IR light emitting unit and an IR light receiving unit.
- For example, a retroreflective material is attached to an object to be captured, such as the back of the hand, and the IR camera emits infrared light and receives the infrared light reflected by the retroreflective material.
- Alternatively, the IR camera receives light from a marker that emits infrared light, or from a dot pattern of a plurality of IR light sources installed on the controller.
- the image signal captured by the outward camera 121 is transferred to the control unit 140.
- the microphone 123 may be a single sound collecting element or a microphone array including a plurality of sound collecting elements.
- the microphone 123 collects the voice of the user wearing the AR glass and the ambient sound of the user.
- the audio signal picked up by the microphone 123 is transferred to the control unit 140.
- the gyro sensor 124, the acceleration sensor 125, and the azimuth sensor 126 may be composed of an IMU.
- the sensor signals of the gyro sensor 124, the acceleration sensor 125, and the directional sensor 126 are transferred to the control unit 140.
- the control unit 140 can detect the position and posture of the head of the user wearing the AR glasses based on these sensor signals.
- the display unit 131 is composed of a transmissive display (glasses lens, etc.) installed in front of both eyes or one eye of the user wearing AR glasses, and is used for displaying a virtual space. Specifically, the display unit 131 expands the real space as seen by the user by displaying information (virtual objects) and emphasizing, attenuating, or deleting real objects. The display unit 131 performs a display operation based on a control signal from the control unit 140. Further, the mechanism for see-through display of virtual objects on the display unit 131 is not particularly limited.
- the speaker 132 is composed of a single sounding element or an array of a plurality of sounding elements, and is installed in, for example, an AR glass.
- the speaker 132 outputs the sound related to the virtual object displayed on the display unit 131, but other audio signals may be output.
- the communication unit 133 has a wireless communication function such as Wi-Fi (registered trademark) or Bluetooth (registered trademark).
- the communication unit 133 mainly performs a communication operation for realizing data exchange between the control unit 140 and an external system (not shown).
- the control unit 140 is installed in the AR glass or is arranged in a device (smartphone or the like) separated from the AR glass together with a drive power source such as a storage unit 150 or a battery.
- the control unit 140 executes various programs read from the storage unit 150 to perform various processes.
- the controller 110 is an input device for the AR system 100 according to the present embodiment, which corresponds to a keyboard, mouse, touch panel, etc. in an OS such as Windows or Linux (registered trademark). As shown in FIGS. 1 to 3, the controller 110 is installed and used on the back of the user's hand. Therefore, the user can freely use the fingers without being restrained by the controller 110. For example, the user can grasp a real object or a virtual object by using the hand on which the controller 110 is installed, or place the virtual object on the palm.
- The controller 110 is a device that provides input to the AR system 100 based on the position of the user's hand, the posture of the fingers, and the gestures of the fingers. Therefore, as shown in FIGS. 5, 7 and 8, the controller 110 includes a hand position detection unit 111, a finger posture recognition unit 112, and a finger gesture recognition unit 113.
- the hand position detection unit 111 detects the position of the user's hand.
- the finger posture recognition unit 112 recognizes the posture of the user's fingers.
- The finger gesture recognition unit 113 recognizes finger gestures, for example, whether the fingertips of the thumb and another finger (such as the index finger) are in contact with each other or separated from each other.
- the controller 110 includes a tactile feedback unit 114 that gives a tactile sensation to the back of the user's hand by presenting vibration.
- FIG. 9 shows a configuration example of the hand position detection unit 111 and the finger gesture recognition unit 113.
- The controller 110 may or may not be equipped with the optional finger posture recognition unit 112, and the finger posture recognition unit 112 will not be described here.
- The hand position detection unit 111 uses an IR detection method. That is, the hand position detection unit 111 is composed of a combination of a plurality of IR reflection markers 901 to 904 (four in the example shown in FIG. 9) attached to the housing 10 of the controller 110 and an IR camera (not shown) provided on the AR glass (or the head sensor unit 120).
- The IR camera includes an IR transmitting unit and an IR receiving unit. The IR signal output from the IR transmitting unit is reflected by each of the IR reflection markers 901 to 904, and by receiving the reflected IR signals with the IR receiving unit, the bright spots of the IR reflection markers 901 to 904 can be detected.
- the IR camera is preferably a stereo type having a plurality of IR receivers.
- The finger gesture recognition unit 113 recognizes finger gestures, for example, whether the fingertips of the thumb and another finger (such as the index finger) are in contact with each other or separated from each other.
- The finger gesture recognition unit 113 uses an electrode detection method. Electrodes 911 and 912 are attached to the fingertips of the user's thumb and index finger, respectively. When the thumb and the index finger come into contact with each other, a current flows between the electrode 911 and the electrode 912, so that the gesture of the thumb and the index finger can be recognized based on the conduction state of the electrodes 911 and 912.
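- As an illustration of the electrode detection method described above, the following Python sketch (names and the debounce length are assumptions, not from the patent) recognizes the thumb-index contact gesture from the conduction state of the fingertip electrodes, with a small debounce so that momentary glitches do not toggle the recognized gesture.

```python
# Minimal sketch (illustrative): recognizing the "thumb-index contact" gesture from
# the conduction state of electrodes on the fingertips, with a small debounce.
class ContactGestureRecognizer:
    def __init__(self, required_samples: int = 3):
        self.required = required_samples   # consecutive samples needed to switch state
        self.count = 0
        self.in_contact = False

    def update(self, electrodes_conducting: bool) -> bool:
        """Feed one sensor sample; returns True while the pinch gesture is held."""
        if electrodes_conducting != self.in_contact:
            self.count += 1
            if self.count >= self.required:
                self.in_contact = electrodes_conducting
                self.count = 0
        else:
            self.count = 0
        return self.in_contact

# Example: a noisy stream of conduction readings from the fingertip electrodes.
recognizer = ContactGestureRecognizer()
for sample in [False, True, True, True, False, True, True, True, True]:
    print(recognizer.update(sample))
```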
- FIG. 10 shows another configuration example of the hand position detection unit 111, the finger posture recognition unit 112, and the finger gesture recognition unit 113.
- the hand position detection unit 111 detects the position of the user's hand by combining the IR detection method and the IMU detection method.
- In the IR detection method, IR reflection signals from a plurality of IR reflection markers 1001, 1002, 1003, ... attached to the housing 10 of the controller 110 are captured by an IR camera (not shown) provided on the AR glass (or the head sensor unit 120), and the position and orientation of the user's hand are detected based on the bright spot position of each IR reflection marker.
- In the IMU detection method, the position and orientation of the user's hand are detected based on the detection signals of an IMU (Inertial Measurement Unit) built into the main body 10 of the controller 110.
- the IMU includes a gyro sensor, an acceleration sensor, and a directional sensor.
- While the controller 110 is within the field of view of the IR camera, the IR detection method is used; when the controller 110 is out of the field of view of the IR camera (including cases where occlusion occurs), the IMU detection method is used.
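- The switch between the IR detection method and the IMU detection method could look like the following Python sketch; it is only an assumption-laden illustration (function names, the dead-reckoning model, and the gravity-compensated acceleration input are not from the patent) of using the optical pose while the markers are visible and bridging gaps with IMU dead reckoning otherwise.

```python
# Minimal sketch (illustrative): optical hand position while IR markers are visible,
# IMU dead reckoning as a bridge while they are not. Drift makes the IMU branch a
# short-term fallback only.
import numpy as np

def estimate_hand_position(ir_position, last_position, last_velocity,
                           imu_acceleration, dt):
    """ir_position is None when the controller is outside the IR camera's field of
    view or occluded; imu_acceleration is gravity-compensated, in the world frame."""
    if ir_position is not None:
        velocity = (ir_position - last_position) / dt
        return ir_position, velocity
    # Fall back to IMU dead reckoning until the markers are seen again.
    velocity = last_velocity + imu_acceleration * dt
    position = last_position + velocity * dt
    return position, velocity

pos = np.array([0.0, 0.0, 0.3]); vel = np.zeros(3)
for ir in [np.array([0.0, 0.0, 0.3]), None, None, np.array([0.02, 0.0, 0.31])]:
    pos, vel = estimate_hand_position(ir, pos, vel, np.array([0.0, 0.0, 0.5]), 0.01)
    print(pos)
```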
- the finger posture recognition unit 112 is composed of IMUs attached to several places of the user's fingers.
- IMUs 1011, 1012, and 1013 are attached to the thumb and to the proximal phalanx and middle phalanx of the index finger by bands 1021, 1022, and 1023, respectively.
- The posture of the thumb and the postures of the proximal phalanx and middle phalanx of the index finger (or the angle of the second joint of the index finger) can be measured based on the detection signals of the IMUs 1011, 1012, and 1013.
- another IMU may be attached to another place of the thumb and the index finger, or the IMU may be attached to a finger other than the thumb and the index finger.
- the method of fixing the IMU to each finger is not limited to the band.
- The finger gesture recognition unit 113 recognizes finger gestures, for example, whether the fingertips of the thumb and another finger (such as the index finger) are in contact with each other or separated, in addition to the finger joint angles recognized by the finger posture recognition unit 112.
- The finger gesture recognition unit 113 uses a capacitance detection method.
- For example, electrodes for detecting capacitance are installed at the tip and middle phalanx of each finger and on the palm, or at the middle phalanx of the thumb and the middle phalanx of the index finger.
- The finger gesture recognition unit 113 can recognize the gesture of the thumb and the index finger from the change in capacitance between the fingertips of the thumb and the index finger.
- FIG. 11 shows still another configuration example of the hand position detection unit 111, the finger posture recognition unit 112, and the finger gesture recognition unit 113.
- Since the hand position detection unit 111 and the finger gesture recognition unit 113 are configured in the same manner as in FIG. 10, their illustration and detailed description will be omitted here, and the finger posture recognition unit 112 will be described.
- the finger posture recognition unit 112 is composed of a ToF camera 1101 installed on the palm of the hand using the belt 11.
- By installing the ToF camera 1101 at a wide angle, for example, near the wrist, all five fingers can be captured.
- The bones of each finger can be recognized based on the depth image from the ToF camera 1101 to acquire the posture of the fingers.
- Bone recognition may also be used to recognize finger gestures such as contact between the fingertips of the thumb and index finger, but in order to further improve the detection accuracy, it is preferable to recognize the finger gestures using the capacitance detection method as described above.
- Further, a capacitive contact sensor may be placed near the center of the palm using the belt 11 and used to recognize a gesture in which the index finger, middle finger, ring finger, and little finger approach or touch the palm, that is, a gesture of gripping with these four fingers.
- The configurations of the hand position detection unit 111, the finger posture recognition unit 112, and the finger gesture recognition unit 113 equipped on the controller 110 are not necessarily limited to the above. As long as the position of the back of the hand can be detected with higher accuracy than the position of the fingers, a configuration other than the above can be applied to the controller 110; for example, the controller 110 alone may estimate its own position by SLAM (Simultaneous Localization and Mapping), or the position of the hand may be detected with high accuracy and robustness by finger recognition using a camera (RGB stereo camera, ToF camera, etc.) of the head sensor unit 120.
- Further, in any of the configurations shown in FIGS. 9 to 11, the controller 110 may be equipped with a speaker for outputting sound, and an LED (Light Emitting Diode) or a display for presenting the state of, and information about, the controller.
- Basic operation of the AR glass FIG. 12 schematically shows an example of the functional configuration included in the control unit 140.
- The control unit 140 includes an application execution unit 1201, a head position / posture detection unit 1202, an output control unit 1203, a hand position acquisition unit 1204, a finger posture acquisition unit 1205, and a finger gesture acquisition unit 1206.
- These functional modules are realized by executing various programs read from the storage unit 150 by the control unit 140.
- FIG. 12 shows only the minimum necessary functional modules for realizing the present disclosure, and the control unit 140 may further include other functional modules.
- the application execution unit 1201 executes the application program including the AR application under the execution environment provided by the OS.
- the application execution unit 1201 may execute a plurality of application programs in parallel at the same time.
- The AR application is, for example, an application such as video playback or a 3D object viewer, and it superimposes virtual objects in the field of view of the user wearing the AR glass (see FIG. 6) on the head, emphasizes or attenuates a specific real object, or removes a specific real object so that it appears as if it does not exist.
- the application execution unit 1201 also controls the display operation of the AR application (virtual object) by using the display unit 131.
- the application execution unit 1201 also controls the user's gripping interaction with the virtual object based on the finger operation acquired through the controller 110. The details of the gripping operation of the virtual object will be described later.
- FIG. 13 schematically shows how a plurality of virtual objects 1301, 1302, 1303, ... are arranged around a user 1300 wearing the AR glass on his or her head.
- The application execution unit 1201 places each of the virtual objects 1301, 1302, 1303, ... around the user with reference to the position of the user's head or the position of the center of gravity of the body estimated based on the sensor information from the head sensor unit 120.
- The head position / posture detection unit 1202 detects the position and posture of the user's head based on the sensor signals of the gyro sensor 124, the acceleration sensor 125, and the orientation sensor 126 included in the head sensor unit 120 mounted on the AR glass, and also recognizes the user's line-of-sight direction or visual field range.
- the output control unit 1203 controls the output of the display unit 131, the speaker 132, and the tactile feedback unit 114 based on the execution result of the application program such as the AR application by the application execution unit 1201.
- The output control unit 1203 specifies the user's visual field range based on the detection result of the head position / posture detection unit 1202, and controls the display operation of virtual objects by the display unit 131 so as to follow the movement of the user's head, so that virtual objects arranged in the visual field range can be observed by the user through the AR glass.
- A mechanism for displaying a virtual object on the AR glass so as to follow the movement of the user's head will be described with reference to FIG. 14.
- As shown in FIG. 14, the depth direction of the user's line of sight is defined as the zw axis, the horizontal direction as the yw axis, and the vertical direction as the xw axis, and the origin position of the user's reference axes xw, yw, zw is the user's viewpoint position.
- Roll θz corresponds to the movement of the user's head around the zw axis, tilt θy corresponds to the movement of the user's head around the yw axis, and pan θx corresponds to the movement of the user's head around the xw axis.
- The head position / posture detection unit 1202 detects the movement of the user's head in each of the roll, tilt, and pan directions (θz, θy, θx), as well as the translation of the head, based on the sensor signals of the gyro sensor 124, the acceleration sensor 125, and the orientation sensor 126.
- the output control unit 1203 moves the display angle of view of the display unit 131 in the real space (for example, see FIG. 13) in which the virtual object is arranged so as to follow the posture of the user's head.
- the image of the virtual object existing at the display angle of view is displayed on the display unit 131.
- For example, the display angle of view is moved to a region 1402-1 rotated according to the roll component of the user's head movement, to a region 1402-2 moved according to the tilt component of the user's head movement, or to a region 1402-3 moved according to the pan component of the user's head movement, so as to cancel the movement of the user's head. Since a virtual object arranged at the display angle of view moved according to the position and posture of the user's head is therefore displayed on the display unit 131, the user can observe, through the AR glass, the real space on which the virtual object is superimposed.
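- The following Python sketch illustrates one way the display angle of view could be counter-rotated against the detected roll, tilt, and pan so that a world-anchored virtual object appears fixed in real space; the rotation order and function names are assumptions, not the patent's implementation.

```python
# Minimal sketch (illustrative): cancelling the head movement (roll θz, tilt θy,
# pan θx) so that a virtual object placed in the world frame stays fixed in space.
import numpy as np

def rot_x(a): c, s = np.cos(a), np.sin(a); return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
def rot_y(a): c, s = np.cos(a), np.sin(a); return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
def rot_z(a): c, s = np.cos(a), np.sin(a); return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def world_to_view(point_world, head_pos, pan_x, tilt_y, roll_z):
    """Transform a world-frame point into the display (view) frame: translate by the
    head position, then apply the inverse of the head rotation so that the rendered
    scene counter-rotates against the head movement."""
    head_rot = rot_z(roll_z) @ rot_y(tilt_y) @ rot_x(pan_x)
    return head_rot.T @ (point_world - head_pos)

# A virtual object 1 m in front of the initial viewpoint; the head pans slightly,
# and the view-frame coordinate shifts accordingly so it appears anchored in space.
obj = np.array([0.0, 0.0, 1.0])
print(world_to_view(obj, head_pos=np.zeros(3), pan_x=0.1, tilt_y=0.0, roll_z=0.0))
```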
- The hand position acquisition unit 1204, the finger posture acquisition unit 1205, and the finger gesture acquisition unit 1206 cooperate with the hand position detection unit 111, the finger posture recognition unit 112, and the finger gesture recognition unit 113 on the controller 110 side, respectively, to acquire information on the position of the user's hand, the posture of the fingers, and the gestures performed by the fingers.
- Information on the position of the user's hand, the posture of the fingers, and the gestures performed by the fingers may also be obtained based on the image recognition result of the image captured by the outward-facing camera 121.
- In real space, an object can be held by a method such as pinching or grasping, and the shape of the object changes due to the force applied by the pinching or grasping hand.
- In the case of a virtual object, however, the hand slips through the object, so the object cannot be held in the same manner as in real space.
- As a UI (user interface), a method in which a finger is thrust into an object in the virtual space and the object is picked with the fingertips, or in which a frame provided on the outer periphery of the object is picked, is also conceivable.
- the present disclosure provides a method in which a user grips and operates a virtual object using the controller 110.
- "Using the controller 110" means that the controller 110 uses the hand placed on the back of the hand. Therefore, the control unit 140 (or the application execution unit 1201 that controls the display of the virtual object) constantly acquires the position of the hand that grips and operates the virtual object, the gesture of the finger, and the position and posture of the finger through the controller 110. be able to.
- In the following, illustration of the controller 110 installed on the back of the hand that performs the gripping operation of the virtual object is omitted, and the description of the processing of acquiring the position of the hand, the gestures of the fingers, and the position and posture of the fingers using the controller 110 is also omitted.
- FIG. 15 shows three states of “approach”, “contact”, and “entry”.
- “Approach” is a state in which the shortest distance between the user's hand and the virtual object is equal to or less than a predetermined value.
- “Contact” is a state in which the shortest distance between the user's hand and the virtual object is zero.
- “Entry” is a state in which the user's hand interferes with the region of the virtual object.
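- A minimal Python sketch of this classification is shown below; the spherical object shape, the approach threshold, and all names are illustrative assumptions rather than values from the patent.

```python
# Minimal sketch (illustrative): classifying the hand/virtual-object relationship
# into "approach", "contact", and "entry" from the shortest distance, as in FIG. 15.
import numpy as np

APPROACH_THRESHOLD = 0.10  # metres; "approach" when closer than this (assumption)

def classify(hand_position, object_center, object_radius, eps=1e-6):
    signed_distance = np.linalg.norm(hand_position - object_center) - object_radius
    if signed_distance < -eps:
        return "entry"      # the hand interferes with the region of the virtual object
    if signed_distance <= eps:
        return "contact"    # shortest distance is (approximately) zero
    if signed_distance <= APPROACH_THRESHOLD:
        return "approach"   # shortest distance is below the predetermined value
    return "far"

center = np.array([0.0, 1.2, 0.5])
for hand in [np.array([0.0, 1.2, 0.8]), np.array([0.0, 1.2, 0.62]), np.array([0.0, 1.2, 0.55])]:
    print(classify(hand, center, object_radius=0.05))
```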
- FIG. 17 shows how the grip of the virtual object is executed when the position of the fingertip is on the surface of the virtual object 1701 by the first logic.
- Since the virtual object 1701 does not actually exist, even if the user's fingertips touch the virtual object 1701, no reaction force is obtained and the fingertips are not restrained. Because the user's fingertips slip through the virtual object 1701, the user cannot obtain a realistic feeling of touch.
- FIG. 18 shows a state in which the virtual object is grasped when the thumb and the index finger come into contact with each other inside the virtual object 1801 by the second logic.
- In this case, the contact between the fingertips can be felt through the user's own tactile sensation. If the user recognizes the state in which the movement of the fingertips is restrained by this contact as the state in which the virtual object 1801 is gripped, the user can also easily recognize the change from the gripped state to the state in which the virtual object 1801 is released.
- the second gripping logic is superior to the first gripping operation logic in terms of judgment and reality of gripping a virtual object.
- the present disclosure presupposes a second gripping logic. Further, according to the second gripping logic, it is possible to realize the gripping operation of the virtual object even if the controller 110 is not equipped with the finger posture recognition unit 112.
- the "virtual gripping point" is set.
- the virtual gripping point is a position where the fingertips used for gripping are expected to come into contact with each other when performing the gripping operation at the current hand position.
- the virtual grip point is set at a position having a certain offset with respect to the user's hand.
- the position of the user's hand can be detected by the hand position detecting unit 111 of the controller 110 installed on the back of the hand.
- the hand position detecting unit 111 detects the position of the controller 110 main body which is substantially equal to the position of the back of the hand.
- the virtual gripping point may be a position having a constant offset with respect to the controller 110 main body instead of the user's hand.
- FIG. 19 shows a state in which the virtual gripping point 1901 is set at a position having a certain offset from the position coordinates of the controller 110 main body.
- For example, as shown in FIG. 20, the virtual gripping point is set to the position 2001 where the fingertips of the thumb and the index finger are expected to come into contact with each other at the hand position detected using the controller 110, in other words, the position at which an object would be picked using the thumb and the index finger.
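- For illustration, the following Python sketch (the offset value and names are assumptions, e.g. obtained by calibration, not values from the patent) computes such a virtual gripping point as a fixed offset expressed in the controller's local frame and transformed by the tracked controller pose.

```python
# Minimal sketch (illustrative): the virtual gripping point is a fixed offset in the
# controller's local frame (the controller sits on the back of the hand), transformed
# into the world frame with the tracked controller pose.
import numpy as np

GRIP_OFFSET_LOCAL = np.array([0.02, -0.03, 0.09])  # towards where thumb/index fingertips meet

def virtual_grip_point(controller_position, controller_rotation):
    """controller_rotation is a 3x3 rotation matrix of the controller body in the
    world frame; the returned point is where a pinch is expected to occur."""
    return controller_position + controller_rotation @ GRIP_OFFSET_LOCAL

print(virtual_grip_point(np.array([0.1, 1.1, 0.4]), np.eye(3)))
```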
- In the present disclosure, the gripping interaction with a virtual object is performed based on the positional relationship between the virtual gripping point, which is set at a position having a certain offset with respect to the user's hand, and the virtual object displayed by the AR glass, and on the gesture of the user's fingers.
- the gesture of the user's finger is specifically a gripping operation on a virtual object.
- the gesture of the user's finger can be acquired by the finger gesture recognition unit 113 in the controller 110 installed on the back of the hand, as described above.
- With hand recognition based on camera images, accurate information may not be obtained due to problems of occlusion and accuracy.
- With the controller 110 installed on the back of the hand, on the other hand, the position of the hand and the gestures of the fingers can be detected and recognized accurately. Therefore, according to the present disclosure, the positional relationship between the virtual gripping point and the virtual object can be detected accurately, and the gripping operation by the fingertips can be recognized accurately, so that an intuitive gripping interaction with the virtual object that does not cause discomfort can be realized.
- FIGS. 21 to 26 show the gripping flow of virtual objects using virtual gripping points in order.
- the desk may be either a real object or a virtual object.
- The virtual object gripping flow shown in FIGS. 21 to 26 is performed by the application execution unit 1201 based on the information on the user's hand position and finger gestures acquired through the controller 110, and the display of the virtual object is switched following the user's gripping operation of the virtual object.
- a virtual gripping point is set at a position having a certain offset with respect to the user's hand.
- the virtual gripping point is set at a position where the fingertips of the thumb and the index finger come into contact with each other when the user's hand is in the current position, that is, a position where the thumb and the index finger are used to pinch the object.
- a marker indicating the virtual gripping point may be displayed on the AR glass so that the user can visually understand the set virtual gripping point. The user can observe the virtual object and the virtual gripping point set in the hand trying to grip the virtual object through the AR glass.
- Further, the virtual gripping point may be set, or the virtual gripping point may be displayed, only when the user's hand approaches the virtual object to be gripped to within a predetermined value. The predetermined value may be, for example, 50 cm.
- When the user's hand approaches the virtual object further, the virtual gripping point enters the grip detection area as shown in FIG. 22.
- Here, the grip detection area is the same region as the virtual object.
- the display of the virtual object is changed at the timing when the virtual grip point enters the grip detection area.
- the method of changing the display of the virtual object at the timing when the virtual grip point enters the grip detection area is arbitrary. In the example shown in FIG. 22, an aura is generated around the virtual object.
- In addition to, or instead of, displaying the aura, a notification sound may be emitted from the speaker 132, or tactile feedback may be returned to the user's hand using the tactile feedback unit 114.
- the user can observe the virtual object and the aura around it through the AR glass.
- the user can recognize that the virtual object can be grasped by performing the grasping operation as it is because the aura is generated in the virtual object to be grasped.
- Operations such as aura display, notification sound, and tactile feedback at the timing when the virtual grip point enters the grip detection area are similar to mouse over when the mouse pointer is placed on the object displayed on the OS screen.
- When the user releases the gripping operation of the virtual object by separating the fingertips of the thumb and the index finger, the virtual object can move freely without being restrained by the relative position / posture relationship with the user's hand. For example, when the user releases the gripping operation of the virtual object in the state shown in FIG. 25, the virtual object falls due to gravity as shown in FIG. 26. In addition, when the gripping operation is released, the aura and highlight display of the virtual object are also removed.
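- The gripping flow of FIGS. 21 to 26 can be summarized in the following Python sketch; it is an illustrative assumption (class and variable names, the spherical grip detection area) rather than the patent's implementation, showing the aura-like highlight while the virtual gripping point is inside the grip detection area, gripping on fingertip contact, keeping the object's pose relative to the hand while gripped, and releasing it when the pinch is released.

```python
# Minimal sketch (illustrative) of the gripping flow: highlight on entry of the
# virtual gripping point, grip on pinch, follow the hand while gripped, release.
import numpy as np

class GrippableObject:
    def __init__(self, position, radius):
        self.position = np.asarray(position, dtype=float)
        self.radius = radius        # grip detection area (here the same as the object)
        self.highlighted = False    # aura-style highlight while the grip point is inside
        self.grip_offset = None     # object position relative to the hand while gripped

    def update(self, grip_point, hand_position, pinching):
        inside = np.linalg.norm(grip_point - self.position) <= self.radius
        if self.grip_offset is None:
            # Not gripped: "mouse over"-like feedback; a real implementation would also
            # hand the object to a physics engine here (e.g. to fall as in FIG. 26).
            self.highlighted = inside
            if inside and pinching:
                self.grip_offset = self.position - hand_position   # start gripping
        elif pinching:
            self.position = hand_position + self.grip_offset       # follow the hand
        else:
            self.grip_offset = None                                # pinch released
            self.highlighted = False

obj = GrippableObject([0.0, 1.0, 0.5], radius=0.05)
obj.update(np.array([0.0, 1.0, 0.49]), np.array([0.0, 1.0, 0.45]), pinching=True)
obj.update(np.array([0.1, 1.1, 0.49]), np.array([0.1, 1.1, 0.45]), pinching=True)
print(obj.position)   # the object has followed the hand while the pinch was held
```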
- In FIGS. 19 to 26, the position where the fingertips used for gripping are expected to come into contact has been described as a point having no region, that is, as a “virtual gripping point”.
- However, the position where the fingertips used for gripping are expected to come into contact with each other may be a sphere having a volume, and in this case it may be called a “virtual gripping region” instead of a “virtual gripping point”.
- The application execution unit 1201 can switch the operation mode based on the posture of the user's fingers acquired through the controller 110. Switching the operation mode is premised on the controller 110 being equipped with the finger posture recognition unit 112.
- FIG. 27 shows a mode transition diagram of the AR system 100 having a two-mode configuration of a grip operation mode and a contact operation mode.
- the grip operation mode is an operation mode in which the user is trying to grip a virtual object.
- the application execution unit 1201 can switch the AR system 100 to the gripping operation mode based on the postures of the user's thumb and index finger acquired through the controller 110.
- the application execution unit 1201 performs the following UI operation using the AR glass as described in the above section D.
- (1) A virtual gripping point is set at a position having a certain offset with respect to the user's hand (see FIG. 21). The position of the virtual gripping point may be displayed on the AR glass.
- (2) When the virtual gripping point enters the grip detection area of the virtual object, the display of the virtual object is switched to notify the user of the intrusion timing (see FIG. 22).
- (3) When the fingertips used for gripping come into contact with each other while the virtual gripping point is inside the grip detection area, the gripping process is performed, and the display of the virtual object is switched to notify the user that the virtual object is in the gripped state (see FIG. 23).
- (4) While the virtual object is gripped, the relative position-posture relationship between the user's hand and the virtual object is maintained, and the position and orientation of the virtual object change according to the movement of the user's hand (see FIGS. 24 and 25).
- (5) When the user releases the gripping operation, the constraint on the relative positional relationship between the virtual object and the user's hand is released, and the virtual object can move freely again (see FIG. 26). A minimal code sketch of this flow (1) to (5) follows below.
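- The following is a minimal, self-contained sketch of the flow (1) to (5) above under simplifying assumptions (a spherical grip detection area, a rotation-matrix hand pose, simple gravity); the class GripInteraction and all thresholds are hypothetical and not part of the embodiment.

```python
import numpy as np

class GripInteraction:
    """Sketch of the grip flow: offset grip point, hover, grip, move with the hand,
    release with gravity. Names and thresholds are assumptions."""

    def __init__(self, grip_offset, detection_radius=0.05):
        self.grip_offset = np.asarray(grip_offset, dtype=float)  # fixed offset from the hand (controller) pose
        self.detection_radius = detection_radius
        self.holding = False
        self.rel_pos = None  # object position relative to the hand while gripped

    def grip_point(self, hand_pos, hand_rot):
        # (1) virtual gripping point = hand position + rotated fixed offset
        return hand_pos + hand_rot @ self.grip_offset

    def update(self, hand_pos, hand_rot, fingertips_touching, obj_pos, dt, velocity):
        p = self.grip_point(hand_pos, hand_rot)
        near = np.linalg.norm(p - obj_pos) <= self.detection_radius  # (2) inside detection area

        if not self.holding and near and fingertips_touching:
            # (3) start gripping: freeze the object's pose relative to the hand
            self.holding = True
            self.rel_pos = np.linalg.inv(hand_rot) @ (obj_pos - hand_pos)
            velocity[:] = 0.0
        elif self.holding and not fingertips_touching:
            # (5) release: remove the constraint; the object falls under gravity
            self.holding = False

        if self.holding:
            # (4) keep the relative position while the hand moves
            obj_pos = hand_pos + hand_rot @ self.rel_pos
        else:
            velocity[1] -= 9.8 * dt          # simple gravity
            obj_pos = obj_pos + velocity * dt
        return obj_pos, near

# usage example (identity rotation, pinch already closed)
R = np.eye(3)
g = GripInteraction(grip_offset=[0.0, 0.0, 0.12])
pos, vel = np.array([0.0, 1.0, 0.62]), np.zeros(3)
pos, near = g.update(np.array([0.0, 1.0, 0.5]), R, True, pos, 1/60, vel)
```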
- the contact operation mode is an operation mode in which the user does not try to grasp the virtual object.
- In the contact operation mode, since the user does not try to grasp the virtual object with the thumb and index finger, the fingertips of the thumb and index finger are separated as shown in FIG. 29, for example.
- the application execution unit 1201 can switch the AR system 100 to the contact operation mode based on the postures of the user's thumb and index finger acquired through the controller 110.
- the application execution unit 1201 sets, for example, a contact determination element called a collider on the fingers, palm, and back of the user's hand.
- the collider may be displayed in AR glasses.
- The behavior resulting from contact between the hand and the virtual object includes the virtual object being moved, repelled, or received (supported) by the contact, realizing the same behavior as contact in real space.
- In the contact operation mode, the virtual gripping point and the grip detection area are not set.
- FIG. 30 shows, as an example of the behavior caused by contact between the hand and a virtual object, a state in which the user is pressing the virtual object 3001 with a fingertip. Since the virtual object 3001 does not actually exist, the fingertip would simply slip through it. However, because the application execution unit 1201 sets a collider for contact determination on the fingertip, the behavior in which the virtual object 3001 is moved by receiving a pushing force from the user's fingertip can be realized. As another example, FIG. 31 shows a state in which the virtual object 3101 is placed on the palm. Since the virtual object 3101 does not actually exist, it would slip through the palm and fall. However, because the application execution unit 1201 sets a collider for contact determination on the palm, the behavior in which the virtual object 3101 receives a reaction force from the palm and stays on it without falling can be realized. A minimal sketch of such collider-based behavior follows below.
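- The following sketch illustrates, under the assumption of spherical colliders and a spherical virtual object, how a collider on the fingertip or palm can produce the pushing and supporting behaviors described above; the function names and the crude contact response are illustrative only.

```python
import numpy as np

def resolve_sphere_collider(collider_center, collider_radius, obj_pos, obj_radius):
    """If a hand collider (sphere) penetrates a spherical virtual object, push the
    object out along the contact normal. This yields both the "pushed by the
    fingertip" and the "resting on the palm" behaviors."""
    collider_center = np.asarray(collider_center, float)
    obj_pos = np.asarray(obj_pos, float)
    delta = obj_pos - collider_center
    dist = np.linalg.norm(delta)
    min_dist = collider_radius + obj_radius
    if dist >= min_dist or dist == 0.0:
        return obj_pos, False                               # no contact
    normal = delta / dist                                   # contact normal from collider to object
    return collider_center + normal * min_dist, True        # depenetrated position

def simulate_on_palm(palm_collider, palm_radius, obj_pos, obj_radius, velocity, dt):
    """Gravity pulls the object down; the palm collider pushes back (reaction force),
    so the object stays on the palm instead of slipping through."""
    velocity = np.asarray(velocity, float)
    velocity[1] -= 9.8 * dt
    obj_pos = np.asarray(obj_pos, float) + velocity * dt
    obj_pos, touching = resolve_sphere_collider(palm_collider, palm_radius, obj_pos, obj_radius)
    if touching:
        velocity[:] = 0.0                                   # crude contact response
    return obj_pos, velocity

# usage: the object settles on the palm instead of falling through it
pos, vel = np.array([0.0, 1.05, 0.0]), np.zeros(3)
for _ in range(120):   # about 2 s at 60 Hz
    pos, vel = simulate_on_palm([0.0, 1.0, 0.0], 0.04, pos, 0.03, vel, 1/60)
```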
- As shown in FIG. 27, the gripping operation mode and the contact operation mode are clearly separated for interaction by the user's fingers.
- When switching to the contact operation mode, the virtual gripping point used in the gripping operation mode is deactivated, and the collider for contact determination in the contact operation mode is activated instead.
- the position offset of the virtual gripping point can be used as the position of the collider for determining the contact of the fingertip.
- Local posture information of the virtual fingertip position is also important.
- various operation modes in which the user manually operates the virtual object can be further defined. Then, the operation mode can be switched based on the posture of the fingers recognized through the controller 110.
- FIG. 33 shows a mode transition diagram of the AR system 100 having a three-mode configuration in which a button operation mode is added to the grip operation mode and the contact operation mode.
- the application execution unit 1201 can switch the AR system 100 to the button operation mode when the posture of sticking out the index finger is acquired through the controller 110.
- the application execution unit 1201 performs the following UI operations using the AR glass.
- a virtual pressing point is set at the fingertip position of the index finger (see FIG. 34), and the virtual button can be operated.
- the position of the virtual pressing point may be displayed on the AR glass.
- the application execution unit 1201 can recognize that the index finger touches the virtual button under certain conditions through the controller 110. Then, the application execution unit 1201 activates the process assigned to the virtual button based on the recognition result.
- A certain condition imposed on the contact operation of the virtual button is, for example, that the virtual pressing point touches the virtual button from a direction whose inclination from the normal direction of the virtual button falls within an error of ±45 degrees.
- Under this condition, erroneous operations such as the virtual button being grasped as in the grip operation mode or being moved as in the contact operation mode are unlikely to occur, and the virtual button can be operated only as the UI element that it is, namely a button. In other words, by imposing this condition, the finger gesture of pressing the virtual button in the button operation mode can be accurately distinguished from the finger gesture of picking a virtual object in the grip operation mode and from the finger gesture of touching a virtual object in the contact operation mode. In short, the condition is set so that the button operation mode is not confused with an unintended button operation or with an operation of moving the virtual button itself (a minimal sketch of such a press check follows below).
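- As an illustration only, the following sketch checks such a press condition by comparing the fingertip's approach direction with the button normal; treating the approach direction as the motion of the virtual pressing point, the ±45 degree tolerance, and the names used are assumptions rather than the embodiment's exact criterion.

```python
import numpy as np

def button_pressed(press_point_prev, press_point_now, button_center, button_normal,
                   button_radius=0.02, max_angle_deg=45.0):
    """Activate the virtual button only when the virtual pressing point touches it
    from a direction within +/-45 degrees of the button's normal."""
    button_normal = np.asarray(button_normal, float)
    button_normal = button_normal / np.linalg.norm(button_normal)
    now = np.asarray(press_point_now, float)
    prev = np.asarray(press_point_prev, float)

    touching = np.linalg.norm(now - np.asarray(button_center, float)) <= button_radius
    if not touching:
        return False

    approach = now - prev                       # fingertip motion direction
    if np.linalg.norm(approach) == 0.0:
        return False
    approach = approach / np.linalg.norm(approach)

    # angle between the approach direction and the inward normal (pressing "into" the button)
    cos_angle = np.dot(-approach, button_normal)
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle <= max_angle_deg

# usage example: a straight press along the normal is accepted
print(button_pressed([0.0, 0.0, 0.10], [0.0, 0.0, 0.015],
                     [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))   # True
```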
- the operation mode of interaction with a virtual object can be determined according to the posture of the finger.
- the operation mode can be determined by using the information on the degree of opening of the fingers and the distance between the fingertips.
- the specific method for determining the operation mode is as follows.
- When the distance between the fingertips of the thumb and the index finger is equal to or less than a predetermined value and the deviation of the directions of the two fingertips is within a certain value, the gripping operation mode is determined.
- When the proximal phalanx of the index finger is opened by a certain angle or more and the distance between the palm and the fingertip positions of the middle finger, ring finger, and little finger is within a certain distance, the button operation mode is determined.
- In all other cases, the contact operation mode is determined.
- FIG. 36 shows a processing procedure for determining the operation mode of the user in the form of a flowchart. This processing procedure is executed by the application execution unit 1201 based on the posture of the fingers obtained through the controller 110 installed on the back of the user's hand.
- The application execution unit 1201 first checks whether the distance between the fingertips of the thumb and the index finger is equal to or less than the predetermined value d (step S3601). If so (Yes in step S3601), it further checks whether the deviation of the directions of the tips (end nodes) of both fingers is within a certain value (step S3602).
- When the distance between the fingertips of the thumb and the index finger is equal to or less than the predetermined value d (Yes in step S3601) and the deviation of the directions of the tips (end nodes) of both fingers is within a certain value (Yes in step S3602), it is determined that the gripping operation mode is set.
- When the application execution unit 1201 determines that the gripping operation mode is set, it sets a virtual gripping point at a position having a certain offset with respect to the controller 110 main body. The virtual gripping point may then be presented to the user using the AR glass.
- Otherwise, the application execution unit 1201 further checks whether the proximal phalanx of the index finger is opened by a certain angle or more (step S3603). When the proximal phalanx of the index finger is opened by a certain angle or more (Yes in step S3603), it then checks whether the distance between the palm and the fingertip positions of the middle finger, ring finger, and little finger is within a certain distance (step S3604).
- When the proximal phalanx of the index finger is opened by a certain angle or more (Yes in step S3603) and the distance between the palm and the fingertip positions of the middle finger, ring finger, and little finger is within a certain distance (Yes in step S3604), the application execution unit 1201 determines that the button operation mode is set.
- When the application execution unit 1201 determines that the button operation mode is set, it sets a virtual pressing point at the fingertip position of the index finger. The virtual pressing point may then be presented to the user using the AR glass.
- In all other cases, the application execution unit 1201 determines that the contact operation mode is set.
- When the application execution unit 1201 determines that the contact operation mode is set, it sets colliders on the fingers, palm, and back of the user's hand. The colliders may then be presented to the user using the AR glass. A minimal code sketch of this determination procedure follows below.
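- The following sketch mirrors the determination procedure of steps S3601 to S3604; the concrete threshold values (d, angles, distances) and the data structure FingerPosture are assumptions, since the embodiment does not specify them.

```python
from dataclasses import dataclass
from enum import Enum, auto

class OperationMode(Enum):
    GRIP = auto()
    BUTTON = auto()
    CONTACT = auto()

@dataclass
class FingerPosture:
    thumb_index_distance: float        # distance between thumb and index fingertips [m]
    tip_direction_deviation: float     # deviation between the directions of both fingertips [deg]
    index_proximal_angle: float        # opening angle of the index proximal phalanx [deg]
    other_tips_to_palm: float          # max distance of middle/ring/little fingertips to the palm [m]

def determine_mode(p: FingerPosture,
                   d: float = 0.03,                # predetermined value d (assumed)
                   max_tip_deviation: float = 30.0,
                   min_index_open_angle: float = 40.0,
                   max_curl_distance: float = 0.05) -> OperationMode:
    """Mode determination following steps S3601 to S3604 (all thresholds are assumptions)."""
    # S3601 / S3602: pinch-like posture -> gripping operation mode
    if p.thumb_index_distance <= d and p.tip_direction_deviation <= max_tip_deviation:
        return OperationMode.GRIP
    # S3603 / S3604: index finger stuck out, other fingers curled -> button operation mode
    if p.index_proximal_angle >= min_index_open_angle and p.other_tips_to_palm <= max_curl_distance:
        return OperationMode.BUTTON
    # otherwise -> contact operation mode
    return OperationMode.CONTACT

# usage example
print(determine_mode(FingerPosture(0.02, 10.0, 5.0, 0.02)))   # -> OperationMode.GRIP
```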
- In the above determination, the distance between the fingertips of the thumb and the index finger being equal to or less than the predetermined value d is an indispensable condition for the gripping operation mode.
- However, when the user tries to grip a large virtual object, the distance between the fingertips may exceed the predetermined value d because of the size of the virtual object, resulting in a finger posture close to that of the contact operation mode.
- In such a case, the application execution unit 1201 may determine an operation mode different from the user's intention based on the posture of the fingers obtained through the controller 110.
- the user may directly instruct the AR system 100 of an intended operation mode such as a "grasping operation mode" by utterance.
- Based on the utterance content input from the microphone 123, the application execution unit 1201 may switch from the operation mode it once determined to the operation mode directly instructed by the user.
- the user may directly instruct the switching of the operation mode by utterance.
- any one of the plurality of operation modes is selectively determined.
- the mode transition diagrams shown in FIGS. 27 and 33 are based on the premise that the AR system 100 sets any one of the operation modes. As a modified example, it is conceivable to control a virtual object by coexisting a plurality of operation modes.
- For example, when a virtual object moves and approaches the user's hand, colliders are set on the fingers, palm, and back of the user's hand as in the contact operation mode, and the virtual object can be placed on the palm based on the positional relationship between the virtual object and the user's hand.
- The behavior when an operation is performed by the user's hand may be set individually for each virtual object, and the interaction between the user's hand and the virtual object may be realized accordingly.
- The behavior for user operations may be set for each virtual object according to its shape, size, category, and the like. For example, a small and light virtual object can be gripped and touched by the user, whereas a large and heavy virtual object such as a desk can be neither gripped nor touched (in other words, neither the gripping operation nor the contact operation can be performed on it). Different behaviors for user operations may also be set for the same virtual object for each operation mode (see the configuration sketch below).
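- A minimal sketch of such per-object behavior settings is given below; the category names and the ObjectBehavior fields are purely illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ObjectBehavior:
    grippable: bool = True     # whether the gripping operation is allowed
    touchable: bool = True     # whether the contact operation is allowed

# behaviors keyed by object category; the values here are illustrative only
BEHAVIOR_BY_CATEGORY = {
    "small_item": ObjectBehavior(grippable=True, touchable=True),
    "furniture":  ObjectBehavior(grippable=False, touchable=False),  # e.g. a desk
}

def behavior_for(category: str) -> ObjectBehavior:
    # unknown categories default to being both grippable and touchable
    return BEHAVIOR_BY_CATEGORY.get(category, ObjectBehavior())

assert behavior_for("furniture").grippable is False
```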
- E-7 Coexistence of a plurality of operation modes
- A situation can occur in which data such as the distance between the fingertips is near the threshold value and it is difficult to determine the mode.
- In such a case, two or more operation modes that are difficult to distinguish may be allowed to coexist at the same time.
- the application execution unit 1201 sets the virtual gripping point at a position having a certain offset with respect to the back of the hand, and sets the virtual pressing point at the fingertip position of the index finger.
- the application execution unit 1201 does not change the virtual space even if the virtual gripping point collides with the virtual button.
- The application execution unit 1201 activates the operation assigned to the virtual button when a pressing operation in which the virtual pressing point touches the virtual button is performed (see the sketch below).
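- The following sketch shows this coexistence rule, ignoring collisions of the virtual gripping point with the button and reacting only to the virtual pressing point; the function and argument names are hypothetical.

```python
def handle_coexisting_modes(grip_point_hits_button: bool,
                            press_point_hits_button: bool,
                            activate_button) -> bool:
    """When the gripping and button operation modes coexist, a collision between the
    virtual gripping point and a virtual button is ignored; only the virtual
    pressing point can activate the button."""
    if grip_point_hits_button:
        pass                        # no change to the virtual space
    if press_point_hits_button:
        activate_button()           # fire the process assigned to the virtual button
        return True
    return False

# usage example
handle_coexisting_modes(True, False, lambda: print("button!"))   # nothing happens
handle_coexisting_modes(False, True, lambda: print("button!"))   # prints "button!"
```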
- As described above, the application execution unit 1201 can determine the user's operation mode based on the posture of the fingers and perform user operations on the virtual object for each operation mode.
- In the gripping operation mode, the application execution unit 1201 can perform the gripping operation of the virtual object using the virtual gripping point as shown in FIGS. 20 to 26.
- the controller 110 is equipped with a finger gesture recognition unit 113 such as fingertip contact recognition as shown in FIG.
- When the user attaches the controller 110 to the back of the hand, as a calibration for setting the position of the virtual gripping point, the user may be instructed to take the fingertip position used for the gripping operation, and that position may be recorded.
- The virtual gripping point may also be calibrated by a method different from the above. For example, physical information such as the size of the user's hand, gender, and height is acquired, the size of the user's hand is estimated based on the physical information, and the virtual gripping point is set at a position having a certain offset determined from the estimated hand size.
- The user's physical information may be input to the AR system 100 by the user's utterance or another input device, or may be inferred based on spatial position information obtained through the head sensor unit 120 or the like when the user wears the AR glass (a rough sketch of such an estimate follows below).
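- As a rough illustration only, the following sketch estimates a grip-point offset from body height; the proportionality constants are invented placeholders and not values from the embodiment.

```python
def estimate_grip_offset_from_profile(height_cm: float, is_adult: bool = True) -> float:
    """Very rough sketch: estimate hand length from body height and derive the
    offset distance of the virtual gripping point from the controller body.
    The ratios below are illustrative assumptions, not values from the embodiment."""
    hand_length_cm = height_cm * (0.108 if is_adult else 0.115)   # assumed height-to-hand ratio
    # assume the pinch point sits at roughly 60% of the hand length in front of
    # the back of the hand, where the controller is attached
    return 0.6 * hand_length_cm / 100.0   # offset in meters

print(round(estimate_grip_offset_from_profile(170.0), 3))   # roughly 0.11 m
```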
- However, there is a concern that the virtual gripping point set based on a hand size estimated from the user's physical information will not be accurate.
- The virtual gripping point is the position where the fingertips used for gripping are expected to come into contact with each other, and should originally be set at a position having a certain offset with respect to the user's hand; in practice, however, it is set at a position having a certain offset with respect to the controller 110 main body. The virtual gripping point therefore also changes depending on the position at which the controller 110 is attached to the back of the hand: each time the controller 110 is attached, the offset of the virtual gripping point with respect to the user's hand changes.
- Therefore, a method of calibrating the virtual gripping point using hand position information that can be accurately detected by the hand position detection unit 111 is effective.
- FIG. 37 shows a method of calibrating the virtual gripping point based on the hand position information detected by the hand position detecting unit 111 of the controller 110.
- calibration is performed using two controllers 110R and 110L mounted on the backs of the right and left hands of the user, respectively.
- the finger posture referred to here is a posture in which the thumb of the right hand and the fingertips of other fingers (index finger, etc.) are brought into contact with each other at a specific position of the controller 110L main body attached to the back of the left hand.
- the application execution unit 1201 may display a guide instruction on the AR glass so that the user can guide the user to the finger posture as shown in FIG. 37.
- the right hand position and the left hand position detected by the controllers 110R and 110L are recorded.
- the relative position information of the two controllers 110R and 110L is used.
- the position of the virtual gripping point of the right hand having a certain offset with respect to the controller 110R main body mounted on the back of the right hand can be directly obtained.
- the virtual gripping point of the right hand can be obtained based on the offset amount of the specific position of the controller 110L with respect to the position of the controller 110R main body.
- FIG. 37 shows an example of calibrating the virtual gripping point of the right hand.
- When calibrating the virtual gripping point of the left hand, the roles of the left and right hands are flipped: the user is instructed to take the corresponding finger posture with the left hand, and the virtual gripping point of the left hand is calculated in the same manner as above.
- Alternatively, the position of the virtual gripping point calculated for the right hand may be mirrored left-right and set as the position of the virtual gripping point of the left hand, so that the calibration needs to be performed only once (a minimal sketch of this calibration follows below).
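- The following sketch outlines such a two-controller calibration: the pinch position on the left-hand controller is expressed in the right-hand controller's local frame and used as the right hand's grip-point offset, optionally mirrored for the left hand. The probe offset and the use of rotation matrices are assumptions.

```python
import numpy as np

def calibrate_right_grip_offset(right_pos, right_rot, left_pos, left_rot, probe_offset_left):
    """While the user pinches a known probe point on the left-hand controller 110L with
    the right thumb and index finger, compute that pinch position in the right
    controller's local frame. The resulting local vector is the offset of the
    right-hand virtual gripping point. All variable names are assumptions."""
    right_pos = np.asarray(right_pos, float)
    left_pos = np.asarray(left_pos, float)
    # world position of the probe point on the left controller body
    probe_world = left_pos + np.asarray(left_rot, float) @ np.asarray(probe_offset_left, float)
    # express it in the right controller's local coordinate frame
    return np.asarray(right_rot, float).T @ (probe_world - right_pos)

def mirror_offset_for_left_hand(right_offset, mirror_axis=0):
    """Optionally mirror the calibrated right-hand offset so the calibration
    only needs to be performed once."""
    left_offset = np.array(right_offset, float)
    left_offset[mirror_axis] *= -1.0
    return left_offset

# usage example with identity orientations and a 2 cm probe offset
offset = calibrate_right_grip_offset([0.3, 1.0, 0.0], np.eye(3),
                                     [0.18, 1.0, 0.0], np.eye(3), [0.02, 0.0, 0.0])
print(offset)                             # roughly [-0.1, 0, 0]
print(mirror_offset_for_left_hand(offset))
```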
- A method is also conceivable in which the winding length of the belt 11 is detected by a sensor to obtain information on the circumference of the hand, the length of the fingers corresponding to that circumference is estimated, and the offset distance of the virtual gripping point with respect to the controller 10 main body is calculated.
- FIG. 39 shows a state in which the virtual gripping point 3902 has been erroneously set at a position farther from the fingertips than the position of the virtual gripping point 3901 where it should originally be.
- An aura is generated around the virtual object at the timing when the virtual gripping point 3902, set at the wrong, distant position, enters the grip detection area. The user then tries to grip the virtual object by aiming at the misplaced virtual gripping point 3902 and, as shown in FIG. 41, tries to pick the virtual object at a position deviated from its center. As a result, the gripping operation that the user observes through the AR glass appears unnatural and unrealistic.
- Therefore, the application execution unit 1201 may detect a correction to the offset amount of the virtual gripping point with respect to the controller 110 main body based on the relative positional relationship between the position where the virtual object is arranged and the contact position between the fingertips acquired through the controller 110, and perform running calibration.
- When a controller 110 that is not equipped with the finger posture recognition unit is used and the virtual gripping point is set at an incorrect position, grips occur repeatedly outside the range of the virtual object or at places far from the center of the virtual object. From such a tendency of erroneous gripping operations, the application execution unit 1201 assumes that the virtual gripping point has been set at an erroneous position, and performs running calibration of the virtual gripping point position by, for example, gradually bringing the virtual gripping point closer to the center position of the virtual object where the grip is supposed to be performed.
- the virtual gripping point is a position where the fingertips used for gripping are expected to come into contact with each other when performing the gripping operation at the current hand position (described above).
- Normally, the virtual object is gripped with the virtual gripping point located near its center.
- The application execution unit 1201 therefore assumes, from the tendency of erroneous gripping operations, that the virtual gripping point has been set at the wrong position, and performs running calibration by gradually bringing the virtual gripping point closer to the center position of the virtual object that should be gripped. Once the running calibration has been performed, the user can perform a natural and realistic gripping operation of the virtual object as shown in FIGS. 21 to 26 (a minimal sketch of such running calibration follows below).
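- A minimal sketch of such running calibration is shown below; the exponential blending toward a target offset and the rate value are assumptions about one possible way to "gradually bring the virtual gripping point closer".

```python
import numpy as np

def running_calibration(grip_offset, target_offset_local, rate=0.1):
    """After each grip attempt, nudge the stored offset of the virtual gripping
    point toward the offset at which the grip apparently should have happened
    (e.g. derived from the center of the virtual object the user was aiming at,
    expressed in the controller's local frame). The blending rate is an assumed
    tuning parameter."""
    grip_offset = np.asarray(grip_offset, float)
    target_offset_local = np.asarray(target_offset_local, float)
    # move only a fraction per grip so a single mis-grip does not destabilize the offset
    return (1.0 - rate) * grip_offset + rate * target_offset_local

# usage: the offset converges from an erroneous 16 cm toward 11 cm over repeated grips
offset = np.array([0.0, 0.0, 0.16])
for _ in range(30):
    offset = running_calibration(offset, [0.0, 0.0, 0.11])
print(np.round(offset, 3))   # close to [0, 0, 0.11]
```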
- the virtual gripping point is set to a position where the fingertips used for gripping are expected to come into contact with each other when the gripping operation is performed at the current hand position (or in the gripping operation mode).
- In the gripping flow of the virtual object (see, for example, FIGS. 21 to 26), if a marker indicating the virtual gripping point is displayed on the AR glass, the user can recognize the state in which the virtual gripping point collides with the virtual object.
- The hit area defined for the virtual object does not have to match the apparent size of the virtual object; for example, the hit area may be set somewhat larger than the size of the virtual object.
- the virtual gripping point may be treated as a "virtual gripping area" such as a sphere having a region volume instead of a point having no region, and the size of this region may be controlled.
- the AR system 100 displays the virtual gripping point 4301 at a position having a certain offset with respect to the back of the hand as shown in FIG. 43 in the gripping operation mode.
- the controller 110 is omitted for simplification.
- FIG. 45 shows an example in which the visibility is improved by changing the virtual gripping point 4501 from a “point” to a “cross”. Even if the position of the virtual grip point 4501 itself is hidden by the occlusion by the hand, the user can understand that the virtual grip point 4501 is at the position where the crosses intersect.
- When the application execution unit 1201 detects occlusion of the virtual gripping point by the hand, either from the position of the hand acquired through the controller or from the image captured by the outward camera 121, the method of displaying the virtual gripping point may be switched.
- If the virtual gripping point is displayed in the same color as a nearby virtual object (for example, the one the user is trying to grip), it becomes difficult to see. Visibility may therefore be improved by displaying the virtual gripping point in the complementary color of that virtual object.
- Similarly, visibility is improved if the collider or the virtual pressing point is displayed in the complementary color of the target virtual object or virtual button (see the small sketch below).
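- As a small illustration, the complementary color can be obtained by rotating the hue by 180 degrees, as sketched below (the standard-library colorsys module is used; the 0-1 RGB convention is an assumption).

```python
import colorsys

def complementary_color(rgb):
    """Return the complementary color (hue shifted by 180 degrees) of an RGB
    triple in the 0-1 range, e.g. for drawing the virtual gripping point over a
    virtual object of a known color."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)

print(complementary_color((1.0, 0.0, 0.0)))   # red -> cyan (0.0, 1.0, 1.0)
```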
- As described above, the AR system 100 detects the position of the controller attached to the back of the user's hand, sets a virtual gripping point at a position with a certain offset from that position, and performs the gripping interaction based on the positional relationship between the virtual gripping point and the virtual object and on the contact state between the fingertips trying to grip the virtual object. Even when the finger positions cannot be detected accurately because of problems such as occlusion or detection accuracy, an intuitive and comfortable gripping interaction with virtual objects can be realized based on the position and orientation information of the controller.
- the AR system 100 is configured to switch the operation mode performed by the user's fingers on the virtual object based on the posture information of the fingers acquired through the controller.
- the operation mode includes a grip operation mode in which the virtual object is gripped by contact between fingertips, and a contact operation mode in which the virtual object is touched with the palm or fingertip. Therefore, the user can easily and intuitively perform a plurality of operations for grasping and touching the same virtual object without assisting with a UI or the like.
- In addition, the AR system 100 is configured to switch, based on the posture information of the fingers acquired through the controller, to a button operation mode in which a virtual button is pressed with a fingertip, in addition to the gripping operation mode and the contact operation mode. The user can therefore intuitively switch between the three operation modes without any load and can smoothly realize the intended operation on the virtual object. It is also possible to prevent the user from unintentionally invoking the operation of pressing a virtual button.
- Furthermore, the AR system 100 is configured to calibrate the position of the virtual gripping point based on the position information of the hand acquired through the controller. There are individual differences in the size of users' hands, and the position of the controller with respect to the back of the hand can change each time the controller is attached, so the offset of the virtual gripping point with respect to the position of the controller varies with individual differences and with the attachment position. According to the AR system 100 of the present disclosure, the virtual gripping point can be corrected to an accurate position by calibration, so each user can perform an appropriate gripping operation of virtual objects whenever the controller is attached to the back of the hand.
- The AR system 100 is also configured to calculate a correction value for the offset amount of the virtual gripping point with respect to the back of the hand, based on the contact position between the fingertips and the amount of deviation of the virtual gripping point when the user performs a gripping operation on a virtual object in the gripping operation mode, and to dynamically calibrate the position of the virtual gripping point. Therefore, while the user performs gripping operations on virtual objects, the discrepancy between the contact position between the fingertips and the position of the virtual gripping point is naturally resolved.
- the AR system 100 can display the virtual gripping point using the AR glass to support the user's gripping operation of the virtual object. Further, the AR system 100 according to the present disclosure displays a collider for contact determination on the palm or fingertip in the contact operation mode, and displays a virtual pressing point on the fingertip that presses the virtual button in the button operation mode. It is possible to assist the user in operating the virtual object for each operation mode.
- The AR system 100 is also configured to change the method of displaying the virtual gripping point in situations where the visibility of the virtual gripping point is impaired by occlusion between the virtual gripping point and the user's hand in the gripping operation mode. Therefore, even if the virtual gripping point is occluded by the hand, its visibility is ensured by changing the display method, so the user can easily grasp the position of the virtual gripping point and properly perform the gripping operation of the virtual object.
- Although the present specification has mainly described embodiments in which the interaction between fingertips and virtual objects according to the present disclosure is applied to an AR system, the gist of the present disclosure is not limited to this.
- The present disclosure can similarly be applied to a VR system in which a virtual space is perceived as reality, an MR system that mixes the real and the virtual, and the like, and the interaction between fingertips and virtual objects can be realized there as well.
- the technology disclosed in this specification can also have the following configuration.
- (1) An information processing device comprising: an acquisition unit that acquires the position of the user's hand and the gestures of the fingers; and a control unit that controls the display operation of a display device that superimposes and displays virtual objects in real space, wherein the control unit sets a virtual gripping point at a position having a certain offset with respect to the hand and controls the display device so as to display the gripping operation of the virtual object based on the positional relationship between the virtual gripping point and the virtual object and the contact state between the fingertips.
- (2) The information processing device according to (1) above, wherein the acquisition unit acquires the position of the hand and the gestures of the fingers based on sensor information from a sensor attached to the back of the hand.
- (3) The information processing device according to any one of (1) and (2) above, wherein the acquisition unit further acquires the posture of the user's fingers, and the control unit controls switching of the operation mode performed by the user's fingers on the virtual object based on the posture information of the fingers.
- (4) In the above information processing device, the control unit controls mode switching between a gripping operation mode in which the virtual object is gripped by contact between fingertips and a contact operation mode in which the virtual object is touched with the palm or fingertips.
- (5) In the above information processing device, the control unit further controls mode switching to a button operation mode in which a virtual button is pressed with a fingertip.
- (6) The information processing device according to any one of (3) to (5) above, wherein the control unit determines the operation mode using information on the degree of opening of the fingers or the distance between the fingertips.
- (7) The information processing device according to any one of (1) to (6) above, wherein the control unit calibrates the position of the virtual gripping point based on the position of the hand acquired by the acquisition unit.
- (8) The information processing device according to any one of (1) to (7) above, wherein the control unit calculates a correction value for the offset amount of the virtual gripping point with respect to the hand based on the contact position between the fingertips and the deviation amount of the virtual gripping point when the user performs the gripping operation of the virtual object, and calibrates the position of the virtual gripping point.
- (9) The information processing device according to any one of (1) to (8) above, wherein the control unit controls the display of the virtual gripping point by the display device.
- (10) The information processing device according to (9) above, wherein the control unit controls the display device so as to change the method of displaying the virtual gripping point in a situation where the visibility of the virtual gripping point is impaired by occlusion between the virtual gripping point and the user's hand.
- (11) An information processing method in which a virtual gripping point is set at a position having a certain offset with respect to the hand, and the gripping operation of the virtual object is displayed based on the positional relationship between the virtual gripping point and the virtual object and the contact state between the fingertips.
- (12) A computer program for causing a computer to function as: an acquisition unit that acquires the position of the user's hand and the gestures of the fingers; and a control unit that controls the display operation of a display device that superimposes and displays virtual objects in real space, wherein the control unit sets a virtual gripping point at a position having a certain offset with respect to the hand and controls the display device so as to display the gripping operation of the virtual object based on the positional relationship between the virtual gripping point and the virtual object and the contact state between the fingertips.
- (13) An augmented reality system comprising: a display device that superimposes and displays virtual objects in real space; an acquisition unit that acquires the position of the user's hand and the gestures of the fingers; and a control unit that controls the display operation of the display device, wherein the control unit sets a virtual gripping point at a position having a certain offset with respect to the hand and controls the display device so as to display the gripping operation of the virtual object based on the positional relationship between the virtual gripping point and the virtual object and the contact state between the fingertips.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022509330A JPWO2021192589A1 (enExample) | 2020-03-24 | 2021-01-27 | |
| EP21775558.6A EP4099135A4 (en) | 2020-03-24 | 2021-01-27 | Information processing device, information processing method, computer program, and augmented reality sensing system |
| CN202180021726.1A CN115298646A (zh) | 2020-03-24 | 2021-01-27 | 信息处理设备、信息处理方法、计算机程序和增强现实感测系统 |
| US17/906,321 US20230095328A1 (en) | 2020-03-24 | 2021-01-27 | Information processing apparatus, information processing method, computer program, and augmented reality system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020-053386 | 2020-03-24 | ||
| JP2020053386 | 2020-03-24 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021192589A1 true WO2021192589A1 (ja) | 2021-09-30 |
Family
ID=77891293
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/002883 Ceased WO2021192589A1 (ja) | 2020-03-24 | 2021-01-27 | 情報処理装置及び情報処理方法、コンピュータプログラム、並びに拡張現実感システム |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20230095328A1 (enExample) |
| EP (1) | EP4099135A4 (enExample) |
| JP (1) | JPWO2021192589A1 (enExample) |
| CN (1) | CN115298646A (enExample) |
| WO (1) | WO2021192589A1 (enExample) |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN117256024A (zh) * | 2021-04-12 | 2023-12-19 | 斯纳普公司 | 使用力反馈为视力障碍者提供ar |
| CN116820237A (zh) * | 2023-06-28 | 2023-09-29 | 中兴通讯股份有限公司 | 手势控制方法、可穿戴设备、计算机可读介质 |
| US20250093960A1 (en) * | 2023-09-18 | 2025-03-20 | Htc Corporation | Method for controlling view angle, host, and computer readable storage medium |
| CN120307310B (zh) * | 2025-06-19 | 2025-09-12 | 元梦空间文化传播(成都)有限公司 | 基于动态释放距离调整的抓取操作方法、系统及程序产品 |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10198855B2 (en) * | 2016-07-20 | 2019-02-05 | Colopl, Inc. | Method of providing virtual space, method of providing virtual experience, system and medium for implementing the methods |
| US11875012B2 (en) * | 2018-05-25 | 2024-01-16 | Ultrahaptics IP Two Limited | Throwable interface for augmented reality and virtual reality environments |
| US11422623B2 (en) * | 2019-10-23 | 2022-08-23 | Interlake Research, Llc | Wrist worn computing device control systems and methods |
-
2021
- 2021-01-27 EP EP21775558.6A patent/EP4099135A4/en not_active Withdrawn
- 2021-01-27 JP JP2022509330A patent/JPWO2021192589A1/ja active Pending
- 2021-01-27 US US17/906,321 patent/US20230095328A1/en not_active Abandoned
- 2021-01-27 WO PCT/JP2021/002883 patent/WO2021192589A1/ja not_active Ceased
- 2021-01-27 CN CN202180021726.1A patent/CN115298646A/zh not_active Withdrawn
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2000501033A (ja) * | 1995-11-30 | 2000-02-02 | ヴァーチャル テクノロジーズ インコーポレイテッド | 触覚をフィードバックする人間/機械インターフェース |
| JP2004234253A (ja) * | 2003-01-29 | 2004-08-19 | Canon Inc | 複合現実感呈示方法 |
| JP2011128220A (ja) * | 2009-12-15 | 2011-06-30 | Toshiba Corp | 情報提示装置、情報提示方法及びプログラム |
| JP2018013938A (ja) * | 2016-07-20 | 2018-01-25 | 株式会社コロプラ | 仮想空間を提供する方法、仮想体験を提供する方法、プログラム、および記録媒体 |
| JP2019046291A (ja) | 2017-09-05 | 2019-03-22 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および画像表示方法 |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230133168A1 (en) * | 2021-10-31 | 2023-05-04 | Hongfujin Precision Electrons (Yantai) Co., Ltd. | Method for identifying human postures and gestures for interaction purposes and portable hand-held device |
| JP2023067744A (ja) * | 2021-10-31 | 2023-05-16 | 鴻富錦精密電子(煙台)有限公司 | 姿勢認識方法、姿勢認識機器及び記憶媒体 |
| JP7402941B2 (ja) | 2021-10-31 | 2023-12-21 | 鴻富錦精密電子(煙台)有限公司 | 姿勢認識方法、姿勢認識機器及び記憶媒体 |
| US12380726B2 (en) * | 2021-10-31 | 2025-08-05 | Hongfujin Precision Electrons (Yantai) Co., Ltd. | Method for identifying human postures and gestures for interaction purposes and portable hand-held device |
| WO2023176420A1 (ja) * | 2022-03-18 | 2023-09-21 | ソニーグループ株式会社 | 情報処理装置及び情報処理方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4099135A1 (en) | 2022-12-07 |
| JPWO2021192589A1 (enExample) | 2021-09-30 |
| EP4099135A4 (en) | 2023-10-11 |
| US20230095328A1 (en) | 2023-03-30 |
| CN115298646A (zh) | 2022-11-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2021192589A1 (ja) | 情報処理装置及び情報処理方法、コンピュータプログラム、並びに拡張現実感システム | |
| JP7589696B2 (ja) | 情報処理装置及び情報処理方法、コンピュータプログラム、並びに拡張現実感システム | |
| US10191281B2 (en) | Head-mounted display for visually recognizing input | |
| CN105026983B (zh) | 提供眼睛注视校准的头戴式显示器及其控制方法 | |
| US20150002475A1 (en) | Mobile device and method for controlling graphical user interface thereof | |
| US10534432B2 (en) | Control apparatus | |
| US20230341936A1 (en) | Information processing device, information processing method, computer program, and augmented reality system | |
| US20170293351A1 (en) | Head mounted display linked to a touch sensitive input device | |
| WO2016189372A2 (en) | Methods and apparatus for human centric "hyper ui for devices"architecture that could serve as an integration point with multiple target/endpoints (devices) and related methods/system with dynamic context aware gesture input towards a "modular" universal controller platform and input device virtualization | |
| KR20140090968A (ko) | 시선 캘리브레이션을 제공하는 헤드 마운트 디스플레이 및 그 제어 방법 | |
| AU2013347935A1 (en) | Computing interface system | |
| WO2018003862A1 (ja) | 制御装置、表示装置、プログラムおよび検出方法 | |
| EP3943167A1 (en) | Device provided with plurality of markers | |
| JPWO2017134732A1 (ja) | 入力装置、入力支援方法および入力支援プログラム | |
| US12197652B2 (en) | Control device and control method with set priorities for input operations in competitive relationship | |
| CN115777091A (zh) | 检测装置及检测方法 | |
| WO2021145068A1 (ja) | 情報処理装置及び情報処理方法、コンピュータプログラム、並びに拡張現実感システム | |
| US9940900B2 (en) | Peripheral electronic device and method for using same | |
| US20250165078A1 (en) | Information processing apparatus and information processing method | |
| JPWO2020026380A1 (ja) | 表示装置、表示方法、プログラム、ならびに、非一時的なコンピュータ読取可能な情報記録媒体 | |
| EP4610784A1 (en) | Information processing device and information processing method | |
| CN205620935U (zh) | 一种用于虚拟现实系统的道具 | |
| US20250377740A1 (en) | Dock tracking for an ar/vr device | |
| EP4610795A1 (en) | Information processing device and information processing method | |
| JP2024154673A (ja) | 処理装置、制御方法、プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21775558; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2022509330; Country of ref document: JP; Kind code of ref document: A |
| | ENP | Entry into the national phase | Ref document number: 2021775558; Country of ref document: EP; Effective date: 20220903 |
| | NENP | Non-entry into the national phase | Ref country code: DE |