WO2022174575A1 - Method and system for switching the interaction mode of a head-mounted device - Google Patents
- Publication number
- WO2022174575A1 (PCT/CN2021/116300)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- standard deviation
- tracking
- data
- handle
- interaction mode
- Prior art date
Classifications
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0338—Pointing devices displaced or positioned by the user with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
- G06F3/0346—Pointing devices displaced or positioned by the user with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0383—Signal control means within the pointing device
- A63F13/211—Input arrangements for video game devices using inertial sensors, e.g. accelerometers or gyroscopes
- A63F13/213—Input arrangements for video game devices comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/26—Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
- Y02D30/70—Reducing energy consumption in wireless communication networks
Definitions
- the present disclosure relates to the technical field of head-mounted integrated devices, and more particularly, to a method and system for switching an interaction mode of a head-mounted device.
- Mainstream VR (Virtual Reality), AR (Augmented Reality), and MR (Mixed Reality) headsets can support both handle tracking controller interaction and gesture recognition interaction, but currently the user must switch between the two interaction methods.
- Switching is generally configured manually in the user UI (User Interface): switching options for the two interaction modes, handle tracking controller and gesture recognition, are provided in the settings, and the user switches between them by hand.
- The processing of the handle tracking controller and the computation for gesture recognition interaction are generally mutually exclusive; that is, when the system is performing handle tracking interaction, the processing module for gesture recognition interaction is suspended by the system, and when the system is running the processing module for gesture recognition interaction, the handle tracking interaction module is suspended.
- In the traditional processing mechanism, the system receives a control command through the application layer to decide which module should currently run. In the VR/AR/MR field, however, multi-user usage data show that the handle control tracker is usually used more frequently than gesture recognition; the tedious manual switching between the two interaction methods seriously reduces the usage of gesture recognition and degrades the user experience.
- The embodiments of the present disclosure provide a method and system for switching the interaction mode of a head-mounted device, which can solve the problems of cumbersome operation, poor flexibility, and degraded user experience in the mode switching process of current head-mounted devices.
- An embodiment of the present disclosure provides a method for switching the interaction mode of a head-mounted device, wherein the interaction mode of the head-mounted device includes a switchable handle tracking interaction mode and a bare-hand tracking interaction mode. The switching method includes: obtaining the Six Degrees of Freedom (6DoF) tracking data and Inertial Measurement Unit (IMU) data of the handle, where the 6DoF tracking data includes the position data and attitude data of the handle; and respectively obtaining the standard deviation of the position data and the standard deviation of the attitude data.
- Then, judge whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data within the current first preset time meet the first preset condition; when they meet the first preset condition, it is determined that the handle tracking interaction mode is not in use, and the bare-hand tracking interaction mode is activated. The standard deviation of the accelerometer data within the second preset time is then obtained in real time, and it is judged whether it meets the second preset condition; when it does, the bare-hand tracking interaction mode is suspended and the handle tracking interaction mode is activated.
- The process of judging whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition includes: judging whether the standard deviation of the position data and the standard deviation of the attitude data are both within the first preset threshold; when they are, continuing to judge whether the standard deviation of the accelerometer data is within the second preset threshold; when the standard deviations of the position data and attitude data are both within the first preset threshold and the standard deviation of the accelerometer data is within the second preset threshold, determining that the first preset condition is met.
- the first preset time is 1s~4s
- the second preset time is 1s~4s
- the first preset threshold is 0.6
- the second preset threshold is 0.2.
- the first preset time is 1.5s.
- The process of judging whether the standard deviation of the accelerometer data meets the second preset condition includes: judging whether the standard deviation of the accelerometer data is greater than the third preset threshold for a consecutive preset number of frames within the second preset time; when it is, determining that the second preset condition is met.
- the second preset time is 1s~4s; the third preset threshold is 0.03.
- the second preset time is 1.5s.
- the handle tracking interaction mode includes: a signal connection between the handle and the head-mounted device, and performing virtual reality interaction through the handle;
- the bare-hand tracking interaction mode includes: collecting and recognizing the gesture information and tracking information of the bare hand, and performing virtual reality interaction through the gesture information and tracking information.
- At least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device; the bare-hand tracking cameras are configured to capture gesture information and tracking information of the bare hands, and the environment tracking cameras are configured to obtain 6DoF positioning information of the head-mounted device relative to the physical environment. The bare-hand tracking camera and/or the environment tracking camera includes: a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera.
- At least two handle tracking controllers are provided in the handle, and the handle tracking controllers are configured to obtain 6DoF tracking information of the handle relative to the headset in real time; the handle tracking controllers include ultrasonic sensors, electromagnetic sensors, optical sensors, and 9-axis inertial navigation sensors.
- The switching method further includes a default mode setting process, wherein the default mode setting process includes: setting the default mode to the handle tracking interaction mode or the bare-hand tracking interaction mode through a control system connected to the head-mounted device; the control system is also configured to control the switching between the handle tracking interaction mode and the bare-hand tracking interaction mode.
- When the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, the control system starts the bare-hand tracking interaction mode and correspondingly closes the handle tracking interaction mode; when the standard deviation of the accelerometer data meets the second preset condition, the handle tracking interaction mode is activated through the control system, and the bare-hand tracking interaction mode is correspondingly closed.
- the switching method further includes: when the standard deviation of the accelerometer data does not meet the second preset condition, continuing to execute the bare-hand tracking interaction mode, and suspending the handle tracking interaction mode.
- An embodiment of the present disclosure provides a head-mounted device interaction mode switching system, including: a tracking data and IMU data acquisition unit, configured to acquire 6Dof tracking data and IMU data of a handle, where the 6Dof tracking data includes position data and attitude data of the handle
- the standard deviation obtaining unit is set to obtain the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data respectively
- The first preset condition judgment unit is configured to judge whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data within the current first preset time meet the first preset condition; when they meet the first preset condition, it is determined that the handle tracking interaction mode is not activated, and the bare-hand tracking interaction mode is activated. The second preset condition judgment unit is configured to obtain the standard deviation of the accelerometer data within the second preset time in real time and judge whether it meets the second preset condition; when it does, the bare-hand tracking interaction mode is suspended and the handle tracking interaction mode is activated.
- The first preset condition judgment unit is configured to judge whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition by: judging whether the standard deviation of the position data and the standard deviation of the attitude data are both within the first preset threshold; when they are, judging whether the standard deviation of the accelerometer data is within the second preset threshold; when the standard deviations of the position data and attitude data are both within the first preset threshold and the standard deviation of the accelerometer data is within the second preset threshold, determining that the first preset condition is met.
- The second preset condition judgment unit is configured to judge whether the standard deviation of the accelerometer data meets the second preset condition by: judging whether the standard deviation of the accelerometer data is greater than the third preset threshold for a consecutive preset number of frames within the second preset time; when it is, determining that the second preset condition is met.
- the handle tracking interaction mode includes: the handle is signal-connected to the headset, and virtual reality interaction is performed through the handle; the bare-hand tracking interaction mode includes: collecting and identifying Gesture information and tracking information of the bare hand, and virtual reality interaction is performed through the gesture information and the tracking information.
- At least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device; wherein, the bare-hand tracking cameras are used to capture gesture information of the bare hands and tracking information, the environment tracking camera is used to obtain the 6DoF positioning information of the head-mounted device relative to the physical environment in which it is located; the bare-hand tracking camera and/or the environment tracking camera include: a depth camera, a binocular infrared camera , RGB camera, or monochrome camera.
- At least two handle tracking controllers are provided in the handle, and the handle tracking controllers are used to obtain 6DoF tracking information of the handle relative to the head-mounted device in real time; the handle tracking controllers include ultrasonic sensors, electromagnetic sensors, optical sensors, and 9-axis inertial navigation sensors.
- An embodiment of the present disclosure provides a non-transitory computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, it implements the method described in any of the foregoing embodiments or exemplary embodiments.
- By judging in real time whether the standard deviations meet the first preset condition, it is determined whether to enable the bare-hand tracking interaction mode and disable the handle tracking interaction mode; at the same time, by judging in real time whether the standard deviation of the accelerometer data meets the second preset condition, it is determined whether to disable the bare-hand tracking interaction mode and re-enable the handle tracking interaction mode. The interaction mode can thus be switched stably, naturally, and smoothly based on the user's current actions or behavioral habits, with a good user experience and a wide range of applications.
- FIG. 1 is a flowchart of a method for switching an interaction mode of a head mounted device according to an embodiment of the present disclosure
- FIG. 2 is a schematic block diagram of a system for switching an interactive mode of a head mounted device according to an embodiment of the present disclosure
- FIG. 3 is a logic diagram of an electronic device according to an embodiment of the present disclosure.
- FIG. 1 shows a flow of a method for switching an interaction mode of a head mounted device according to an embodiment of the present disclosure.
- A method for switching the interaction mode of a head-mounted device according to an embodiment of the present disclosure, wherein the interaction mode of the head-mounted device includes a switchable handle tracking interaction mode and a bare-hand tracking interaction mode; the steps of the switching method include:
- S120 respectively obtain the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data;
- S130 Determine whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data within the current first preset time meet the first preset condition;
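As an illustration of step S120, the standard deviations can be computed over a sliding window of recent samples. The sketch below is a hypothetical Python implementation; the per-axis reduction, window handling, and the function name are assumptions for illustration and are not taken from the disclosure:

```python
import numpy as np

def window_stds(position, attitude, accel):
    """Compute one standard-deviation scalar per data stream over a window.

    position: (N, 3) handle positions from the 6DoF tracking data
    attitude: (N, 3) handle attitude angles from the 6DoF tracking data
    accel:    (N, 3) accelerometer samples from the IMU data

    Each stream is reduced to the largest per-axis standard deviation,
    so motion concentrated on a single axis is not averaged away.
    """
    pos_std = np.std(np.asarray(position, dtype=float), axis=0).max()
    att_std = np.std(np.asarray(attitude, dtype=float), axis=0).max()
    acc_std = np.std(np.asarray(accel, dtype=float), axis=0).max()
    return pos_std, att_std, acc_std
```

A stationary handle yields near-zero values on all three streams, which is exactly the situation the first preset condition is designed to detect.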
- the process of judging whether the standard deviation of the position data, the standard deviation of the attitude data and the standard deviation of the accelerometer data within the first preset time meets the first preset condition includes:
- The first preset time may be set to 1s~4s, preferably 1.5s; the second preset time is 1s~4s; the first preset threshold is 0.6; and the second preset threshold is 0.2.
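Using the example thresholds of 0.6 (position and attitude) and 0.2 (accelerometer), the first preset condition can be sketched as follows. This is a minimal illustration, not the claimed implementation; the function name and inclusive comparisons are assumptions:

```python
FIRST_PRESET_THRESHOLD = 0.6   # bound for position and attitude standard deviations
SECOND_PRESET_THRESHOLD = 0.2  # bound for the accelerometer standard deviation

def first_preset_condition(pos_std, att_std, acc_std):
    """Return True when the handle appears to be set down and stationary.

    Following the judging process described above: the position and
    attitude standard deviations must both lie within the first preset
    threshold, and only then is the accelerometer standard deviation
    checked against the second preset threshold.
    """
    if pos_std <= FIRST_PRESET_THRESHOLD and att_std <= FIRST_PRESET_THRESHOLD:
        return acc_std <= SECOND_PRESET_THRESHOLD
    return False
```

When this returns True, the system would determine that the handle tracking interaction mode is not in use and activate the bare-hand tracking interaction mode.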
- The process of judging whether the standard deviation of the accelerometer data meets the second preset condition includes: judging whether the standard deviation of the accelerometer data is greater than the third preset threshold for a consecutive preset number of frames within the second preset time; when it is, determining that the second preset condition is met.
- the second preset time can be set to 1s to 4s, preferably 1.5s
- the third preset threshold can be set to 0.03
- The consecutive preset number of frames may be, for example, 3-10 or 5-10 consecutive frames, preferably 5. In other words, when the standard deviation of the accelerometer data is greater than 0.03 for 5 consecutive frames within 1.5s, it can be considered that the user has started to use the handle again, so the bare-hand tracking interaction mode can be suspended and the handle tracking interaction mode activated.
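The consecutive-frame test can be sketched as a simple run-length check over the per-frame standard deviations collected within the second preset time. The bookkeeping below is an assumption for illustration; the disclosure specifies only the example threshold of 0.03 and 5 consecutive frames within 1.5s:

```python
THIRD_PRESET_THRESHOLD = 0.03  # example threshold for accelerometer std
CONSECUTIVE_FRAMES = 5         # example preset number of consecutive frames

def second_preset_condition(accel_stds):
    """Return True when the accelerometer standard deviation exceeds the
    third preset threshold for enough consecutive frames, suggesting the
    user has picked the handle up again.

    accel_stds: per-frame accelerometer standard deviations collected
    within the second preset time (e.g. 1.5s).
    """
    run = 0
    for std in accel_stds:
        run = run + 1 if std > THIRD_PRESET_THRESHOLD else 0
        if run >= CONSECUTIVE_FRAMES:
            return True
    return False
```

Requiring a run of consecutive frames, rather than a single spike, filters out transient jolts (e.g. the desk being bumped) so the mode only switches on sustained handle motion.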
- The first preset time, the second preset time, the first preset threshold, the second preset threshold, the third preset threshold, and the preset number of frames can all be set and adjusted according to the specific application product, application environment, or customer needs, and are not limited to the above specific values.
- The handle tracking interaction mode includes: the handle is signal-connected to the headset, and the user operates the handle to perform virtual reality interaction. The bare-hand tracking interaction mode includes: the cameras provided on the headset collect and recognize the gesture information and tracking information of the bare hand, and virtual reality interaction is performed through the gesture information and tracking information.
- At least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device; the bare-hand tracking camera is set to capture gesture information and tracking information of the bare hand, and the environment tracking camera is set to obtain the 6DoF positioning information of the head-mounted device relative to the physical environment.
- At least two handle tracking controllers are arranged in the handle, and the handle tracking controllers are set to obtain the 6DoF tracking information of the handle relative to the headset in real time; the handle tracking controllers include ultrasonic sensors, electromagnetic sensors, optical sensors, 9-axis inertial navigation sensors, and other types of sensors.
- The method for switching the interaction mode of the head-mounted device of the present disclosure further includes a default mode setting process: the default mode is set to the handle tracking interaction mode or the bare-hand tracking interaction mode through the control system connected with the head-mounted device; the control system is also set to control the switching between the handle tracking interaction mode and the bare-hand tracking interaction mode.
- When the standard deviations of the above parameters satisfy the first preset condition, the bare-hand tracking interaction mode is activated through the control system, and the handle tracking interaction mode is correspondingly disabled; when the standard deviation of the accelerometer data satisfies the second preset condition, the handle tracking interaction mode is activated through the control system, and the bare-hand tracking interaction mode is correspondingly disabled.
- the present disclosure proposes an automatic switching method according to the user's use characteristics of the handle control tracker and gesture recognition.
- When the user wants to interact through the handle, the system can automatically trigger the tracking module of the handle control tracker (handle tracking interaction mode) in real time; when the user wants to interact with virtual reality through gesture recognition, the system can automatically trigger the gesture recognition tracking module (bare-hand tracking interaction mode) in real time. In this way, interaction is stable and natural, and the modes can be switched in real time.
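Putting the two conditions together, the automatic switching can be viewed as a small state machine driven by the two judgments. The sketch below is a hedged illustration only; the mode names and the single-function polling structure are assumptions, not the claimed control-system design:

```python
HANDLE_MODE = "handle_tracking"       # handle tracking interaction mode
BARE_HAND_MODE = "bare_hand_tracking" # bare-hand tracking interaction mode

def next_mode(current_mode, first_condition_met, second_condition_met):
    """Decide which interaction mode should be active in the next cycle.

    - In handle mode, a stationary handle (first preset condition met)
      suspends handle tracking and activates bare-hand tracking.
    - In bare-hand mode, renewed handle motion (second preset condition
      met) suspends bare-hand tracking and reactivates handle tracking.
    - Otherwise the current mode continues unchanged.
    """
    if current_mode == HANDLE_MODE and first_condition_met:
        return BARE_HAND_MODE
    if current_mode == BARE_HAND_MODE and second_condition_met:
        return HANDLE_MODE
    return current_mode
```

In a real system this decision would run once per tracking frame, with the two condition flags computed from the most recent windows of 6DoF and IMU data.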
- the present disclosure also provides a switching system of the interaction mode of the head-mounted device.
- FIG. 2 shows the schematic logic of the switching system of the interaction mode of the head mounted device according to the embodiment of the present disclosure.
- the switching system 200 of the interaction mode of the head mounted device includes:
- the tracking data and IMU data acquisition unit 210 is configured to acquire 6Dof tracking data and IMU data of the handle, where the 6Dof tracking data includes position data and attitude data of the handle.
- the standard deviation obtaining unit 220 is configured to obtain the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data, respectively.
- The first preset condition judgment unit 230 is configured to judge whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data within the current first preset time meet the first preset condition; when the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, it is determined that the handle tracking interaction mode is not activated, and the bare-hand tracking interaction mode is activated.
- The second preset condition judgment unit 240 is configured to acquire the standard deviation of the accelerometer data within the second preset time in real time, and to judge whether it meets the second preset condition; when the standard deviation of the accelerometer data meets the second preset condition, the bare-hand tracking interaction mode is suspended, and the handle tracking interaction mode is activated.
- FIG. 3 shows a schematic structure of an electronic device according to an embodiment of the present disclosure.
- the electronic device 1 of the present disclosure may be a terminal device with computing functions, such as a VR/AR/MR head-mounted device, a server, a smart phone, a tablet computer, a portable computer, a desktop computer, and the like.
- the electronic device 1 includes: a processor 12 , a memory 11 , a network interface 14 and a communication bus 15 .
- the memory 11 includes at least one type of readable storage medium.
- the at least one type of readable storage medium may be a non-volatile storage medium such as a flash memory, a hard disk, a multimedia card, a card-type memory 11 or the like.
- the readable storage medium may be an internal storage unit of the electronic device 1 , such as a hard disk of the electronic device 1 .
- the readable storage medium may also be the external memory 11 of the electronic device 1, for example, a plug-in hard disk or a smart memory card (Smart Media Card, SMC) equipped on the electronic device 1 , Secure Digital (Secure Digital, SD) card, flash memory card (Flash Card) and so on.
- the readable storage medium of the memory 11 is generally configured to store the interactive mode switching program 10 of the head mounted device installed in the electronic device 1 and the like.
- the memory 11 may also be arranged to temporarily store data that has been output or is to be output.
- the processor 12 may, in some embodiments, be a central processing unit (CPU), a microprocessor, or another data processing chip, configured to run program code or process data stored in the memory 11, for example, to execute the interactive mode switching program 10 of the head-mounted device.
- the network interface 14 may optionally include a standard wired interface, a wireless interface (such as a WI-FI interface), and is usually configured to establish a communication connection between the electronic device 1 and other electronic devices.
- a communication bus 15 is provided to enable communication among these components.
- Figure 1 only shows the electronic device 1 with components 11-15, but it should be understood that implementation of all of the components shown is not a requirement and that more or fewer components may be implemented instead.
- the electronic device 1 may further include a user interface
- the user interface may include an input unit such as a keyboard, a voice input device such as a microphone or other equipment with a voice recognition function, and a voice output device such as a speaker or a headset.
- the user interface may also include a standard wired interface and a wireless interface.
- the electronic device 1 may further include a display, which may also be referred to as a display screen or a display unit.
- it can be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an organic light-emitting diode (OLED) touch device, and the like.
- the display is arranged to display information processed in the electronic device 1 and arranged to display a visual user interface.
- the electronic device 1 further includes a touch sensor.
- the area provided by the touch sensor for the user to perform a touch operation is called a touch area.
- the touch sensor described herein may be a resistive touch sensor, a capacitive touch sensor, or the like.
- the touch sensor includes not only a contact-type touch sensor, but also a proximity-type touch sensor and the like.
- the touch sensor may be a single sensor, or may be a plurality of sensors arranged in an array, for example.
- the memory 11, as a computer storage medium, may include an operating system and the interactive mode switching program 10 of the head-mounted device; when the processor 12 executes the interactive mode switching program 10 stored in the memory 11, the steps of the method for switching the interactive mode of the head-mounted device are implemented.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (20)
- A method for switching interaction modes of a head-mounted device, wherein the interaction modes of the head-mounted device include a handle tracking interaction mode and a bare-hand tracking interaction mode that are switchable between each other; the switching method comprises: acquiring six-degrees-of-freedom (6DoF) tracking data and inertial measurement unit (IMU) data of a handle, the 6DoF tracking data including position data and attitude data of the handle; separately acquiring a standard deviation of the position data, a standard deviation of the attitude data, and a standard deviation of accelerometer data in the IMU data; determining whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data within a current first preset time meet a first preset condition; wherein, when the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, determining that the handle tracking interaction mode is not activated, and activating the bare-hand tracking interaction mode; meanwhile, acquiring, in real time, the standard deviation of the accelerometer data within a second preset time, and determining whether the standard deviation of the accelerometer data meets a second preset condition; wherein, when the standard deviation of the accelerometer data meets the second preset condition, suspending the bare-hand tracking interaction mode and activating the handle tracking interaction mode.
- The method for switching interaction modes of a head-mounted device according to claim 1, wherein the process of determining whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition comprises: determining whether the standard deviation of the position data and the standard deviation of the attitude data are both within a first preset threshold, and, when both are within the first preset threshold, further determining whether the standard deviation of the accelerometer data is within a second preset threshold; when the standard deviation of the position data and the standard deviation of the attitude data are both within the first preset threshold and the standard deviation of the accelerometer data is within the second preset threshold, determining that the first preset condition is met.
- The method for switching interaction modes of a head-mounted device according to claim 2, wherein the first preset time is 1 s to 4 s, the second preset time is 1 s to 4 s, the first preset threshold is 0.6, and the second preset threshold is 0.2.
- The method for switching interaction modes of a head-mounted device according to claim 3, wherein the first preset time is 1.5 s.
- The method for switching interaction modes of a head-mounted device according to claim 1, wherein the process of determining whether the standard deviation of the accelerometer data meets the second preset condition comprises: determining whether the standard deviations of a preset number of consecutive frames of the accelerometer data within the second preset time are all greater than a third preset threshold; when the standard deviations of the preset number of consecutive frames of the accelerometer data within the second preset time are all greater than the third preset threshold, determining that the second preset condition is met.
- The method for switching interaction modes of a head-mounted device according to claim 5, wherein the second preset time is 1 s to 4 s, and the third preset threshold is 0.03.
- The method for switching interaction modes of a head-mounted device according to claim 6, wherein the second preset time is 1.5 s.
- The method for switching interaction modes of a head-mounted device according to claim 1, wherein the handle tracking interaction mode comprises: the handle being in signal connection with the head-mounted device, with virtual reality interaction performed through the handle; and the bare-hand tracking interaction mode comprises: collecting and recognizing gesture information and tracking information of a bare hand, with virtual reality interaction performed through the gesture information and the tracking information.
- The method for switching interaction modes of a head-mounted device according to claim 8, wherein at least one bare-hand tracking camera and at least two environment tracking cameras are arranged on the head-mounted device; wherein the bare-hand tracking camera is configured to capture the gesture information and the tracking information of the bare hand, and the environment tracking cameras are configured to acquire 6DoF positioning information of the head-mounted device relative to the physical environment in which it is located; the bare-hand tracking camera and/or the environment tracking cameras comprise: a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera.
- The method for switching interaction modes of a head-mounted device according to claim 8, wherein at least two handle tracking controllers are arranged in the handle, the handle tracking controllers being configured to acquire, in real time, 6DoF tracking information of the handle relative to the head-mounted device; wherein the handle tracking controllers comprise an ultrasonic sensor, an electromagnetic sensor, an optical sensor, and a 9-axis inertial navigation sensor.
- The method for switching interaction modes of a head-mounted device according to claim 1, further comprising a default mode setting process; wherein the default mode setting process comprises: setting, through a control system connected to the head-mounted device, the default mode to the handle tracking interaction mode or the bare-hand tracking interaction mode; the control system is further configured to control switching between the handle tracking interaction mode and the bare-hand tracking interaction mode.
- The method for switching interaction modes of a head-mounted device according to claim 11, wherein, when the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, the bare-hand tracking interaction mode is activated through the control system, and the handle tracking interaction mode is correspondingly deactivated; when the standard deviation of the accelerometer data meets the second preset condition, the handle tracking interaction mode is activated through the control system, and the bare-hand tracking interaction mode is correspondingly deactivated.
- The method for switching interaction modes of a head-mounted device according to claim 1, further comprising: when the standard deviation of the accelerometer data does not meet the second preset condition, continuing to execute the bare-hand tracking interaction mode and suspending the handle tracking interaction mode.
- A system for switching interaction modes of a head-mounted device, comprising: a tracking data and IMU data acquisition unit, configured to acquire 6DoF tracking data and IMU data of a handle, the 6DoF tracking data including position data and attitude data of the handle; a standard deviation acquisition unit, configured to separately acquire a standard deviation of the position data, a standard deviation of the attitude data, and a standard deviation of accelerometer data in the IMU data; a first preset condition judgment unit, configured to determine whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data within a current first preset time meet a first preset condition; wherein, when the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, it is determined that the handle tracking interaction mode is not activated, and the bare-hand tracking interaction mode is activated; and a second preset condition judgment unit, configured to acquire, in real time, the standard deviation of the accelerometer data within a second preset time, and to determine whether the standard deviation of the accelerometer data meets a second preset condition; wherein, when the standard deviation of the accelerometer data meets the second preset condition, the bare-hand tracking interaction mode is suspended, and the handle tracking interaction mode is activated.
- The system for switching interaction modes of a head-mounted device according to claim 14, wherein the first preset condition judgment unit is configured to determine whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition in the following manner: determining whether the standard deviation of the position data and the standard deviation of the attitude data are both within a first preset threshold, and, when both are within the first preset threshold, further determining whether the standard deviation of the accelerometer data is within a second preset threshold; when the standard deviation of the position data and the standard deviation of the attitude data are both within the first preset threshold and the standard deviation of the accelerometer data is within the second preset threshold, determining that the first preset condition is met.
- The system for switching interaction modes of a head-mounted device according to claim 14, wherein the second preset condition judgment unit is configured to determine whether the standard deviation of the accelerometer data meets the second preset condition in the following manner: determining whether the standard deviations of a preset number of consecutive frames of the accelerometer data within the second preset time are all greater than a third preset threshold; when the standard deviations of the preset number of consecutive frames of the accelerometer data within the second preset time are all greater than the third preset threshold, determining that the second preset condition is met.
- The system for switching interaction modes of a head-mounted device according to claim 14, wherein the handle tracking interaction mode comprises: the handle being in signal connection with the head-mounted device, with virtual reality interaction performed through the handle; and the bare-hand tracking interaction mode comprises: collecting and recognizing gesture information and tracking information of a bare hand, with virtual reality interaction performed through the gesture information and the tracking information.
- The system for switching interaction modes of a head-mounted device according to claim 17, wherein at least one bare-hand tracking camera and at least two environment tracking cameras are arranged on the head-mounted device; wherein the bare-hand tracking camera is configured to capture the gesture information and the tracking information of the bare hand, and the environment tracking cameras are configured to acquire 6DoF positioning information of the head-mounted device relative to the physical environment in which it is located; the bare-hand tracking camera and/or the environment tracking cameras comprise: a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera.
- The system for switching interaction modes of a head-mounted device according to claim 17, wherein at least two handle tracking controllers are arranged in the handle, the handle tracking controllers being configured to acquire, in real time, 6DoF tracking information of the handle relative to the head-mounted device; wherein the handle tracking controllers comprise an ultrasonic sensor, an electromagnetic sensor, an optical sensor, and a 9-axis inertial navigation sensor.
- A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 13.
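The second preset condition in claims 5 to 7 (accelerometer standard deviation exceeding the third preset threshold for a preset number of consecutive frames) can be sketched as a small stateful detector. The threshold 0.03 comes from the claims; the frame count, the window shape, and the class and method names are assumptions for illustration.

```python
from collections import deque

import numpy as np

THIRD_PRESET_THRESHOLD = 0.03  # value given in the claims
CONSECUTIVE_FRAMES = 5         # assumed preset number of frames

class HandlePickupDetector:
    """Sketch of the second preset condition: detecting handle pickup."""

    def __init__(self):
        # Keep only the standard deviations of the most recent frames.
        self._recent = deque(maxlen=CONSECUTIVE_FRAMES)

    def update(self, accel_frame):
        """Feed one frame of accelerometer samples, shape (n, n_axes).

        Returns True when the last CONSECUTIVE_FRAMES frames all show a
        standard deviation above the threshold, i.e. when bare-hand
        tracking should be suspended and handle tracking activated.
        """
        frame_std = float(np.linalg.norm(np.std(accel_frame, axis=0)))
        self._recent.append(frame_std)
        return (len(self._recent) == CONSECUTIVE_FRAMES
                and all(s > THIRD_PRESET_THRESHOLD for s in self._recent))
```

Requiring several consecutive noisy frames, rather than a single one, matches the claims' intent of ignoring momentary bumps: one jolt to the table does not hand control back to the handle, but sustained motion does.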
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/816,411 US11709543B2 (en) | 2021-02-18 | 2022-07-30 | Switching method and system of interactive modes of head-mounted device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110190095.8 | 2021-02-18 | ||
CN202110190095.8A CN112947754B (zh) | 2021-02-18 | 2021-02-18 | 头戴设备交互模式的切换方法及系统 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/816,411 Continuation US11709543B2 (en) | 2021-02-18 | 2022-07-30 | Switching method and system of interactive modes of head-mounted device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022174575A1 true WO2022174575A1 (zh) | 2022-08-25 |
Family
ID=76244387
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/116300 WO2022174575A1 (zh) | 2021-02-18 | 2021-09-02 | 头戴设备交互模式的切换方法及系统 |
Country Status (3)
Country | Link |
---|---|
US (1) | US11709543B2 (zh) |
CN (1) | CN112947754B (zh) |
WO (1) | WO2022174575A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112947754B (zh) | 2021-02-18 | 2023-03-28 | 青岛小鸟看看科技有限公司 | 头戴设备交互模式的切换方法及系统 |
CN116185201A (zh) * | 2023-03-07 | 2023-05-30 | 北京字跳网络技术有限公司 | 扩展现实交互方法、装置、电子设备和存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090197635A1 (en) * | 2008-02-01 | 2009-08-06 | Kim Joo Min | user interface for a mobile device |
CN101604203A (zh) * | 2008-06-12 | 2009-12-16 | 纬创资通股份有限公司 | 自动切换操作模式的手持式电子产品及控制方法 |
CN102270038A (zh) * | 2010-06-04 | 2011-12-07 | 索尼公司 | 操作终端、电子单元和电子单元系统 |
CN105117016A (zh) * | 2015-09-07 | 2015-12-02 | 众景视界(北京)科技有限公司 | 用于虚拟现实和增强现实交互控制中的交互手柄 |
CN111857337A (zh) * | 2020-07-10 | 2020-10-30 | 成都天翼空间科技有限公司 | 一种vr头控的切换方法 |
CN112947754A (zh) * | 2021-02-18 | 2021-06-11 | 青岛小鸟看看科技有限公司 | 头戴设备交互模式的切换方法及系统 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10078377B2 (en) * | 2016-06-09 | 2018-09-18 | Microsoft Technology Licensing, Llc | Six DOF mixed reality input by fusing inertial handheld controller with hand tracking |
US11580849B2 (en) * | 2018-10-11 | 2023-02-14 | Google Llc | Wearable electronic systems having variable interactions based on device orientation |
US11054923B2 (en) * | 2019-10-11 | 2021-07-06 | Finch Technologies Ltd. | Automatic switching between different modes of tracking user motions to control computer applications |
2021
- 2021-02-18 CN CN202110190095.8A patent/CN112947754B/zh active Active
- 2021-09-02 WO PCT/CN2021/116300 patent/WO2022174575A1/zh active Application Filing
2022
- 2022-07-30 US US17/816,411 patent/US11709543B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090197635A1 (en) * | 2008-02-01 | 2009-08-06 | Kim Joo Min | user interface for a mobile device |
CN101604203A (zh) * | 2008-06-12 | 2009-12-16 | 纬创资通股份有限公司 | 自动切换操作模式的手持式电子产品及控制方法 |
CN102270038A (zh) * | 2010-06-04 | 2011-12-07 | 索尼公司 | 操作终端、电子单元和电子单元系统 |
CN105117016A (zh) * | 2015-09-07 | 2015-12-02 | 众景视界(北京)科技有限公司 | 用于虚拟现实和增强现实交互控制中的交互手柄 |
CN111857337A (zh) * | 2020-07-10 | 2020-10-30 | 成都天翼空间科技有限公司 | 一种vr头控的切换方法 |
CN112947754A (zh) * | 2021-02-18 | 2021-06-11 | 青岛小鸟看看科技有限公司 | 头戴设备交互模式的切换方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
US11709543B2 (en) | 2023-07-25 |
CN112947754B (zh) | 2023-03-28 |
CN112947754A (zh) | 2021-06-11 |
US20220365589A1 (en) | 2022-11-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11461000B2 (en) | User interface adjustment methods and systems | |
KR102206054B1 (ko) | 지문 처리 방법 및 그 전자 장치 | |
US9547391B2 (en) | Method for processing input and electronic device thereof | |
US9690475B2 (en) | Information processing apparatus, information processing method, and program | |
US8368723B1 (en) | User input combination of touch and user position | |
US10346027B2 (en) | Information processing apparatus, information processing method, and program | |
US10564712B2 (en) | Information processing device, information processing method, and program | |
WO2022174575A1 (zh) | 头戴设备交互模式的切换方法及系统 | |
US10191603B2 (en) | Information processing device and information processing method | |
KR102338835B1 (ko) | 증강 및/또는 가상 현실 환경에서의 세션 종료 검출 | |
US20130016129A1 (en) | Region-Specific User Input | |
US20170153804A1 (en) | Display device | |
JP2017510868A (ja) | 把持状態検出 | |
CN105138247B (zh) | 检测到第二设备接近第一设备而在第一设备呈现用户界面 | |
US20220171522A1 (en) | Object position adjustment method and electronic device | |
CN104239029A (zh) | 用于控制摄像头模式的装置和关联的方法 | |
US10362028B2 (en) | Information processing apparatus | |
CN107451439B (zh) | 用于计算设备的多功能按钮 | |
US11848007B2 (en) | Method for operating voice recognition service and electronic device supporting same | |
WO2020088268A1 (zh) | 桌面图标的整理方法及终端 | |
US11886643B2 (en) | Information processing apparatus and information processing method | |
US9400575B1 (en) | Finger detection for element selection | |
US9471154B1 (en) | Determining which hand is holding a device | |
US11144091B2 (en) | Power save mode for wearable device | |
GB2522748A (en) | Detecting pause in audible input to device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21926289 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 21926289 Country of ref document: EP Kind code of ref document: A1 |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08.12.2023) |