WO2022174575A1 - Method and system for switching the interaction mode of a head-mounted device - Google Patents

Method and system for switching the interaction mode of a head-mounted device

Info

Publication number
WO2022174575A1
WO2022174575A1 (application PCT/CN2021/116300)
Authority
WO
WIPO (PCT)
Prior art keywords
standard deviation
tracking
data
handle
interaction mode
Prior art date
Application number
PCT/CN2021/116300
Other languages
English (en)
French (fr)
Inventor
吴涛
Original Assignee
青岛小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛小鸟看看科技有限公司
Priority to US17/816,411 (published as US11709543B2)
Publication of WO2022174575A1

Classifications

    • G - PHYSICS
      • G06 - COMPUTING; CALCULATING OR COUNTING
        • G06F - ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
              • G06F3/012 - Head tracking input arrangements
              • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
              • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
                • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
                  • G06F3/0338 - with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
                  • G06F3/0346 - with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
                  • G06F3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
                    • G06F3/0383 - Signal control means within the pointing device
        • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V40/20 - Movements or behaviour, e.g. gesture recognition
              • G06V40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • A - HUMAN NECESSITIES
      • A63 - SPORTS; GAMES; AMUSEMENTS
        • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
          • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
            • A63F13/20 - Input arrangements for video game devices
              • A63F13/21 - characterised by their sensors, purposes or types
                • A63F13/211 - using inertial sensors, e.g. accelerometers or gyroscopes
                • A63F13/213 - comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
            • A63F13/25 - Output arrangements for video game devices
              • A63F13/26 - having at least one additional display device, e.g. on the game controller or outside a game booth
    • H - ELECTRICITY
      • H04 - ELECTRIC COMMUNICATION TECHNIQUE
        • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N23/00 - Cameras or camera modules comprising electronic image sensors; control thereof
            • H04N23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
      • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
        • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
          • Y02D30/00 - Reducing energy consumption in communication networks
            • Y02D30/70 - Reducing energy consumption in wireless communication networks

Definitions

  • the present disclosure relates to the technical field of head-mounted integrated devices, and more particularly, to a method and system for switching an interaction mode of a head-mounted device.
  • Mainstream VR (Virtual Reality), AR (Augmented Reality), and MR (Mixed Reality) headsets can support both handle-tracking-controller interaction and gesture-recognition interaction, but currently the user must explicitly switch between the two interaction methods.
  • Manual switching is generally configured in the user UI (User Interface): switching options for the two interaction modes, handle control tracker and gesture recognition, are provided, and the user switches between them by hand.
  • The processing of the handle tracking controller and the computation for gesture recognition interaction are generally mutually exclusive (XOR): when the system is performing handle tracking interaction, the processing module for gesture recognition interaction is suspended by the system, and when the system is running the gesture recognition processing module, the handle tracking processing module is suspended.
  • The traditional processing mechanism is that the system accepts a control command through the application layer to decide which module should currently run. In the VR/AR/MR field, however, multi-user usage data show that the handle control tracker is used far more often than gesture recognition; forcing users to switch between the two interaction methods through tedious manual settings severely limits the use of gesture recognition and degrades the user experience.
  • the embodiments of the present disclosure provide a method and system for switching the interaction mode of a head mounted device, which can solve the problems of cumbersome operation, poor flexibility, and impact on user experience in the mode interaction process of the current head mounted device.
  • An embodiment of the present disclosure provides a method for switching an interaction mode of a head-mounted device, wherein the interaction mode of the head-mounted device includes a switchable handle tracking interaction mode and a bare-hand tracking interaction mode. The switching method includes: obtaining six-degree-of-freedom (6DoF) tracking data and Inertial Measurement Unit (IMU) data of the handle, where the 6DoF tracking data includes position data and attitude data of the handle; and respectively obtaining the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data.
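  The per-frame sample described above can be modeled as a small record type; the field names below are illustrative assumptions, not identifiers from the patent:

  ```python
  from dataclasses import dataclass

  @dataclass
  class HandleSample:
      """One frame of handle data: the 6DoF pose (position + attitude)
      plus the IMU accelerometer reading. Field names are illustrative."""
      position: tuple   # position data, e.g. (x, y, z)
      attitude: tuple   # attitude data, e.g. a quaternion (w, x, y, z)
      accel: tuple      # accelerometer reading (ax, ay, az)
  ```

  The standard deviations used by the switching method would then be computed over windows of such samples.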
  • It is determined whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data within the current first preset time meet a first preset condition; when they do, it is determined that the handle tracking interaction mode is not in use, and the bare-hand tracking interaction mode is activated. The standard deviation of the accelerometer data within a second preset time is then obtained in real time, and it is determined whether it meets a second preset condition; when it does, the bare-hand tracking interaction mode is suspended and the handle tracking interaction mode is activated.
  • The process of judging whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition includes: judging whether the standard deviation of the position data and the standard deviation of the attitude data are both within a first preset threshold; when they are, continuing to judge whether the standard deviation of the accelerometer data is within a second preset threshold. When the standard deviations of the position data and the attitude data are both within the first preset threshold and the standard deviation of the accelerometer data is within the second preset threshold, the first preset condition is met.
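  As a minimal sketch of this two-stage check (function and parameter names are our own; the default thresholds are the 0.6 and 0.2 values given below):

  ```python
  def meets_first_condition(pos_std, att_std, acc_std,
                            pose_threshold=0.6, acc_threshold=0.2):
      """First preset condition: position and attitude standard deviations
      must both be within the first preset threshold; only then is the
      accelerometer standard deviation checked against the second one."""
      if pos_std <= pose_threshold and att_std <= pose_threshold:
          return acc_std <= acc_threshold
      return False
  ```

  A small accelerometer deviation alone is not enough: the short-circuit order mirrors the text, where the accelerometer check only happens after the pose check passes.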
  • the first preset time is 1 s to 4 s
  • the second preset time is 1 s to 4 s
  • the first preset threshold is 0.6
  • the second preset threshold is 0.2.
  • the first preset time is 1.5s.
  • The process of judging whether the standard deviation of the accelerometer data meets the second preset condition includes: judging whether the standard deviations of a consecutive preset number of frames of the accelerometer data within the second preset time are all greater than a third preset threshold; when they are, the second preset condition is met.
  • the second preset time is 1 s to 4 s; the third preset threshold is 0.03.
  • the second preset time is 1.5s.
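  A minimal sketch of the consecutive-frame test, assuming per-frame accelerometer standard deviations arrive as a sequence (the defaults of 5 frames and 0.03 are the preferred values stated in the document):

  ```python
  def meets_second_condition(frame_stds, num_frames=5, threshold=0.03):
      """Second preset condition: true once `num_frames` consecutive
      per-frame accelerometer standard deviations all exceed `threshold`."""
      streak = 0
      for std in frame_stds:
          # Extend the streak on a large deviation, reset it otherwise.
          streak = streak + 1 if std > threshold else 0
          if streak >= num_frames:
              return True
      return False
  ```

  Requiring a consecutive run rather than any 5 frames avoids triggering a switch on an isolated bump of the idle handle.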
  • the handle tracking interaction mode includes: a signal connection between the handle and the head-mounted device, and performing virtual reality interaction through the handle;
  • The bare-hand tracking interaction mode includes: collecting and recognizing the gesture information and tracking information of the bare hand, and performing virtual reality interaction through that gesture information and tracking information.
  • At least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device, wherein the bare-hand tracking camera is configured to capture gesture information and tracking information of the bare hands, and the environment tracking camera is configured to obtain 6DoF positioning information of the head-mounted device relative to the physical environment. The bare-hand tracking camera and/or the environment tracking camera may be a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera.
  • At least two handle tracking controllers are provided in the handle and are configured to obtain 6DoF tracking information of the handle relative to the headset in real time; the handle tracking controllers include ultrasonic sensors, electromagnetic sensors, optical sensors, and 9-axis inertial navigation sensors.
  • The switching method further includes a default mode setting process: the default mode is set to the handle tracking interaction mode or the bare-hand tracking interaction mode through a control system connected to the head-mounted device; the control system is also configured to control switching between the two modes.
  • When the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, the control system activates the bare-hand tracking interaction mode and correspondingly closes the handle tracking interaction mode; when the standard deviation of the accelerometer data meets the second preset condition, the control system activates the handle tracking interaction mode and correspondingly closes the bare-hand tracking interaction mode.
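  The control system's switching behaviour can be sketched as a two-state machine; the class, constants, and method names below are illustrative assumptions, not the patent's API:

  ```python
  class InteractionModeController:
      """Two-state sketch of the switching logic: the first condition
      moves handle -> bare hand, the second moves bare hand -> handle."""
      HANDLE = "handle_tracking"
      BARE_HAND = "bare_hand_tracking"

      def __init__(self, default_mode=HANDLE):
          self.mode = default_mode

      def update(self, first_condition_met, second_condition_met):
          # First condition: the handle is effectively static, so the
          # user is presumed to have put it down; switch to gestures.
          if self.mode == self.HANDLE and first_condition_met:
              self.mode = self.BARE_HAND
          # Second condition: the handle is moving again; switch back.
          elif self.mode == self.BARE_HAND and second_condition_met:
              self.mode = self.HANDLE
          return self.mode
  ```

  Keeping the transitions mutually exclusive matches the XOR processing described earlier: exactly one tracking module is active at a time.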
  • the switching method further includes: when the standard deviation of the accelerometer data does not meet the second preset condition, continuing to execute the bare-hand tracking interaction mode, and suspending the handle tracking interaction mode.
  • An embodiment of the present disclosure provides a head-mounted device interaction mode switching system, including: a tracking data and IMU data acquisition unit, configured to acquire 6DoF tracking data and IMU data of a handle, where the 6DoF tracking data includes position data and attitude data of the handle;
  • the standard deviation obtaining unit is set to obtain the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data respectively
  • The first preset condition judgment unit is configured to judge whether, within the current first preset time, the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition; when they do, it is determined that the handle tracking interaction mode is not in use, and the bare-hand tracking interaction mode is activated. The second preset condition judgment unit is configured to obtain the standard deviation of the accelerometer data within the second preset time in real time and to judge whether it meets the second preset condition.
  • The first preset condition judging unit judges whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition by: determining whether the standard deviation of the position data and the standard deviation of the attitude data are both within the first preset threshold; when they are, determining whether the standard deviation of the accelerometer data is within the second preset threshold. When both the position and attitude standard deviations are within the first preset threshold and the accelerometer standard deviation is within the second preset threshold, the first preset condition is met.
  • The second preset condition judgment unit judges whether the standard deviation of the accelerometer data meets the second preset condition by: determining whether the standard deviations of a consecutive preset number of frames of the accelerometer data within the second preset time are all greater than the third preset threshold; when they are, the second preset condition is met.
  • The handle tracking interaction mode includes: the handle is signal-connected to the headset, and virtual reality interaction is performed through the handle. The bare-hand tracking interaction mode includes: collecting and identifying gesture information and tracking information of the bare hand, and performing virtual reality interaction through that gesture information and tracking information.
  • At least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device, wherein the bare-hand tracking cameras are used to capture gesture information and tracking information of the bare hands, and the environment tracking camera is used to obtain the 6DoF positioning information of the head-mounted device relative to the physical environment in which it is located. The bare-hand tracking camera and/or the environment tracking camera may be a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera.
  • At least two handle tracking controllers are provided in the handle, and the handle tracking controllers are used to obtain 6DoF tracking information of the handle relative to the head mounted device in real time; wherein, the The handle tracking controller includes ultrasonic sensors, electromagnetic sensors, optical sensors and 9-axis inertial navigation sensors.
  • An embodiment of the present disclosure provides a non-transitory computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, it implements the method described in any of the foregoing embodiments or exemplary embodiments.
  • By judging whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, it is determined whether to enable the bare-hand tracking interaction mode and disable the handle tracking interaction mode; at the same time, it is determined in real time whether the standard deviation of the accelerometer data meets the second preset condition, and hence whether to disable the bare-hand tracking interaction mode and re-enable the handle tracking interaction mode. The interaction mode can thus be switched stably, naturally, and smoothly based on the user's current actions or behavioral habits, providing a good user experience and a wide range of applications.
  • FIG. 1 is a flowchart of a method for switching an interaction mode of a head mounted device according to an embodiment of the present disclosure
  • FIG. 2 is a schematic block diagram of a system for switching an interactive mode of a head mounted device according to an embodiment of the present disclosure
  • FIG. 3 is a logic diagram of an electronic device according to an embodiment of the present disclosure.
  • FIG. 1 shows a flow of a method for switching an interaction mode of a head mounted device according to an embodiment of the present disclosure.
  • A method for switching the interaction mode of a head-mounted device according to an embodiment of the present disclosure is described, wherein the interaction mode of the head-mounted device includes a switchable handle tracking interaction mode and a bare-hand tracking interaction mode; the steps of the switching method include:
  • S120 respectively obtain the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data;
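  One plausible way to obtain these standard deviations in step S120 is a fixed-length sliding window over incoming scalar samples, with the window length corresponding to the first preset time; the helper below is a sketch under that assumption, with one instance per tracked quantity:

  ```python
  import statistics
  from collections import deque

  class SlidingStd:
      """Keeps the most recent `window` scalar samples and reports
      their population standard deviation."""
      def __init__(self, window):
          self.buf = deque(maxlen=window)  # old samples fall off the front

      def push(self, value):
          self.buf.append(value)

      def std(self):
          # Not enough data yet: report zero deviation.
          return statistics.pstdev(self.buf) if len(self.buf) > 1 else 0.0
  ```

  At a 90 Hz tracking rate, for example, a 1.5 s first preset time would correspond to a window of roughly 135 samples.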
  • S130 Determine whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data within the current first preset time meet the first preset condition;
  • the process of judging whether the standard deviation of the position data, the standard deviation of the attitude data and the standard deviation of the accelerometer data within the first preset time meets the first preset condition includes:
  • The first preset time may be set to 1 s to 4 s, preferably 1.5 s; the second preset time is 1 s to 4 s; the first preset threshold is 0.6, and the second preset threshold is 0.2.
  • The process of judging whether the standard deviation of the accelerometer data meets the second preset condition includes: judging whether the standard deviations of a consecutive preset number of frames of the accelerometer data within the second preset time are all greater than the third preset threshold; when they are, the second preset condition is met.
  • the second preset time can be set to 1 s to 4 s, preferably 1.5 s
  • the third preset threshold can be set to 0.03
  • The consecutive preset number of frames may be chosen as, for example, 3 to 10 or 5 to 10 consecutive frames, preferably 5. In other words, when the standard deviation of the accelerometer data is greater than 0.03 for 5 consecutive frames within 1.5 s, it can be considered that the user has started using the handle again; the bare-hand tracking interaction mode can then be suspended and the handle tracking interaction mode activated.
  • The first preset time, the second preset time, the first preset threshold, the second preset threshold, the third preset threshold, and the preset number of frames can all be set and adjusted according to the specific application product, application environment, or customer needs, and are not limited to the above values.
  • The handle tracking interaction mode includes: the handle is signal-connected to the headset, and the user operates the handle to perform virtual reality interaction. The bare-hand tracking interaction mode includes: the cameras provided on the headset collect and recognize the gesture information and tracking information of the bare hand, and virtual reality interaction is performed through that gesture information and tracking information.
  • At least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device, wherein the bare-hand tracking camera is configured to capture gesture information and tracking information of the bare hand, and the environment tracking camera is configured to obtain 6DoF positioning information of the head-mounted device relative to its physical environment.
  • At least two handle tracking controllers are arranged in the handle and are set to obtain the 6DoF tracking information of the handle relative to the headset in real time; the handle tracking controllers include ultrasonic sensors, electromagnetic sensors, optical sensors, 9-axis inertial navigation sensors, and other types of sensors.
  • The method for switching the interaction mode of the head-mounted device of the present disclosure further includes a default mode setting process: the default mode is set to the handle tracking interaction mode or the bare-hand tracking interaction mode through the control system connected to the head-mounted device; the control system is also configured to control switching between the two modes.
  • When the standard deviations of the above parameters satisfy the first preset condition, the bare-hand tracking interaction mode is activated through the control system and the handle tracking interaction mode is correspondingly disabled; when the standard deviation of the accelerometer data satisfies the second preset condition, the handle tracking interaction mode is activated through the control system and the bare-hand tracking interaction mode is correspondingly disabled.
  • the present disclosure proposes an automatic switching method according to the user's use characteristics of the handle control tracker and gesture recognition.
  • When the user wants to interact through the handle, the system can automatically trigger the handle controller's tracking module in real time (handle tracking interaction mode); when the user wants to interact with virtual reality through gesture recognition, the system can automatically trigger the gesture recognition tracking module in real time (bare-hand tracking interaction mode). In this way, interaction is stable and natural, and the modes can be switched in real time.
  • the present disclosure also provides a switching system of the interaction mode of the head-mounted device.
  • FIG. 2 shows the schematic logic of the switching system of the interaction mode of the head mounted device according to the embodiment of the present disclosure.
  • the switching system 200 of the interaction mode of the head mounted device includes:
  • The tracking data and IMU data acquisition unit 210 is configured to acquire 6DoF tracking data and IMU data of the handle, where the 6DoF tracking data includes position data and attitude data of the handle.
  • the standard deviation obtaining unit 220 is configured to obtain the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data, respectively.
  • The first preset condition judgment unit 230 is configured to judge whether, within the current first preset time, the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition; when they do, it is determined that the handle tracking interaction mode is not in use, and the bare-hand tracking interaction mode is activated.
  • The second preset condition determination unit 240 is configured to acquire the standard deviation of the accelerometer data within the second preset time in real time and to determine whether it meets the second preset condition; when it does, the bare-hand tracking interaction mode is suspended and the handle tracking interaction mode is activated.
  • FIG. 3 shows a schematic structure of an electronic device according to an embodiment of the present disclosure.
  • the electronic device 1 of the present disclosure may be a terminal device with computing functions, such as a VR/AR/MR head-mounted device, a server, a smart phone, a tablet computer, a portable computer, a desktop computer, and the like.
  • the electronic device 1 includes: a processor 12 , a memory 11 , a network interface 14 and a communication bus 15 .
  • the memory 11 includes at least one type of readable storage medium.
  • The at least one type of readable storage medium may be a non-volatile storage medium such as a flash memory, a hard disk, a multimedia card, a card-type memory, or the like.
  • the readable storage medium may be an internal storage unit of the electronic device 1 , such as a hard disk of the electronic device 1 .
  • The readable storage medium may also be an external memory of the electronic device 1, for example, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash memory card (Flash Card) equipped on the electronic device 1.
  • the readable storage medium of the memory 11 is generally configured to store the interactive mode switching program 10 of the head mounted device installed in the electronic device 1 and the like.
  • the memory 11 may also be arranged to temporarily store data that has been output or is to be output.
  • The processor 12 may, in some embodiments, be a central processing unit (CPU), microprocessor, or other data processing chip, configured to run program code or process data stored in the memory 11, for example, to execute the head-mounted device interaction mode switching program 10.
  • the network interface 14 may optionally include a standard wired interface, a wireless interface (such as a WI-FI interface), and is usually configured to establish a communication connection between the electronic device 1 and other electronic devices.
  • a communication bus 15 is provided to enable connection communication between these components.
  • FIG. 3 only shows the electronic device 1 with components 11-15, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead.
  • the electronic device 1 may further include a user interface.
  • the user interface may include an input unit such as a keyboard, a voice input device such as a microphone or other equipment with a voice recognition function, and a voice output device such as a loudspeaker or a headset.
  • the user interface may also include a standard wired interface and a wireless interface.
  • the electronic device 1 may further include a display, which may also be referred to as a display screen or a display unit.
  • it can be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an organic light-emitting diode (OLED) touch display, or the like.
  • the display is arranged to display information processed in the electronic device 1 and arranged to display a visual user interface.
  • the electronic device 1 further includes a touch sensor.
  • the area provided by the touch sensor for the user to perform a touch operation is called a touch area.
  • the touch sensor described herein may be a resistive touch sensor, a capacitive touch sensor, or the like.
  • the touch sensor includes not only a contact-type touch sensor, but also a proximity-type touch sensor and the like.
  • the touch sensor may be a single sensor, or may be a plurality of sensors arranged in an array, for example.
  • the memory 11, as a computer storage medium, may include an operating system and the interaction mode switching program 10 of the head-mounted device; when the processor 12 executes the switching program 10 stored in the memory 11, the steps of the switching method of the interaction mode of the head-mounted device are implemented.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a method and system for switching the interaction mode of a head-mounted device, where the interaction modes of the head-mounted device include a handle tracking interaction mode and a bare-hand tracking interaction mode that can be switched between each other. The switching method includes: separately acquiring the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data; determining whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet a first preset condition; meanwhile, acquiring in real time the standard deviation of the accelerometer data within a second preset time, and determining whether it meets a second preset condition; when the standard deviation of the accelerometer data meets the second preset condition, suspending the bare-hand tracking interaction mode and activating the handle tracking interaction mode. With this solution, the interaction mode of the head-mounted device can be switched flexibly according to the standard deviation data collected for the current user.

Description

Switching Method and System for the Interaction Mode of a Head-Mounted Device

Technical Field

The present disclosure relates to the technical field of all-in-one head-mounted devices, and more specifically, to a method and system for switching the interaction mode of a head-mounted device.

Background

At present, mainstream VR (Virtual Reality)/AR (Augmented Reality)/MR (Mixed Reality) all-in-one head-mounted devices can support both handle tracking controllers and gesture recognition interaction. However, switching between the two interaction methods currently requires manual configuration: a switching option for the two interaction methods is usually provided in the user UI (User Interface), and the user switches between them by hand.

Since handle tracking and gesture recognition both consume a relatively large share of the computing resources on a VR/AR/MR all-in-one device, the two are generally processed exclusively: when the system is performing handle tracking interaction, the gesture recognition module is suspended by the system; when the system is running the gesture recognition module, the handle tracking module is suspended.

In the traditional mechanism, the system receives a control instruction through the application layer indicating which module should currently run. In the VR/AR/MR field, however, multi-user usage data show that the handle tracking controller is usually used more frequently than gesture recognition; the cumbersome manual switching between the two interaction methods severely limits the use of gesture recognition and degrades the user experience.

Summary

In view of the above problems, embodiments of the present disclosure provide a method and system for switching the interaction mode of a head-mounted device, which can solve the problems of cumbersome operation, poor flexibility, and degraded user experience in the mode-switching process of current head-mounted devices.

In the switching method provided by embodiments of the present disclosure, the interaction modes of the head-mounted device include a handle tracking interaction mode and a bare-hand tracking interaction mode that can be switched between each other. The switching method includes: acquiring six-degree-of-freedom (6DoF) tracking data and inertial measurement unit (IMU) data of the handle, the 6DoF tracking data including position data and attitude data of the handle; separately acquiring the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data; determining whether, within a current first preset time, the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet a first preset condition; when they do, determining that the handle tracking interaction mode is not activated, and activating the bare-hand tracking interaction mode; meanwhile, acquiring in real time the standard deviation of the accelerometer data within a second preset time, and determining whether it meets a second preset condition; when it does, suspending the bare-hand tracking interaction mode and activating the handle tracking interaction mode.
In at least one exemplary embodiment, determining whether the standard deviations of the position data, the attitude data, and the accelerometer data meet the first preset condition includes: determining whether the standard deviation of the position data and the standard deviation of the attitude data are both within a first preset threshold, and when they are, further determining whether the standard deviation of the accelerometer data is within a second preset threshold; when the standard deviations of the position data and the attitude data are both within the first preset threshold and the standard deviation of the accelerometer data is within the second preset threshold, determining that the first preset condition is met.

In at least one exemplary embodiment, the first preset time is 1 s to 4 s and the second preset time is 1 s to 4 s; the first preset threshold is 0.6 and the second preset threshold is 0.2.

In at least one exemplary embodiment, the first preset time is 1.5 s.

In at least one exemplary embodiment, determining whether the standard deviation of the accelerometer data meets the second preset condition includes: determining whether the standard deviations of the accelerometer data for a preset number of consecutive frames within the second preset time are all greater than a third preset threshold; when they are, determining that the second preset condition is met.

In at least one exemplary embodiment, the second preset time is 1 s to 4 s; the third preset threshold is 0.03.

In at least one exemplary embodiment, the second preset time is 1.5 s.

In at least one exemplary embodiment, the handle tracking interaction mode includes: the handle being in signal connection with the head-mounted device, with virtual reality interaction performed through the handle; the bare-hand tracking interaction mode includes: collecting and recognizing gesture information and tracking information of the bare hand, with virtual reality interaction performed through the gesture information and the tracking information.

In at least one exemplary embodiment, at least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device; the bare-hand tracking camera is configured to capture the gesture information and tracking information of the bare hand, and the environment tracking cameras are configured to acquire the 6DoF positioning information of the head-mounted device relative to the physical environment; the bare-hand tracking camera and/or the environment tracking cameras include: a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera.

In at least one exemplary embodiment, at least two handle tracking controllers are provided inside the handle and are configured to acquire, in real time, the 6DoF tracking information of the handle relative to the head-mounted device; the handle tracking controllers include ultrasonic sensors, electromagnetic sensors, optical sensors, and 9-axis inertial navigation sensors.

In at least one exemplary embodiment, the switching method further includes a default mode setting process: setting the default mode to the handle tracking interaction mode or the bare-hand tracking interaction mode through a control system connected to the head-mounted device; the control system is also configured to control the switching between the handle tracking interaction mode and the bare-hand tracking interaction mode.

In at least one exemplary embodiment, when the standard deviations of the position data, the attitude data, and the accelerometer data meet the first preset condition, the control system activates the bare-hand tracking interaction mode and correspondingly deactivates the handle tracking interaction mode; when the standard deviation of the accelerometer data meets the second preset condition, the control system activates the handle tracking interaction mode and correspondingly deactivates the bare-hand tracking interaction mode.

In at least one exemplary embodiment, the switching method further includes: when the standard deviation of the accelerometer data does not meet the second preset condition, continuing the bare-hand tracking interaction mode and keeping the handle tracking interaction mode suspended.
Embodiments of the present disclosure also provide a switching system for the interaction mode of a head-mounted device, including: a tracking data and IMU data acquisition unit, configured to acquire 6DoF tracking data and IMU data of the handle, the 6DoF tracking data including position data and attitude data of the handle; a standard deviation acquisition unit, configured to separately acquire the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data; a first preset condition determination unit, configured to determine whether, within a current first preset time, the standard deviations of the position data, the attitude data, and the accelerometer data meet a first preset condition, and, when they do, to determine that the handle tracking interaction mode is not activated and to activate the bare-hand tracking interaction mode; and a second preset condition determination unit, configured to acquire, in real time, the standard deviation of the accelerometer data within a second preset time and determine whether it meets a second preset condition, and, when it does, to suspend the bare-hand tracking interaction mode and activate the handle tracking interaction mode.

In at least one exemplary embodiment, the first preset condition determination unit is configured to determine whether the standard deviations of the position data, the attitude data, and the accelerometer data meet the first preset condition in the following manner: determining whether the standard deviation of the position data and the standard deviation of the attitude data are both within a first preset threshold, and when they are, further determining whether the standard deviation of the accelerometer data is within a second preset threshold; when the standard deviations of the position data and the attitude data are both within the first preset threshold and the standard deviation of the accelerometer data is within the second preset threshold, determining that the first preset condition is met.

In at least one exemplary embodiment, the second preset condition determination unit is configured to determine whether the standard deviation of the accelerometer data meets the second preset condition in the following manner: determining whether the standard deviations of the accelerometer data for a preset number of consecutive frames within the second preset time are all greater than a third preset threshold; when they are, determining that the second preset condition is met.

In at least one exemplary embodiment, the handle tracking interaction mode includes: the handle being in signal connection with the head-mounted device, with virtual reality interaction performed through the handle; the bare-hand tracking interaction mode includes: collecting and recognizing gesture information and tracking information of the bare hand, with virtual reality interaction performed through the gesture information and the tracking information.

In at least one exemplary embodiment, at least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device; the bare-hand tracking camera is used to capture the gesture information and tracking information of the bare hand, and the environment tracking cameras are used to acquire the 6DoF positioning information of the head-mounted device relative to the physical environment; the bare-hand tracking camera and/or the environment tracking cameras include: a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera.

In at least one exemplary embodiment, at least two handle tracking controllers are provided inside the handle and are used to acquire, in real time, the 6DoF tracking information of the handle relative to the head-mounted device; the handle tracking controllers include ultrasonic sensors, electromagnetic sensors, optical sensors, and 9-axis inertial navigation sensors.

Embodiments of the present disclosure also provide a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the method according to any of the foregoing embodiments or exemplary embodiments.

With the above switching method and system, the activation or deactivation of the handle tracking interaction mode and the bare-hand tracking interaction mode is determined by checking whether the standard deviations of the position data, the attitude data, and the accelerometer data meet the first preset condition; meanwhile, the activation or deactivation of the two modes is determined by checking in real time whether the standard deviation of the accelerometer data meets the second preset condition. The interaction modes can thus be switched stably, naturally, and smoothly according to the user's current actions or behavior habits, providing a good user experience and a wide range of applications.

To achieve the above and related objects, one or more aspects of the present disclosure include the features that will be described in detail below. The following description and the accompanying drawings set forth certain exemplary aspects of the present disclosure in detail. However, these aspects are indicative of only some of the various ways in which the principles of the present disclosure may be employed. Furthermore, the present disclosure is intended to include all such aspects and their equivalents.
Brief Description of the Drawings

Other objects and results of the present disclosure will become more apparent and easier to understand by reference to the following description taken in conjunction with the accompanying drawings. In the drawings:

FIG. 1 is a flowchart of the method for switching the interaction mode of a head-mounted device according to an embodiment of the present disclosure;

FIG. 2 is a schematic block diagram of the system for switching the interaction mode of a head-mounted device according to an embodiment of the present disclosure;

FIG. 3 is a logic diagram of the electronic device according to an embodiment of the present disclosure.

The same reference numerals indicate similar or corresponding features or functions throughout the drawings.
Detailed Description of the Embodiments

In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more embodiments. It is evident, however, that the embodiments may also be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form to facilitate describing one or more embodiments.

To describe in detail the switching method and system for the interaction mode of a head-mounted device according to embodiments of the present disclosure, specific embodiments will be described below with reference to the accompanying drawings.

FIG. 1 shows the flow of the method for switching the interaction mode of a head-mounted device according to an embodiment of the present disclosure.

As shown in FIG. 1, in the switching method of the embodiment of the present disclosure, the interaction modes of the head-mounted device include a handle tracking interaction mode and a bare-hand tracking interaction mode that can be switched between each other; the steps of the switching method include:

S110: acquiring 6DoF tracking data and IMU data of the handle, the 6DoF tracking data including position data and attitude data of the handle;

S120: separately acquiring the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data;

S130: determining whether, within a current first preset time, the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet a first preset condition;

S140: when the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, determining that the handle tracking interaction mode is not activated, and activating the bare-hand tracking interaction mode.
Specifically, determining whether the standard deviations of the position data, the attitude data, and the accelerometer data within the first preset time meet the first preset condition includes:

first, determining whether the standard deviation of the position data and the standard deviation of the attitude data are both within a first preset threshold; when they are, further determining whether the standard deviation of the accelerometer data is within a second preset threshold. When the standard deviations of the position data and the attitude data are both within the first preset threshold and the standard deviation of the accelerometer data is within the second preset threshold, the first preset condition is met.

Here, the first preset time may be set to 1 s to 4 s, preferably 1.5 s, and the second preset time to 1 s to 4 s; the first preset threshold is 0.6 and the second preset threshold is 0.2.
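As a sketch, the two-stage check of the first preset condition can be expressed as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the function name, the plain-list sample windows, and the use of Python's `statistics.pstdev` are assumptions; only the thresholds (0.6 and 0.2) and the nested order of the checks come from the text.

```python
import statistics

# Thresholds taken from the embodiment above; the window of samples is
# assumed to cover the first preset time (~1.5 s).
POS_ATT_THRESHOLD = 0.6   # first preset threshold (position/attitude std)
ACCEL_THRESHOLD = 0.2     # second preset threshold (accelerometer std)

def meets_first_condition(position, attitude, accel):
    """Return True when the handle looks idle: position and attitude
    standard deviations both within the first threshold, and only then is
    the accelerometer standard deviation checked against the second."""
    pos_std = statistics.pstdev(position)
    att_std = statistics.pstdev(attitude)
    if pos_std > POS_ATT_THRESHOLD or att_std > POS_ATT_THRESHOLD:
        return False  # handle is moving: keep handle tracking active
    return statistics.pstdev(accel) <= ACCEL_THRESHOLD

# A handle lying still: tiny jitter in every stream.
still = [0.01, 0.02, 0.01, 0.02, 0.01]
assert meets_first_condition(still, still, still)

# A handle being waved around: large accelerometer variance.
moving = [0.0, 1.5, -1.2, 2.0, -1.8]
assert not meets_first_condition(still, still, moving)
```

Note the short-circuit order mirrors the text: the cheaper 6DoF checks gate the accelerometer check.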
It should be noted that when the standard deviations of the position data, the attitude data, and the accelerometer data do not meet the first preset condition, the handle tracking interaction mode is assumed to be already active, and the bare-hand tracking interaction mode need not be activated.

S150: during the above determination of the first preset condition, acquiring in real time the standard deviation of the accelerometer data within a second preset time, and determining whether it meets a second preset condition.

S160: when the standard deviation of the accelerometer data meets the second preset condition, suspending the bare-hand tracking interaction mode and activating the handle tracking interaction mode; otherwise, continuing the bare-hand tracking interaction mode and keeping the handle tracking interaction mode suspended.
Specifically, determining whether the standard deviation of the accelerometer data meets the second preset condition includes: determining whether the standard deviations of the accelerometer data for a preset number of consecutive frames within the second preset time are all greater than a third preset threshold; when they are, the second preset condition is met.

Here, the second preset time may be set to 1 s to 4 s, preferably 1.5 s; the third preset threshold may be set to 0.03; and the preset number of consecutive frames may be 3 to 10 or 5 to 10 frames, preferably 5 frames. In other words, when the standard deviation of the accelerometer data exceeds 0.03 for 5 consecutive frames within 1.5 s, it can be concluded that the user has started using the handle again, so the bare-hand tracking interaction mode can be suspended and the handle tracking interaction mode activated.
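The consecutive-frame test for the second preset condition can be sketched as below. The function name, the per-frame window representation, and the use of `statistics.pstdev` are illustrative assumptions; the threshold (0.03) and the 5-consecutive-frame count come from the preferred values in the text.

```python
import statistics

ACCEL_ACTIVE_THRESHOLD = 0.03  # third preset threshold
CONSECUTIVE_FRAMES = 5         # preferred consecutive frame count

def should_resume_handle_mode(per_frame_windows):
    """per_frame_windows: an iterable of accelerometer sample windows, one
    per frame, each assumed to cover the second preset time (~1.5 s).
    Returns True once CONSECUTIVE_FRAMES consecutive windows all have a
    standard deviation above the third preset threshold."""
    run = 0
    for window in per_frame_windows:
        if statistics.pstdev(window) > ACCEL_ACTIVE_THRESHOLD:
            run += 1
            if run >= CONSECUTIVE_FRAMES:
                return True
        else:
            run = 0  # a quiet frame breaks the run
    return False

quiet = [0.0, 0.001, -0.001, 0.0]  # std well below 0.03: handle untouched
shaky = [0.0, 0.1, -0.1, 0.2]      # std well above 0.03: handle picked up
assert not should_resume_handle_mode([quiet] * 10)
assert should_resume_handle_mode([quiet, shaky, shaky, shaky, shaky, shaky])
```

Requiring several consecutive frames, rather than a single one, filters out brief accelerometer spikes (e.g. the handle being bumped).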
It should be noted that the first preset time, the second preset time, the first to third preset thresholds, and the preset number of frames can all be set and adjusted according to the specific product, application environment, or customer requirements, and are not limited to the above values.
In one specific implementation of the present disclosure, the handle tracking interaction mode includes: the handle being in signal connection with the head-mounted device, with virtual reality interaction performed by operating the handle; the bare-hand tracking interaction mode includes: collecting and recognizing gesture information and tracking information of the bare hand through the cameras provided on the head-mounted device, with virtual reality interaction performed through the gesture information and the tracking information.

Specifically, at least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device; the bare-hand tracking camera is configured to capture the gesture information and tracking information of the bare hand, and the environment tracking cameras are configured to acquire the 6DoF positioning information of the head-mounted device relative to the physical environment. The bare-hand tracking camera and/or the environment tracking cameras may be of various types, such as a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera, and can be chosen flexibly in specific applications.

In addition, at least two handle tracking controllers are provided inside the handle and are configured to acquire, in real time, the 6DoF tracking information of the handle relative to the head-mounted device; the handle tracking controllers include various types of sensors, such as ultrasonic sensors, electromagnetic sensors, optical sensors, and 9-axis inertial navigation sensors.

It should be noted that after the head-mounted device starts up, it enters a default interaction mode, which can be set through a control system connected to the head-mounted device. Accordingly, the switching method of the present disclosure further includes a default mode setting process: setting the default mode to the handle tracking interaction mode or the bare-hand tracking interaction mode through the control system connected to the head-mounted device. The control system is also configured to control the switching between the two modes: when the standard deviations of the above parameters meet the first preset condition, the control system activates the bare-hand tracking interaction mode and correspondingly deactivates the handle tracking interaction mode; when the standard deviation meets the second preset condition, the control system activates the handle tracking interaction mode and correspondingly deactivates the bare-hand tracking interaction mode.

Based on how users actually use the handle tracking controller and gesture recognition, the present disclosure proposes an automatic switching scheme: when the user wants to interact through the handle tracking controller (handle tracking interaction), the system can automatically trigger the handle tracking module in real time; when the user wants to interact through gesture recognition, the system can automatically trigger the gesture recognition tracking module (bare-hand tracking interaction) in real time. Throughout the process, no manual intervention is required, and the two interaction modes can be switched in real time in a stable, natural, and smooth manner.
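The automatic switching described above amounts to a small two-state machine, which might be sketched as follows. This is an illustrative sketch only: the class, the `Mode` enum, and passing the two condition results as booleans are assumptions; the transition rules (idle handle activates bare-hand tracking, renewed handle motion suspends it) come from the text.

```python
from enum import Enum

class Mode(Enum):
    HANDLE = "handle tracking"
    BARE_HAND = "bare-hand tracking"

class ModeSwitcher:
    """Minimal state machine for the automatic mode switching. The two
    boolean inputs stand in for the first/second preset condition checks;
    their signal processing is device-specific and omitted here."""
    def __init__(self, default=Mode.HANDLE):
        # The default mode would be set via the control system on startup.
        self.mode = default

    def update(self, first_condition_met, second_condition_met):
        if self.mode is Mode.HANDLE and first_condition_met:
            # Handle idle: handle tracking is not in use, start hand tracking.
            self.mode = Mode.BARE_HAND
        elif self.mode is Mode.BARE_HAND and second_condition_met:
            # Handle picked up again: suspend hand tracking.
            self.mode = Mode.HANDLE
        return self.mode

switcher = ModeSwitcher()
assert switcher.update(first_condition_met=True, second_condition_met=False) is Mode.BARE_HAND
assert switcher.update(first_condition_met=False, second_condition_met=True) is Mode.HANDLE
```

Because only one mode's tracking module runs at a time, each transition implicitly suspends the other module, matching the exclusive processing described in the background.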
Corresponding to the above switching method, the present disclosure also provides a switching system for the interaction mode of a head-mounted device.

Specifically, FIG. 2 shows the schematic logic of the switching system according to an embodiment of the present disclosure.

As shown in FIG. 2, the switching system 200 of the embodiment of the present disclosure includes:

a tracking data and IMU data acquisition unit 210, configured to acquire 6DoF tracking data and IMU data of the handle, the 6DoF tracking data including position data and attitude data of the handle;

a standard deviation acquisition unit 220, configured to separately acquire the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data in the IMU data;

a first preset condition determination unit 230, configured to determine whether, within a current first preset time, the standard deviations of the position data, the attitude data, and the accelerometer data meet a first preset condition; when they do, the handle tracking interaction mode is determined not to be activated, and the bare-hand tracking interaction mode is activated;

a second preset condition determination unit 240, configured to acquire, in real time, the standard deviation of the accelerometer data within a second preset time, and determine whether it meets a second preset condition; when it does, the bare-hand tracking interaction mode is suspended and the handle tracking interaction mode is activated.
Correspondingly, the present disclosure also provides an electronic device; FIG. 3 shows the schematic structure of the electronic device according to an embodiment of the present disclosure.

As shown in FIG. 3, the electronic device 1 of the present disclosure may be a terminal device with computing functions, such as a VR/AR/MR all-in-one head-mounted device, a server, a smart phone, a tablet computer, a portable computer, or a desktop computer. The electronic device 1 includes a processor 12, a memory 11, a network interface 14, and a communication bus 15.

The memory 11 includes at least one type of readable storage medium, which may be a non-volatile storage medium such as a flash memory, a hard disk, a multimedia card, or a card-type memory. In some embodiments, the readable storage medium may be an internal storage unit of the electronic device 1, such as its hard disk. In other embodiments, the readable storage medium may also be an external memory of the electronic device 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card equipped on the electronic device 1.

In this embodiment, the readable storage medium of the memory 11 is generally configured to store the interaction mode switching program 10 of the head-mounted device installed on the electronic device 1. The memory 11 may also be configured to temporarily store data that has been output or is to be output.

In some embodiments, the processor 12 may be a central processing unit (CPU), a microprocessor, or another data-processing chip, configured to run program code or process data stored in the memory 11, for example, to execute the interaction mode switching program 10.

The network interface 14 may optionally include a standard wired interface and a wireless interface (such as a Wi-Fi interface), and is usually configured to establish a communication connection between the electronic device 1 and other electronic devices.

The communication bus 15 is configured to enable connection and communication between these components.

FIG. 3 only shows the electronic device 1 with components 11-15, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead.

Optionally, the electronic device 1 may further include a user interface, which may include an input unit such as a keyboard, a voice input device such as a microphone or other equipment with a voice recognition function, and a voice output device such as a loudspeaker or a headset; optionally, the user interface may also include a standard wired interface and a wireless interface.

Optionally, the electronic device 1 may further include a display, which may also be called a display screen or a display unit. In some embodiments it may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an organic light-emitting diode (OLED) touch display, or the like. The display is configured to display the information processed in the electronic device 1 and to display a visual user interface.

Optionally, the electronic device 1 further includes a touch sensor. The area provided by the touch sensor for the user's touch operations is called the touch area. The touch sensor may be a resistive touch sensor, a capacitive touch sensor, or the like, and includes not only contact-type touch sensors but also proximity-type touch sensors. The touch sensor may be a single sensor or a plurality of sensors arranged, for example, in an array.

In the device embodiment shown in FIG. 3, the memory 11, as a computer storage medium, may include an operating system and the interaction mode switching program 10 of the head-mounted device; when the processor 12 executes the switching program 10 stored in the memory 11, the steps of the switching method of the interaction mode of the head-mounted device are implemented.
The specific implementation of the computer-readable storage medium provided by the present disclosure is substantially the same as the specific implementations of the above switching method, device, and system, and will not be repeated here.

It should be noted that, as used herein, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, device, article, or method that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, device, article, or method. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, device, article, or method that includes the element.

The above serial numbers of the embodiments of the present disclosure are for description only and do not indicate the relative merits of the embodiments. From the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or of course by hardware, though in many cases the former is the better implementation. Based on this understanding, the technical solution of the present disclosure, in essence or in the part contributing to the prior art, can be embodied in the form of a software product; the computer software product is stored in a storage medium as described above (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to execute the methods described in the embodiments of the present disclosure.

The switching method, system, and electronic device for the interaction mode of a head-mounted device according to the present disclosure have been described above by way of example with reference to the accompanying drawings. However, those skilled in the art should understand that various improvements can still be made to the above method, system, and electronic device without departing from the content of the present disclosure. Therefore, the protection scope of the present disclosure should be determined by the content of the appended claims.

Claims (20)

  1. A method for switching the interaction mode of a head-mounted device, wherein the interaction modes of the head-mounted device include a handle tracking interaction mode and a bare-hand tracking interaction mode that can be switched between each other; the switching method comprises:
    acquiring six-degree-of-freedom (6DoF) tracking data and inertial measurement unit (IMU) data of a handle, the 6DoF tracking data comprising position data and attitude data of the handle;
    separately acquiring a standard deviation of the position data, a standard deviation of the attitude data, and a standard deviation of accelerometer data in the IMU data;
    determining whether, within a current first preset time, the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet a first preset condition; wherein,
    when the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, determining that the handle tracking interaction mode is not activated, and activating the bare-hand tracking interaction mode; and meanwhile,
    acquiring, in real time, the standard deviation of the accelerometer data within a second preset time, and determining whether the standard deviation of the accelerometer data meets a second preset condition; wherein,
    when the standard deviation of the accelerometer data meets the second preset condition, suspending the bare-hand tracking interaction mode and activating the handle tracking interaction mode.
  2. The method for switching the interaction mode of a head-mounted device according to claim 1, wherein determining whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition comprises:
    determining whether the standard deviation of the position data and the standard deviation of the attitude data are both within a first preset threshold, and when the standard deviation of the position data and the standard deviation of the attitude data are both within the first preset threshold, further determining whether the standard deviation of the accelerometer data is within a second preset threshold;
    when the standard deviation of the position data and the standard deviation of the attitude data are both within the first preset threshold, and the standard deviation of the accelerometer data is within the second preset threshold, determining that the first preset condition is met.
  3. The method for switching the interaction mode of a head-mounted device according to claim 2, wherein
    the first preset time is 1 s to 4 s, and the second preset time is 1 s to 4 s;
    the first preset threshold is 0.6, and the second preset threshold is 0.2.
  4. The method for switching the interaction mode of a head-mounted device according to claim 3, wherein the first preset time is 1.5 s.
  5. The method for switching the interaction mode of a head-mounted device according to claim 1, wherein determining whether the standard deviation of the accelerometer data meets the second preset condition comprises:
    determining whether the standard deviations of the accelerometer data for a preset number of consecutive frames within the second preset time are all greater than a third preset threshold;
    when the standard deviations of the accelerometer data for the preset number of consecutive frames within the second preset time are all greater than the third preset threshold, determining that the second preset condition is met.
  6. The method for switching the interaction mode of a head-mounted device according to claim 5, wherein
    the second preset time is 1 s to 4 s;
    the third preset threshold is 0.03.
  7. The method for switching the interaction mode of a head-mounted device according to claim 6, wherein the second preset time is 1.5 s.
  8. The method for switching the interaction mode of a head-mounted device according to claim 1, wherein
    the handle tracking interaction mode comprises: the handle being in signal connection with the head-mounted device, with virtual reality interaction performed through the handle;
    the bare-hand tracking interaction mode comprises: collecting and recognizing gesture information and tracking information of a bare hand, with virtual reality interaction performed through the gesture information and the tracking information.
  9. The method for switching the interaction mode of a head-mounted device according to claim 8, wherein
    at least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device; wherein
    the bare-hand tracking camera is used to capture the gesture information and tracking information of the bare hand, and the environment tracking cameras are used to acquire 6DoF positioning information of the head-mounted device relative to the physical environment;
    the bare-hand tracking camera and/or the environment tracking cameras comprise: a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera.
  10. The method for switching the interaction mode of a head-mounted device according to claim 8, wherein
    at least two handle tracking controllers are provided inside the handle, the handle tracking controllers being used to acquire, in real time, 6DoF tracking information of the handle relative to the head-mounted device; wherein
    the handle tracking controllers comprise ultrasonic sensors, electromagnetic sensors, optical sensors, and 9-axis inertial navigation sensors.
  11. The method for switching the interaction mode of a head-mounted device according to claim 1, further comprising a default mode setting process; wherein
    the default mode setting process comprises: setting the default mode to the handle tracking interaction mode or the bare-hand tracking interaction mode through a control system connected to the head-mounted device;
    the control system is further used to control the switching between the handle tracking interaction mode and the bare-hand tracking interaction mode.
  12. The method for switching the interaction mode of a head-mounted device according to claim 11, wherein
    when the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, the bare-hand tracking interaction mode is activated through the control system, and the handle tracking interaction mode is correspondingly deactivated;
    when the standard deviation of the accelerometer data meets the second preset condition, the handle tracking interaction mode is activated through the control system, and the bare-hand tracking interaction mode is correspondingly deactivated.
  13. The method for switching the interaction mode of a head-mounted device according to claim 1, further comprising:
    when the standard deviation of the accelerometer data does not meet the second preset condition, continuing the bare-hand tracking interaction mode and keeping the handle tracking interaction mode suspended.
  14. A switching system for the interaction mode of a head-mounted device, comprising:
    a tracking data and IMU data acquisition unit, configured to acquire 6DoF tracking data and IMU data of a handle, the 6DoF tracking data comprising position data and attitude data of the handle;
    a standard deviation acquisition unit, configured to separately acquire a standard deviation of the position data, a standard deviation of the attitude data, and a standard deviation of accelerometer data in the IMU data;
    a first preset condition determination unit, configured to determine whether, within a current first preset time, the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet a first preset condition; wherein,
    when the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition, it is determined that the handle tracking interaction mode is not activated, and the bare-hand tracking interaction mode is activated; and
    a second preset condition determination unit, configured to acquire, in real time, the standard deviation of the accelerometer data within a second preset time, and determine whether the standard deviation of the accelerometer data meets a second preset condition; wherein, when the standard deviation of the accelerometer data meets the second preset condition, the bare-hand tracking interaction mode is suspended and the handle tracking interaction mode is activated.
  15. The switching system for the interaction mode of a head-mounted device according to claim 14, wherein the first preset condition determination unit is configured to determine whether the standard deviation of the position data, the standard deviation of the attitude data, and the standard deviation of the accelerometer data meet the first preset condition in the following manner:
    determining whether the standard deviation of the position data and the standard deviation of the attitude data are both within a first preset threshold, and when the standard deviation of the position data and the standard deviation of the attitude data are both within the first preset threshold, further determining whether the standard deviation of the accelerometer data is within a second preset threshold;
    when the standard deviation of the position data and the standard deviation of the attitude data are both within the first preset threshold, and the standard deviation of the accelerometer data is within the second preset threshold, determining that the first preset condition is met.
  16. The switching system for the interaction mode of a head-mounted device according to claim 14, wherein the second preset condition determination unit is configured to determine whether the standard deviation of the accelerometer data meets the second preset condition in the following manner:
    determining whether the standard deviations of the accelerometer data for a preset number of consecutive frames within the second preset time are all greater than a third preset threshold;
    when the standard deviations of the accelerometer data for the preset number of consecutive frames within the second preset time are all greater than the third preset threshold, determining that the second preset condition is met.
  17. The switching system for the interaction mode of a head-mounted device according to claim 14, wherein
    the handle tracking interaction mode comprises: the handle being in signal connection with the head-mounted device, with virtual reality interaction performed through the handle;
    the bare-hand tracking interaction mode comprises: collecting and recognizing gesture information and tracking information of a bare hand, with virtual reality interaction performed through the gesture information and the tracking information.
  18. The switching system for the interaction mode of a head-mounted device according to claim 17, wherein
    at least one bare-hand tracking camera and at least two environment tracking cameras are provided on the head-mounted device; wherein
    the bare-hand tracking camera is used to capture the gesture information and tracking information of the bare hand, and the environment tracking cameras are used to acquire 6DoF positioning information of the head-mounted device relative to the physical environment;
    the bare-hand tracking camera and/or the environment tracking cameras comprise: a depth camera, a binocular infrared camera, an RGB camera, or a monochrome camera.
  19. The switching system for the interaction mode of a head-mounted device according to claim 17, wherein
    at least two handle tracking controllers are provided inside the handle, the handle tracking controllers being used to acquire, in real time, 6DoF tracking information of the handle relative to the head-mounted device; wherein
    the handle tracking controllers comprise ultrasonic sensors, electromagnetic sensors, optical sensors, and 9-axis inertial navigation sensors.
  20. A non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 13.
PCT/CN2021/116300 2021-02-18 2021-09-02 头戴设备交互模式的切换方法及系统 WO2022174575A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/816,411 US11709543B2 (en) 2021-02-18 2022-07-30 Switching method and system of interactive modes of head-mounted device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110190095.8 2021-02-18
CN202110190095.8A CN112947754B (zh) 2021-02-18 2021-02-18 头戴设备交互模式的切换方法及系统

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/816,411 Continuation US11709543B2 (en) 2021-02-18 2022-07-30 Switching method and system of interactive modes of head-mounted device

Publications (1)

Publication Number Publication Date
WO2022174575A1 true WO2022174575A1 (zh) 2022-08-25

Family

ID=76244387

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/116300 WO2022174575A1 (zh) 2021-02-18 2021-09-02 头戴设备交互模式的切换方法及系统

Country Status (3)

Country Link
US (1) US11709543B2 (zh)
CN (1) CN112947754B (zh)
WO (1) WO2022174575A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112947754B (zh) * 2021-02-18 2023-03-28 青岛小鸟看看科技有限公司 头戴设备交互模式的切换方法及系统
CN116185201A (zh) * 2023-03-07 2023-05-30 北京字跳网络技术有限公司 扩展现实交互方法、装置、电子设备和存储介质

Citations (6)

Publication number Priority date Publication date Assignee Title
US20090197635A1 (en) * 2008-02-01 2009-08-06 Kim Joo Min user interface for a mobile device
CN101604203A (zh) * 2008-06-12 2009-12-16 纬创资通股份有限公司 自动切换操作模式的手持式电子产品及控制方法
CN102270038A (zh) * 2010-06-04 2011-12-07 索尼公司 操作终端、电子单元和电子单元系统
CN105117016A (zh) * 2015-09-07 2015-12-02 众景视界(北京)科技有限公司 用于虚拟现实和增强现实交互控制中的交互手柄
CN111857337A (zh) * 2020-07-10 2020-10-30 成都天翼空间科技有限公司 一种vr头控的切换方法
CN112947754A (zh) * 2021-02-18 2021-06-11 青岛小鸟看看科技有限公司 头戴设备交互模式的切换方法及系统

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US10078377B2 (en) * 2016-06-09 2018-09-18 Microsoft Technology Licensing, Llc Six DOF mixed reality input by fusing inertial handheld controller with hand tracking
US11580849B2 (en) * 2018-10-11 2023-02-14 Google Llc Wearable electronic systems having variable interactions based on device orientation
US11054923B2 (en) * 2019-10-11 2021-07-06 Finch Technologies Ltd. Automatic switching between different modes of tracking user motions to control computer applications

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
US20090197635A1 (en) * 2008-02-01 2009-08-06 Kim Joo Min user interface for a mobile device
CN101604203A (zh) * 2008-06-12 2009-12-16 纬创资通股份有限公司 自动切换操作模式的手持式电子产品及控制方法
CN102270038A (zh) * 2010-06-04 2011-12-07 索尼公司 操作终端、电子单元和电子单元系统
CN105117016A (zh) * 2015-09-07 2015-12-02 众景视界(北京)科技有限公司 用于虚拟现实和增强现实交互控制中的交互手柄
CN111857337A (zh) * 2020-07-10 2020-10-30 成都天翼空间科技有限公司 一种vr头控的切换方法
CN112947754A (zh) * 2021-02-18 2021-06-11 青岛小鸟看看科技有限公司 头戴设备交互模式的切换方法及系统

Also Published As

Publication number Publication date
CN112947754A (zh) 2021-06-11
US20220365589A1 (en) 2022-11-17
CN112947754B (zh) 2023-03-28
US11709543B2 (en) 2023-07-25

Similar Documents

Publication Publication Date Title
KR102206054B1 (ko) 지문 처리 방법 및 그 전자 장치
US9547391B2 (en) Method for processing input and electronic device thereof
US9690475B2 (en) Information processing apparatus, information processing method, and program
US11461000B2 (en) User interface adjustment methods and systems
US8368723B1 (en) User input combination of touch and user position
US10346027B2 (en) Information processing apparatus, information processing method, and program
US10191603B2 (en) Information processing device and information processing method
US10564712B2 (en) Information processing device, information processing method, and program
WO2022174575A1 (zh) 头戴设备交互模式的切换方法及系统
US20130016129A1 (en) Region-Specific User Input
US20170153804A1 (en) Display device
US20170169611A1 (en) Augmented reality workspace transitions based on contextual environment
JP2017510868A (ja) 把持状態検出
US20220171522A1 (en) Object position adjustment method and electronic device
US20180157344A1 (en) End of session detection in an augmented and/or virtual reality environment
CN104239029A (zh) 用于控制摄像头模式的装置和关联的方法
US10362028B2 (en) Information processing apparatus
CN107451439B (zh) 用于计算设备的多功能按钮
WO2020088268A1 (zh) 桌面图标的整理方法及终端
WO2019076371A1 (zh) 资源数据展示方法及移动终端
US11886643B2 (en) Information processing apparatus and information processing method
US9400575B1 (en) Finger detection for element selection
US9471154B1 (en) Determining which hand is holding a device
US11144091B2 (en) Power save mode for wearable device
US9898183B1 (en) Motions for object rendering and selection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21926289

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21926289

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08.12.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21926289

Country of ref document: EP

Kind code of ref document: A1