WO2023179369A1 - Positioning method, apparatus, device, storage medium and computer program product for a control device - Google Patents

Positioning method, apparatus, device, storage medium and computer program product for a control device Download PDF

Info

Publication number
WO2023179369A1
WO2023179369A1 (PCT/CN2023/080420)
Authority
WO
WIPO (PCT)
Prior art keywords
point cloud
control device
data
cloud data
cloud map
Prior art date
Application number
PCT/CN2023/080420
Other languages
English (en)
French (fr)
Inventor
张秀志
Original Assignee
北京字跳网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字跳网络技术有限公司
Publication of WO2023179369A1

Links

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers
    • H04N13/398: Synchronisation thereof; Control thereof

Definitions

  • Embodiments of the present application relate to the field of virtual reality (Virtual Reality, VR) technology, and in particular to a positioning method, device, equipment, storage medium and computer program product for a control device.
  • VR: Virtual Reality
  • A VR device can adjust the image displayed on the Head Mounted Display (HMD) according to changes in the user's head and hand movements, giving the user an immersive experience; positioning those movements accurately is therefore an important factor in the user's VR experience.
  • the relative positional relationship between the handle and the HMD is determined through optical positioning, ultrasonic positioning, electromagnetic positioning and other solutions.
  • This application provides a positioning method, device, equipment, storage medium and computer program product for a control device to improve the positioning accuracy of the control device.
  • a positioning method of a control device is provided.
  • the method is applied to a first control device.
  • a camera is installed on the first control device.
  • the method includes: capturing the external environment of the first control device at time T1 via the camera to obtain first point cloud data; obtaining inertial measurement unit data of the first control device at time T1 and a first point cloud map of the external environment; and determining the 6-degree-of-freedom data of the first control device from the first point cloud data, the inertial measurement unit data and the first point cloud map; wherein the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device.
  • a positioning device is provided, including: a collection module, an acquisition module and a first determination module, wherein the collection module is used to capture the external environment of the first control device at time T1 through a camera to obtain the first point cloud data; the acquisition module is used to obtain the inertial measurement unit data of the first control device at time T1 and the first point cloud map of the external environment; and the first determination module is used to determine the 6-degree-of-freedom data of the first control device from the first point cloud data, the inertial measurement unit data and the first point cloud map; wherein the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device.
  • a control device including: a processor and a memory.
  • the memory is used to store a computer program.
  • the processor is used to call and run the computer program stored in the memory to execute the method in the first aspect or its respective implementations.
  • a fourth aspect provides a computer-readable storage medium for storing a computer program, the computer program causing a computer to execute the method as in the first aspect or its respective implementations.
  • a fifth aspect provides a computer program product, including computer program instructions, which cause a computer to execute the method in the first aspect or its respective implementations.
  • a sixth aspect provides a computer program, which causes a computer to execute the method in the first aspect or its respective implementations.
  • the first control device can capture the external environment it is in at time T1 through the installed camera to obtain the first point cloud data, and combine it with the acquired inertial measurement unit data of the first control device at time T1 and the first point cloud map of the external environment to determine the 6-degree-of-freedom data of the first control device, where the first point cloud map includes the historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device.
  • when the first control device is a handle, the 6-degree-of-freedom data of the handle is determined from the current point cloud data of the environment the handle is in, the handle's IMU data and the point cloud map of that environment, so that self-tracking of the handle can be achieved and the causes of low positioning accuracy do not arise.
  • moreover, the point cloud map used when determining the handle's 6-degree-of-freedom data includes the historical point cloud data, relative to the current moment, collected from the environment by the second control device such as the HMD and the first control device such as the handle, so the point cloud map contains relatively rich point cloud data, which can further improve the positioning accuracy. This application therefore solves the prior-art problem of low positioning accuracy of control devices such as handles, and improves the positioning accuracy of the control device.
  • Figure 1 is an application scenario diagram provided by an embodiment of the present application;
  • Figure 2 is a flow chart of a positioning method for a control device provided by an embodiment of the present application;
  • Figure 3 is a schematic diagram of a positioning apparatus provided by an embodiment of the present application;
  • Figure 4 is a schematic block diagram of the first control device 400 provided by an embodiment of the present application;
  • Figure 5 is a schematic block diagram of the second control device 500 provided by an embodiment of the present application.
  • the words "exemplary" or "for example" are used to mean an example, illustration or explanation. In the embodiments of this application, any embodiment or solution described as "exemplary" or "for example" is not to be construed as preferred or advantageous over other embodiments or solutions. Rather, use of the words "exemplary" or "for example" is intended to present the concept in a concrete manner.
  • the existing technology often determines the relative positional relationship between the handle and the HMD through optical positioning, ultrasonic positioning, electromagnetic positioning and other solutions, and then combines this with the 6-degree-of-freedom tracking of the HMD to position the handle.
  • this suffers from low handle-positioning accuracy. For example, in the ultrasonic positioning solution, if another object is inserted between the sound wave receiver installed on the HMD and the sound wave transmitter installed on the handle, the waves transmitted between the two are occluded or reflected, which degrades the accuracy of handle positioning.
  • the inventive concept of the present application is: the first control device can collect the point cloud data of the external environment it is in at time T1, and then determine its 6-degree-of-freedom data by combining the acquired IMU data of the first control device at time T1 with the point cloud map of the external environment, wherein the point cloud map includes historical point cloud data, relative to time T1, collected from the above-mentioned external environment by the first control device and the second control device.
  • FIG. 1 is an application scenario diagram provided by an embodiment of the present application.
  • the application scenario may include a first control device 110 and a second control device 120 .
  • a camera is installed on the first control device 110 .
  • Communication is possible between the first control device 110 and the second control device 120 .
  • the first control device 110 may be a handle in an all-in-one VR machine, which is not limited in this application.
  • the cameras installed on the first control device 110 may be three fisheye cameras located on the first control device 110 , which is not limited in this application.
  • the fisheye camera has a larger viewing angle range, generally reaching about 160 degrees, and can better capture a wide range of scenery at close range.
  • the second control device 120 may be an HMD, such as a head-mounted display in an all-in-one VR machine, which is not limited in this application.
  • it should be understood that the numbers of first control devices, second control devices and cameras in Figure 1 are merely illustrative; any number of first control devices, second control devices and cameras can be provided according to actual needs. This application does not limit this.
  • Figure 2 is a flow chart of a positioning method for a control device provided by an embodiment of the present application. This method can be executed by the first control device 110 and the second control device 120 shown in Figure 1, but is not limited thereto. As shown in Figure 2, the method may include the following steps:
  • the first control device collects the external environment where the first control device is located at time T1 through the camera, and obtains the first point cloud data;
  • the first control device acquires the inertial measurement unit data of the first control device at time T1 and the first point cloud map of the external environment, where the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device;
  • the first control device determines the 6 degrees of freedom data of the first control device based on the first point cloud data, the inertial measurement unit data and the first point cloud map.
  • a camera is installed on the first control device.
  • in the following embodiments, this application takes the first control device 110 as the handle of an all-in-one VR machine and the second control device 120 as the head-mounted display of an all-in-one VR machine as an example to describe the technical solution of the present application in detail.
  • the external environment where the handle is located may be the safe area set by the handle in the initial mode.
  • the safe area is an area, set before the user experiences the VR scene, within which the user can move.
  • the safe area can be set as a square area with a side length of 1 meter. This application does not limit this.
  • the first point cloud map in S202 includes historical point cloud data, relative to time T1, collected from the external environment by the handle and the head-mounted display; point cloud data for regions that are difficult for the head-mounted display to reach can be collected through the handle's camera, so the first point cloud map includes relatively rich point cloud data, which can further improve the accuracy of handle positioning.
  • the inertial measurement unit is a device that measures the angular velocity and acceleration of an object in three-dimensional space.
  • the inertial measurement unit mainly includes components such as a gyroscope, an accelerometer and a magnetometer.
  • an inertial measurement unit can generally include three single-axis gyroscopes and three single-axis accelerometers.
  • the gyroscope can detect the angular velocity of the object and the accelerometer can detect its acceleration, so the quantities the inertial measurement unit measures directly are acceleration and angular velocity. The inertial measurement unit can, however, obtain indirect measurements through integration: integrating the angular velocity once yields the angle, i.e., the relative attitude, and integrating the acceleration twice yields the distance, i.e., the relative position. The inertial measurement unit data of the handle can therefore represent the relative position and relative attitude of the handle.
  • depending on the sensor used, SLAM technology can be divided into 2D/3D SLAM based on lidar (2-Dimensional/3-Dimensional Simultaneous Localization And Mapping), RGB-D SLAM based on depth cameras (RGB + Depth map Simultaneous Localization And Mapping), Visual Simultaneous Localization And Mapping (VSLAM) based on visual sensors, and Visual Inertial Odometry (VIO) based on visual sensors and an IMU.
  • VIO: Visual Inertial Odometry
  • the 6DOF data of the handle can be obtained through VIO.
  • the handle can compare whether there is a difference between the first point cloud data collected at time T1 and the above-mentioned first point cloud map. If there is a difference, the handle can update the differing part between the first point cloud data and the first point cloud map into the first point cloud map, and send that differing part to the head-mounted display.
  • when the target conditions are met, the handle can update the differing part between the first point cloud data and the first point cloud map into the first point cloud map, and send that differing part to the head-mounted display.
  • the head-mounted display can also use a method similar to that of the handle to determine its 6-degree-of-freedom data: at time T1 it can capture the external environment through its own camera to obtain the second point cloud data, and combine it with its own inertial measurement unit data and the first point cloud map to determine its own 6-degree-of-freedom data.
  • the handle can collect the first point cloud data and combine its own inertial measurement unit data and the first point cloud map to determine its own 6-degree-of-freedom data, and the head-mounted display can likewise collect point cloud data of the environment and combine its own inertial measurement unit data and the first point cloud map to determine its own 6-degree-of-freedom data. The first point cloud map each uses when determining its 6-degree-of-freedom data is the same, which ensures that the movement of the handle in the real world and the movement of the head-mounted display in the real world remain consistent when displayed in the virtual space.
  • the handle can receive the differing part between the second point cloud data and the first point cloud map sent by the head-mounted display device, and the handle can also send the differing part between the first point cloud data and the first point cloud map to the head-mounted display device; both the handle and the headset can update both differing parts into the first point cloud map. In other words, the controller and the head-mounted display device can update the first point cloud map by exchanging the point cloud data they collect. This not only gives the controller or the head-mounted display device a first point cloud map with richer point cloud data when determining its 6-degree-of-freedom data, improving the accuracy of that data, but also lets the controller and the head-mounted display device use the same point cloud data when determining their respective 6-degree-of-freedom data, so that they complete tracking in the same coordinate system. This ensures that the motion changes of the handle and the head-mounted display device map consistently onto the image changes displayed in the head-mounted display device, guaranteeing the user's experience.
  • the same coordinate system mentioned above can be a world coordinate system, which is not limited in this application.
  • the handle can store the first point cloud map in the target database.
  • the target database can be a local database of the controller or a cloud database of the controller, which is not limited in this application.
  • the head-mounted display can also store the first point cloud map in the target database.
  • for example, assume the safe area set by the user is a square area with a side length of 1 meter located at the center of conference room 1, the identifier of the safe area is "safe area 1", and the first point cloud map is the point cloud map of safe area 1; then, when the controller needs to store the first point cloud map in the target database, it can also store the identifier of the safe area: safe area 1.
  • the method for the head-mounted display to store the point cloud map is similar to the method for the handle to store the first point cloud map, and will not be described in detail here.
  • the above target database can be a database shared by the controller and the head-mounted display.
  • the shared database can be a local database of the controller, a local database of the head-mounted display, or a cloud database. This application does not limit this.
  • the handle can send the 6-degree-of-freedom data to the head-mounted display device.
  • the head mounted display device receives the 6 degrees of freedom data sent by the handle, it can adjust the image displayed on the head mounted display device accordingly based on the 6 degrees of freedom data.
  • the handle can send the inertial measurement unit data to the head-mounted display after acquiring the inertial measurement unit data.
  • after the head-mounted display receives the inertial measurement unit data sent by the handle, it can predict the movement of the handle, so that the image displayed on the head-mounted display can be pre-adjusted before the handle's 6-degree-of-freedom data is obtained. This reduces cases where the latency incurred by the handle determining the 6-DOF data and sending it to the head-mounted display makes the image changes displayed on the head-mounted display inconsistent with the movement of the handle.
  • the controller can send a time synchronization message to the head mounted display.
  • after the head-mounted display receives the time synchronization message, it can determine the time difference between the time system of the handle and the time system of the head-mounted display based on the message, so that during data interaction between the handle and the head-mounted display the controller's time system stays consistent with the head-mounted display's time system.
  • the time synchronization message includes the time in the controller's time system.
  • the head-mounted display can send a time synchronization message to the handle, and the handle can receive it and determine, based on the message, the time difference between the time system of the head-mounted display and the time system of the handle, so that the time system of the controller stays consistent with the time system of the head-mounted display during data interaction between them.
  • the time synchronization message includes the time in the time system of the head-mounted display.
  • for example, assuming the head-mounted display is powered on at time T1 and the handle at a later time T2, the head-mounted display can send a time synchronization message including time T1, and the controller can determine the time difference between the head-mounted display's time system and its own based on times T1 and T2, so that the handle's time system stays consistent with the head-mounted display's.
  • the mode of the handle can be switched from the initial mode to the self-tracking mode through any of the following methods, but is not limited to this:
  • Method 1: before the handle collects the above-mentioned first point cloud data, the handle can obtain a mode-switching instruction and, in response, switch its mode from the initial mode to the self-tracking mode according to the instruction.
  • the initial mode may be the mode in which the handle is operated through the arm model, and the self-tracking mode is the mode the handle is in when performing the above-mentioned S201 to S203.
  • the above mode-switching instruction can be the user pressing a button on the handle, or a single click, double click, long press, slide, hovering touch gesture or similar operation on an area of the handle's screen; it can also be the user pressing a button on the head-mounted display, or a click, double click, long press, slide, hovering touch gesture or similar operation on an area of the head-mounted display's screen. This application does not limit this.
  • Method 2: after the initial mode ends, the mode of the handle is automatically switched from the initial mode to the self-tracking mode. This application describes the handle's initial mode in detail in the following embodiments.
  • after the handle and head-mounted display are turned on, the handle enters the initial mode, in which menu selection and safe-area setup need to be completed.
  • before the safe area is set, the point cloud data collected by the handle and the head-mounted display do not share a unified coordinate system.
  • the handle can send its own inertial measurement unit data to the head-mounted display, and the head-mounted display can determine from it the handle's 3-degree-of-freedom data: the three rotational degrees of freedom of pitch, roll and yaw, i.e., the relative attitude of the handle.
  • the head-mounted display can determine the position of the handle in combination with the arm model in the head-mounted display, or it can photograph the handle through its own camera, determine the relative position and attitude of the handle and the head-mounted display, and compute the handle's position by transformation. In this way the head-mounted display can obtain the position and attitude of the handle, after which the safe-area setup and menu selection can be completed. Since the coordinate system is established from the point cloud data in the safe area, once the safe-area setup is complete the handle and head-mounted display can be unified into the same coordinate system.
  • the menu selections in the initial mode may include: selection of VR scenes, selection of networks that need to be connected such as wireless fidelity (Wireless Fidelity, WiFi) selection, etc.
  • the safe area is the area, set before the user experiences the VR scene, within which the user can move. The safe area can be a square area with a side length of 1 meter in the user's environment when experiencing the VR scene, or a rectangular area 1 meter long and 0.5 meters wide in that environment. This application does not impose restrictions on the specific content of the menu selection or the extent of the safe area.
  • when the handle is in the initial mode, the user can press the buttons on the handle, and the head-mounted display can respond to the pressing operation to complete the menu selection or safe-area setup of the above-mentioned initial operation; the user can also rotate the handle, for example to trace out the extent of the safe area to be set, and the head-mounted display can respond to the rotation operation to complete the menu selection or safe-area setup. This application does not impose restrictions on how menu selection and safe-area setup are performed.
  • when setting the above-mentioned safe area, the head-mounted display can collect point cloud data of the environment at time T2 and look up whether that point cloud data is stored in the target database. If it is, i.e., the point cloud map stored in the target database includes the point cloud data collected at time T2, the head-mounted display can directly set the safe area corresponding to that point cloud map as the safe area.
  • when setting the above-mentioned safe area, the head-mounted display can collect point cloud data of the environment at time T2 and look up whether that point cloud data is stored in the target database. If it is not, i.e., the point cloud map stored in the target database does not include the point cloud data collected at T2, the head-mounted display can cooperate with the handle to reset the safe area.
  • when setting the above-mentioned safe area, the head-mounted display can collect point cloud data of the environment at time T2 and look up whether that point cloud data is stored in the target database. If it is, i.e., the point cloud map stored in the target database includes the point cloud data collected at time T2, the head-mounted display can ask the user whether to use the safe area corresponding to that point cloud map; if the user confirms they need it, the head-mounted display can set that safe area as the safe area, and if the user confirms they do not, the head-mounted display can reset the safe area with the handle.
  • when the safe area needs to be reset as above, the head-mounted display can send the point cloud data of the environment collected at time T2 to the handle; that is, the handle can receive the historical point cloud data, relative to time T2, that the head-mounted display collected from the external environment. Both the handle and the headset can then construct a second point cloud map from this historical point cloud data; the second point cloud map is the point cloud map corresponding to the reset safe area, and both the handle and the headset can use it to determine their own 6-DOF data in self-tracking mode.
  • when the point cloud map stored in the target database includes the point cloud data collected at time T2, the head-mounted display can send the difference between the point cloud data collected at time T2 and that point cloud map to the handle, and update the difference into the point cloud map; after the handle receives the difference sent by the head-mounted display, it can update the difference into its own stored copy of the point cloud map.
  • the first control device can capture the external environment it is in at time T1 through the installed camera to obtain the first point cloud data, and combine it with the acquired inertial measurement unit data of the first control device at time T1 and the first point cloud map of the external environment to determine the 6-degree-of-freedom data of the first control device, where the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device.
  • when the first control device is a handle, the 6-degree-of-freedom data of the handle is determined from the current point cloud data of the environment the handle is in, the handle's IMU data and the point cloud map of that environment, so that self-tracking of the handle can be achieved and low positioning accuracy does not arise.
  • the point cloud map used when determining the handle's 6-degree-of-freedom data includes the historical point cloud data, relative to the current moment, collected from the environment by the second control device such as the HMD and the first control device such as the handle, so the point cloud map includes relatively rich point cloud data, which can further improve the positioning accuracy. This application therefore solves the prior-art problem of low positioning accuracy of control devices such as handles and improves the positioning accuracy of the control device.
  • the handle can update the differing part between the first point cloud data and the first point cloud map into the first point cloud map, and send that differing part to the head-mounted display.
  • the target condition may include at least one of the following: a preset time has elapsed since T1; or the accumulated size of the difference between the first point cloud data and the first point cloud map, plus the differences between point cloud data collected by the handle's camera of the external environment after time T1 and the first point cloud map, reaches a preset size. In this way, the number of times the handle updates or sends the difference between the first point cloud data and the first point cloud map can be reduced, thereby reducing power consumption.
  • the handle can receive point cloud data sent by the head-mounted display, and can update or construct the same point cloud map as the head-mounted display based on that data, ensuring that both the handle and the head-mounted display can determine their respective 6-degree-of-freedom data from the same point cloud map while the handle is in self-tracking mode, and thus that the movement of the handle in the real world and the movement of the head-mounted display in the real world remain consistent when correspondingly displayed in the virtual space.
  • FIG 3 is a schematic diagram of a positioning device provided by an embodiment of the present application.
  • the positioning device may be the handle 110 as shown in Figure 1.
  • the positioning device includes:
  • the collection module 301 is used to collect the external environment where the first control device is located at time T1 through a camera to obtain the first point cloud data;
  • the acquisition module 302 is used to acquire the inertial measurement unit data of the first control device at time T1 and the first point cloud map of the external environment;
  • the first determination module 303 is used to determine the 6-degree-of-freedom data of the first control device according to the first point cloud data, the inertial measurement unit data and the first point cloud map, where the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device.
  • the positioning device further includes: a comparison module 304 and an update sending module 305.
  • the comparison module 304 is used to compare whether there is a difference between the first point cloud data and the first point cloud map;
  • the update sending module 305 is used to, if there is a difference between the first point cloud data and the first point cloud map, update the differing part into the first point cloud map and send the differing part to the second control device.
  • the update sending module 305 is specifically used to: when the target conditions are met, update the differing part between the first point cloud data and the first point cloud map into the first point cloud map, and send the differing part to the second control device.
  • the target condition includes at least one of the following: a preset time has elapsed since time T1; or the accumulated size of the difference between the first point cloud data and the first point cloud map, plus the differences between point cloud data of the external environment collected through the camera after time T1 and the first point cloud map, reaches a preset size.
  • the positioning device also includes: a first receiving module 306 and an update module 307.
  • the first receiving module 306 is used to receive the differing part between the second point cloud data and the first point cloud map sent by the second control device.
  • the update module 307 is used to update the differing part between the second point cloud data and the first point cloud map into the first point cloud map, where the second point cloud data is the point cloud data the second control device collected of the external environment at time T1 through its own camera.
  • the positioning device further includes: a switching module 308.
  • the switching module 308 is used to switch the mode of the first control device from the initial mode to the self-tracking mode, wherein the initial mode is the mode in which the first control device is operated through the arm model.
  • the positioning device further includes: a second receiving module 309 and a second determining module 310.
  • the second receiving module 309 is configured to receive, when the mode of the first control device is the initial mode, the historical point cloud data, relative to time T2, collected from the external environment by the second control device; the second determination module 310 is configured to obtain a second point cloud map based on that historical point cloud data.
  • the first point cloud map is stored in a target database; the target database is a database shared by the first control device and the second control device.
  • the device embodiment and the method embodiment on the control device side may correspond to each other, and similar descriptions may refer to the method embodiment corresponding to the control device. To avoid repetition, they will not be repeated here.
  • the positioning device shown in Figure 3 can perform the above method embodiments on the control device side, and the aforementioned and other operations and/or functions of each module in the positioning device shown in Figure 3 respectively implement the corresponding processes of the above control-device-side method embodiments, which are not described again here.
  • the software module may be located in a mature storage medium in the field such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, register, etc.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps in the above method embodiment in combination with its hardware.
  • FIG. 4 is a schematic block diagram of a first control device 400 provided by an embodiment of the present application.
  • the first control device 400 may be a handle for executing the above method embodiment.
  • the first control device 400 may include one or more of the following components: IMU sensor module 401, camera module 402, linear motor module 403, wireless communication module 404, antenna (ANT) 405, touch key input module 406, user input module 407 and processor 408.
  • the IMU module 401 is configured to detect inertial measurement unit data of the first control device 400 , where the inertial measurement unit data includes the angular velocity and acceleration of the first control device 400 in a three-dimensional space.
  • the inertial measurement unit sensor module 401 may include three single-axis accelerometers, three single-axis gyroscopes and three single-axis magnetometers, and may measure and report speed, direction, gravity and the like through the combination of accelerometers, gyroscopes and magnetometers.
  • the camera module 402 is configured to collect point cloud data of the environment where the first control device 400 is located.
  • the camera module 402 may include multiple fisheye cameras.
  • the fisheye cameras are configured to collect point cloud data of the environment where the first control device 400 is located.
  • the linear motor module 403 is configured to provide interactive feedback such as vibration to the user. For example, when the user presses a button of the first control device 400, the linear motor module 403 is configured to generate a vibration as feedback; or, when the movement range of the first control device or the second control device exceeds the safe area, the linear motor module 403 is configured to generate a vibration as a reminder, giving the user a better experience.
  • the wireless communication module 404 is configured for communication between the first control device and the second control device.
  • the wireless communication module 404 can send the 6-degree-of-freedom data of the first control device, inertial measurement unit data, key data, time synchronization messages, point cloud data and the like to the second control device, and can also receive the point cloud data, time synchronization messages, control information and the like that the second control device sends to the first control device.
  • the wireless communication module 404 may include a wireless chip, and may further include the antenna (ANT) 405.
  • the touch key input module 406 is configured to provide the user with key or touch operations on the first control device.
  • the user input module 407 is configured to provide user input operations on the first control device.
  • the processor 408 may include, but is not limited to: a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate. Array (Field Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • DSP: Digital Signal Processor
  • ASIC: Application Specific Integrated Circuit
  • FPGA: Field Programmable Gate Array
  • the bus system also includes a power bus, a control bus, a status signal bus, etc.
  • FIG. 5 is a schematic block diagram of the second control device 500 provided by the embodiment of the present application.
  • the second control device 500 may be the head-mounted display in the above embodiment.
  • the second control device 500 may include one or more of the following components: key input and LED light display module 501, camera module 502, handle wireless communication module 503, IPD detection module 504, USB 3.0 interface module 505, display module 506, audio input and output module 507, power management module 508, WIFI/BT module 509, memory storage module 510, distance sensor detection module 511, IMU sensor module 512, processor 513 and antenna (ANT) 514.
  • the key input and LED light display module 501 is configured to provide the user with key or input operations on the second control device, as well as the display of indicator lights for power-on and the like.
  • the camera module 502 is configured to collect point cloud data of the environment where the second control device 500 is located.
  • the camera module 502 may include multiple fisheye cameras configured to collect point cloud data of the environment where the second control device 500 is located.
  • the handle wireless communication module 503 is configured for communication between the first control device and the second control device. For example, the handle wireless communication module 503 can send the 6DOF data of the second control device, inertial measurement unit data, button data, time synchronization messages, point cloud data and the like to the first control device, and can also receive the point cloud data, time synchronization messages, control information and the like that the first control device sends to the second control device.
  • the handle wireless communication module 503 may include a wireless chip, such as a Bluetooth chip, a WiFi chip, an 802.11ad chip or an Ultra Wide Band (UWB) chip.
  • the handle wireless communication module 503 may also include the antenna (ANT) 514. Further, the handle wireless communication module 503 may also include a processor.
  • the IPD detection module 504 is configured to detect the interpupillary distance of the user.
  • the USB3.0 interface module 505 is configured to connect external devices.
  • the display module 506 is configured to display images in the second control device.
  • the display module 506 can adjust the image displayed in the second control device in real time according to the 6DOF data of the handle and HMD.
  • the audio input and output module 507 is configured to input or output audio data.
  • the audio input and output module 507 can receive voice data input by the user, and can also output audio data to the user.
  • the audio input and output module 507 may include a speaker, a microphone, an earpiece and the like.
  • the power management module 508 is configured to distribute and provide electrical power to various components of the second control device 500 .
  • the WIFI/BT module 509 is configured for communication between the first control device and the second control device.
  • the WIFI/BT module 509 can send the second control device's 6 degrees of freedom data, inertial measurement unit data, button data, time synchronization messages, point cloud data, etc. to the first control device.
  • the WIFI/BT module 509 can also receive the handle Sent point cloud data, time synchronization messages, control information, etc.
  • the memory storage module 510 is configured to store computer programs, point cloud data, and the like.
  • the memory storage module 510 includes but is not limited to: volatile memory and/or non-volatile memory.
  • by way of example but not limitation, the non-volatile memory can be read-only memory (Read-Only Memory, ROM), programmable read-only memory (Programmable ROM, PROM), erasable programmable read-only memory (Erasable PROM, EPROM), electrically erasable programmable read-only memory (Electrically EPROM, EEPROM) or flash memory.
  • the volatile memory may be Random Access Memory (RAM), which is used as an external cache. By way of example but not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM) and direct Rambus random access memory (DR RAM).
  • the distance sensor detection module 511 is configured to detect the movement range of the HMD and the handle to keep the movement ranges of the first control device and the second control device within a safe area.
  • the IMU sensor module 512 is configured to detect IMU data of the second control device 500 , the IMU data including the angular velocity and acceleration of the second control device 500 in a three-dimensional space.
  • the IMU sensor module 512 may include three single-axis accelerometers, three single-axis gyroscopes, and three single-axis magnetometers, and may measure and report speed and direction through a combination of sensors such as accelerometers, gyroscopes, and magnetometers. and gravity etc.
  • the processor 513 is configured to execute the above method embodiments according to instructions in the computer program, such as processing data interaction between the first control device and the second control device, determining 6-degree-of-freedom data according to point cloud data and inertial measurement unit data, and the like.
  • the processor 513 may include, but is not limited to: a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate Array (Field Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • DSP: Digital Signal Processor
  • ASIC: Application Specific Integrated Circuit
  • FPGA: Field Programmable Gate Array
  • the bus system also includes a power bus, a control bus, a status signal bus, etc.
  • This application also provides a computer storage medium on which a computer program is stored.
  • the computer program When the computer program is executed by a computer, the computer can perform the method of the above method embodiment.
  • An embodiment of the present application also provides a computer program product containing instructions, which when executed by a computer causes the computer to perform the method of the above method embodiment.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center by wire (such as coaxial cable, optical fiber or digital subscriber line (DSL)) or wirelessly (such as infrared, radio or microwave).
  • the computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device such as a server or data center integrated with one or more available media.
  • the available media may be magnetic media (such as floppy disks, hard disks, magnetic tapes), optical media (such as digital video discs (DVD)), or semiconductor media (such as solid state disks (SSD)), etc.

Abstract

A positioning method, apparatus and device for a control device, a storage medium and a computer program product. The method is applied to a first control device on which a camera is mounted, and includes: capturing the external environment of the first control device at time T1 via the camera to obtain first point cloud data; obtaining inertial measurement unit data of the first control device at time T1 and a first point cloud map of the external environment; and determining 6-degree-of-freedom data of the first control device from the first point cloud data, the inertial measurement unit data and the first point cloud map, where the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and a second control device. The method improves the positioning accuracy of the control device.

Description

Positioning method, apparatus, device, storage medium and computer program product for a control device
Priority Information
This application claims priority to Chinese Patent Application No. 2022103046088, filed on March 21, 2022 and entitled "Positioning of a control device, apparatus, device, storage medium and computer program product", the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of this application relate to the field of virtual reality (VR) technology, and in particular to a positioning method, apparatus and device for a control device, a storage medium and a computer program product.
Background Art
A VR device can adjust the image displayed on the head-mounted display (HMD) in response to changes in the user's head and hand movements, giving the user an immersive experience; positioning the user's movements is therefore an important factor in the quality of the VR experience.
At present, the prior art typically uses inside-out spatial positioning to achieve 6-degree-of-freedom tracking of the HMD, and on that basis determines the relative position of the handle and the HMD through optical, ultrasonic, electromagnetic or similar positioning schemes, so as to establish the handle's pose in the world coordinate system and thereby position the handle. The handle's movements can then be mapped to corresponding changes in the image displayed on the HMD, giving the user an immersive experience.
However, these schemes suffer from low handle-positioning accuracy. In the ultrasonic scheme, for example, if another object comes between the acoustic receiver mounted on the HMD and the acoustic transmitter mounted on the handle, the waves transmitted between the two are occluded or reflected, degrading the accuracy with which the handle is positioned.
Summary of the Invention
This application provides a positioning method, apparatus and device for a control device, a storage medium and a computer program product, to improve the positioning accuracy of the control device.
In a first aspect, a positioning method for a control device is provided. The method is applied to a first control device on which a camera is mounted, and includes: capturing the external environment of the first control device at time T1 via the camera to obtain first point cloud data; obtaining inertial measurement unit data of the first control device at time T1 and a first point cloud map of the external environment; and determining 6-degree-of-freedom data of the first control device from the first point cloud data, the inertial measurement unit data and the first point cloud map, where the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and a second control device.
In a second aspect, a positioning apparatus is provided, including a collection module, an acquisition module and a first determination module. The collection module is configured to capture the external environment of the first control device at time T1 via a camera to obtain first point cloud data; the acquisition module is configured to obtain inertial measurement unit data of the first control device at time T1 and a first point cloud map of the external environment; and the first determination module is configured to determine 6-degree-of-freedom data of the first control device from the first point cloud data, the inertial measurement unit data and the first point cloud map, where the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and a second control device.
In a third aspect, a control device is provided, including a processor and a memory, where the memory is configured to store a computer program and the processor is configured to call and run the computer program stored in the memory to perform the method of the first aspect or any of its implementations.
In a fourth aspect, a computer-readable storage medium is provided for storing a computer program that causes a computer to perform the method of the first aspect or any of its implementations.
In a fifth aspect, a computer program product is provided, including computer program instructions that cause a computer to perform the method of the first aspect or any of its implementations.
In a sixth aspect, a computer program is provided that causes a computer to perform the method of the first aspect or any of its implementations.
Through the technical solution of this application, the first control device can capture the external environment it is in at time T1 via its mounted camera to obtain first point cloud data, and combine this with the obtained inertial measurement unit data of the first control device at time T1 and the first point cloud map of the external environment to determine the 6-degree-of-freedom data of the first control device, where the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device. In this process, when the first control device is a handle, there is no need to position the handle by the prior-art method of taking the HMD's 6-degree-of-freedom data and combining it with the relative position of handle and HMD; instead, the handle's 6-degree-of-freedom data is determined from the current point cloud data of the handle's environment, the handle's IMU data and the point cloud map of that environment, so that the handle tracks itself. The causes of low positioning accuracy therefore do not arise; for example, when the relative position of handle and HMD is determined by the ultrasonic scheme described above, an object inserted between the acoustic receiver on the HMD and the acoustic transmitter on the handle occludes or reflects the waves transmitted between the two and degrades the handle-positioning accuracy. Moreover, the point cloud map used in determining the handle's 6-degree-of-freedom data includes the historical point cloud data, relative to the current moment, collected from the environment by the second control device (e.g., the HMD) and the first control device (e.g., the handle), so the map contains relatively rich point cloud data, further improving positioning accuracy. This application therefore solves the prior-art problem of low positioning accuracy for control devices such as handles, and improves the positioning accuracy of the control device.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Figure 1 is an application scenario diagram provided by an embodiment of this application;
Figure 2 is a flow chart of a positioning method for a control device provided by an embodiment of this application;
Figure 3 is a schematic diagram of a positioning apparatus provided by an embodiment of this application;
Figure 4 is a schematic block diagram of a first control device 400 provided by an embodiment of this application;
Figure 5 is a schematic block diagram of a second control device 500 provided by an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of the present invention.
It should be noted that the terms "first", "second" and the like in the specification, claims and drawings of the present invention are used to distinguish similar objects and not necessarily to describe a specific order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the present invention described here can be implemented in orders other than those illustrated or described here. Furthermore, the terms "include" and "have" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product or server that includes a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to the process, method, product or device.
In the embodiments of this application, words such as "exemplary" or "for example" are used to denote an example, illustration or explanation, and any embodiment or solution described as "exemplary" or "for example" should not be construed as preferable to or more advantageous than other embodiments or solutions. Rather, the use of such words is intended to present the relevant concept in a concrete manner.
As stated above, the prior art commonly determines the relative position of the handle and the HMD through optical, ultrasonic, electromagnetic or similar positioning schemes and then combines this with 6-degree-of-freedom tracking of the HMD to position the handle. This positions the handle with low accuracy; for example, in the ultrasonic scheme, if another object is inserted between the acoustic receiver on the HMD and the acoustic transmitter on the handle, the waves transmitted between the two are occluded or reflected, degrading the accuracy of the handle's position.
To solve the above technical problem, the inventive concept of this application is: the first control device can collect, via its mounted camera, point cloud data of the external environment it is in at time T1, and then determine its 6-degree-of-freedom data by combining the obtained IMU data of the first control device at time T1 with a point cloud map of that external environment, where the point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device.
It should be understood that the technical solution of this application can be applied to, but is not limited to, the following scenarios:
By way of example, Figure 1 is an application scenario diagram provided by an embodiment of this application. As shown in Figure 1, the scenario can include a first control device 110 and a second control device 120, where a camera is mounted on the first control device 110 and the two devices can communicate with each other.
In some implementations, the first control device 110 can be the handle of an all-in-one VR machine; this application does not limit this.
In some implementations, as shown in Figure 1, the cameras mounted on the first control device 110 can be three fisheye cameras located on the first control device 110; this application does not limit this.
It should be understood that a fisheye camera has a large field of view, generally around 160 degrees, and can therefore capture a wide range of scenery well at close range.
In some implementations, the second control device 120 can be an HMD, such as the head-mounted display of an all-in-one VR machine; this application does not limit this.
It should be understood that the numbers of first control devices, second control devices and cameras in Figure 1 are merely illustrative; in practice, any number of first control devices, second control devices and cameras can be provided as actually needed, and this application does not limit this.
Having introduced the application scenario of the embodiments of this application, the technical solution of this application is elaborated below:
Figure 2 is a flow chart of a positioning method for a control device provided by an embodiment of this application. The method can be executed by, but is not limited to, the first control device 110 and the second control device 120 shown in Figure 1. As shown in Figure 2, the method can include the following steps:
S201: the first control device captures the external environment it is in at time T1 via the camera to obtain first point cloud data;
S202: the first control device obtains its inertial measurement unit data at time T1 and a first point cloud map of the external environment, where the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device;
S203: the first control device determines its 6-degree-of-freedom data from the first point cloud data, the inertial measurement unit data and the first point cloud map.
Here, a camera is mounted on the first control device.
It should be noted that in the following embodiments this application takes the first control device 110 as the handle of an all-in-one VR machine and the second control device 120 as the head-mounted display of an all-in-one VR machine as an example, and describes the technical solution of this application in detail.
In some implementations, the external environment the handle is in can be the safe area set while the handle was in initial mode, where the safe area is an area, set before the user experiences the VR scene, within which the user can move; for example, the safe area can be set as a square area with a side length of 1 meter. This application does not limit this.
It should be noted that the specific setting of the safe area and the handle's initial mode are described in detail in the embodiments concerning the initial mode below, and are not repeated here.
It should be understood that because the first point cloud map in S202 includes historical point cloud data, relative to time T1, collected from the external environment by both the handle and the head-mounted display, and point cloud data for regions the head-mounted display cannot easily reach can be collected through the handle's camera, the first point cloud map contains relatively rich point cloud data, which can further improve the accuracy of handle positioning. A sketch of the S201-S203 flow is given below.
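The following minimal sketch illustrates the S201-S203 flow on the first control device. All names in it (Pose6DoF, capture_point_cloud, read, load_first_map, solve_6dof) are illustrative assumptions rather than an API defined by the patent, and solve_6dof merely stands in for the SLAM/VIO back end discussed later.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    x: float; y: float; z: float           # translation in the shared map frame
    pitch: float; roll: float; yaw: float  # rotation

def solve_6dof(point_cloud, imu_data, point_cloud_map):
    # Stand-in for the SLAM/VIO back end: in the actual method, the camera's
    # point cloud is registered against the shared map to correct IMU drift.
    px, py, pz = imu_data["relative_position"]
    rp, rr, ry = imu_data["relative_attitude"]
    return Pose6DoF(px, py, pz, rp, rr, ry)

def locate_handle(camera, imu, map_store, t1):
    first_point_cloud = camera.capture_point_cloud(t1)         # S201
    imu_data = imu.read(t1)                                    # S202: IMU data at T1
    first_map = map_store.load_first_map()                     # S202: shared first map
    return solve_6dof(first_point_cloud, imu_data, first_map)  # S203
```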
In some implementations, the handle can obtain the inertial measurement unit data through the inertial measurement unit inside the handle.
It should be understood that an inertial measurement unit is a device that measures an object's angular velocity and acceleration in three-dimensional space; its main components are gyroscopes, accelerometers and magnetometers. In general, an inertial measurement unit can include three single-axis gyroscopes and three single-axis accelerometers, where the gyroscopes detect the object's angular velocity and the accelerometers detect its acceleration. The quantities the unit measures directly are therefore acceleration and angular velocity, but it can obtain indirect measurements by integration: integrating the angular velocity once yields the angle, i.e., the relative attitude, and integrating the acceleration twice yields the distance, i.e., the relative position. The handle's inertial measurement unit data can thus represent the handle's relative position and relative attitude.
However, because the inertial measurement unit's position and attitude computation always refers back to the data of the previous moment, error accumulates over time. When the handle determines its 6-degree-of-freedom data from the inertial measurement unit data at time T1, it therefore needs the point cloud data of its environment collected at time T1 to correct this error, combined with the first point cloud map of the environment, and then determines the 6-degree-of-freedom data through Simultaneous Localization And Mapping (SLAM) technology. The 6-degree-of-freedom data can represent the handle's real-world position and attitude in the coordinate system corresponding to the point cloud data at time T1, i.e., the coordinate system of the first point cloud map of the environment. The toy integration loop after this passage illustrates the drift.
Degrees of freedom fall into two classes: translation and rotation. Translation comprises forward/backward, up/down and left/right; rotation comprises pitch, roll and yaw. The 3 translational degrees of freedom and the 3 rotational degrees of freedom together make up the 6 degrees of freedom: the translations correspond to the relative position in the inertial measurement unit data and the rotations to the relative attitude. However complex a rigid body's possible motion, it can be expressed as a combination of translation and rotation, i.e., by 6-degree-of-freedom data. Once the handle's 6-degree-of-freedom data is determined, its real-world motion can therefore be judged accurately and mapped into the VR scene, i.e., the image displayed on the head-mounted display is adjusted accordingly, giving the user an immersive experience.
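As a hedged illustration of the drift discussed above, the toy one-axis loop below integrates angular velocity once and acceleration twice; a constant sensor bias grows without bound in the output, which is why the point cloud collected at T1 is needed as a correction. A real IMU works in three dimensions with quaternions and gravity compensation; this is for intuition only.

```python
def dead_reckon(samples, dt):
    """samples: iterable of (angular_velocity, acceleration) at fixed step dt."""
    angle = 0.0      # one integration of angular velocity -> relative attitude
    velocity = 0.0   # first integration of acceleration
    position = 0.0   # second integration of acceleration -> relative position
    for angular_velocity, acceleration in samples:
        angle += angular_velocity * dt
        velocity += acceleration * dt
        position += velocity * dt
    return angle, position

# A constant bias of 0.01 in each reading accumulates in the output; the
# longer the run, the larger the error, hence the need for the T1 correction.
print(dead_reckon([(0.01, 0.01)] * 1000, dt=0.001))
```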
It should be understood that SLAM technology is applied in a very wide range of fields, such as VR, robotics, drones and autonomous driving; it mainly solves the problem of localizing an object and building a map while the object moves through an unknown environment. In the VR field, SLAM technology can compute the 6-degree-of-freedom data of the handle and HMD in three-dimensional space, and can also build maps with more realistic visual effects, so that virtual objects are rendered and overlaid for the user's current viewpoint more convincingly.
Depending on the sensor used, SLAM technology can be divided into 2D/3D SLAM based on lidar (2-Dimensional/3-Dimensional Simultaneous Localization And Mapping), RGB-D SLAM based on depth cameras (RGB + Depth map Simultaneous Localization And Mapping), VSLAM based on visual sensors (Visual Simultaneous Localization And Mapping), and VIO based on visual sensors plus an IMU (Visual Inertial Odometry). For example, in the VR field, the handle's 6DOF data can be obtained through VIO.
In some implementations, the handle can compare whether the first point cloud data collected at time T1 differs from the first point cloud map; if it does, the handle can update the differing part into the first point cloud map and send that differing part to the head-mounted display.
It should be noted that determining the 6-degree-of-freedom data via SLAM already involves a step that compares the first point cloud data collected at time T1 with the first point cloud map, so the handle can reuse that step of the SLAM pipeline to perform this comparison.
In some implementations, the handle can update the differing part between the first point cloud data and the first point cloud map into the first point cloud map, and send it to the head-mounted display, only when a target condition is met.
In some implementations, the target condition includes at least one of the following: a preset time has elapsed since time T1; or the accumulated size of the difference between the first point cloud data and the first point cloud map, plus the differences between point cloud data collected by the handle's camera after time T1 and the first point cloud map, reaches a preset size. This reduces how often the handle updates the map or transmits differences, and thus reduces power consumption. A sketch of this batching logic follows.
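The sketch below uses assumed names and threshold values, since the patent fixes neither the preset time nor the preset size:

```python
import time

class DiffBatcher:
    """Sketch of the target-condition logic: map diffs are buffered and only
    flushed (applied to the first point cloud map and sent to the HMD) once a
    preset time has elapsed since T1 or the accumulated diff size reaches a
    preset size, reducing update/send frequency and power consumption."""

    def __init__(self, point_cloud_map, send_to_hmd, max_age_s=1.0, max_points=4096):
        self.map = point_cloud_map      # shared first point cloud map (assumed API)
        self.send_to_hmd = send_to_hmd  # wireless send function (assumed)
        self.max_age_s = max_age_s      # "preset time since T1" (assumed value)
        self.max_points = max_points    # "preset size" (assumed value)
        self.pending, self.t1 = [], None

    def add_diff(self, diff_points):
        """diff_points: points present in a fresh capture but not in the map."""
        if not diff_points:
            return
        if self.t1 is None:
            self.t1 = time.monotonic()  # T1: first capture with an unflushed diff
        self.pending.append(diff_points)
        aged = time.monotonic() - self.t1 >= self.max_age_s
        big = sum(len(d) for d in self.pending) >= self.max_points
        if aged or big:                 # either target condition triggers a flush
            for diff in self.pending:
                self.map.update(diff)   # update the first point cloud map
            self.send_to_hmd(self.pending)  # send the same diffs to the HMD
            self.pending, self.t1 = [], None
```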
In some implementations, the head-mounted display can determine its own 6-degree-of-freedom data by a method similar to the handle's: at time T1 it can capture the external environment through its own camera to obtain second point cloud data, and combine this with its own inertial measurement unit data and the first point cloud map.
It should be noted that because the first point cloud map can be updated, at time T1 the handle can collect the first point cloud data and combine it with its own inertial measurement unit data and the first point cloud map to determine its own 6-degree-of-freedom data, while the head-mounted display can likewise collect point cloud data of the environment and combine it with its own inertial measurement unit data and the first point cloud map to determine its own 6-degree-of-freedom data. The first point cloud map each uses is thus the same, which guarantees that the handle's real-world motion and the head-mounted display's real-world motion remain consistent when mapped into the virtual space.
In some implementations, the handle can receive the differing part between the second point cloud data and the first point cloud map sent by the head-mounted display, and update it into the first point cloud map; the second point cloud data is the point cloud data the head-mounted display captured of the external environment at time T1 through its own camera.
It should be understood that the handle can receive from the head-mounted display the difference between the second point cloud data and the first point cloud map, and can send to it the difference between the first point cloud data and the first point cloud map, and both devices can update both differences into the first point cloud map. In other words, the handle and the head-mounted display can update the first point cloud map by exchanging the point cloud data they collect. This not only enriches the point cloud data available in the first point cloud map when either device determines its 6-degree-of-freedom data, improving its accuracy, but also ensures that the two devices use the same point cloud data when determining their respective 6-degree-of-freedom data, so that they track in the same coordinate system. The motion changes of the handle and the head-mounted display then map consistently onto the image changes shown in the head-mounted display, guaranteeing the user's experience. The same coordinate system can, for example, be a world coordinate system; this application does not limit this. A minimal sketch of this mutual update is given below.
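The following minimal sketch, with points modeled as hashable tuples purely for brevity, shows the mutual-update idea: each side computes its diff against the shared map, exchanges it with the peer, and applies both diffs, so the two copies of the map stay identical:

```python
class SharedPointCloudMap:
    """Toy model of the first point cloud map kept by each device."""

    def __init__(self):
        self.points = set()

    def diff_against(self, captured_points):
        """Return the part of a fresh capture not yet in the map."""
        return set(captured_points) - self.points

    def apply(self, diff):
        """Merge a diff, whether locally computed or received from the peer."""
        self.points |= diff

# One exchange step, symmetric for both devices:
handle_map, hmd_map = SharedPointCloudMap(), SharedPointCloudMap()
handle_diff = handle_map.diff_against({(0.1, 0.2, 0.3)})  # from the handle camera
hmd_diff = hmd_map.diff_against({(1.0, 1.1, 1.2)})        # from the HMD camera
for m in (handle_map, hmd_map):                           # each side applies both diffs
    m.apply(handle_diff)
    m.apply(hmd_diff)
assert handle_map.points == hmd_map.points                # the maps stay identical
```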
In some implementations, the handle can store the first point cloud map in a target database; for example, the target database can be a database local to the handle or a cloud database of the handle, and this application does not limit this. Similarly, the head-mounted display can also store the first point cloud map in the target database.
In some implementations, when the handle stores the first point cloud map in the target database, it can simultaneously store an identifier of the external environment corresponding to the first point cloud map, such as the identifier of a preconfigured safe area. The identifier uniquely denotes the safe area; for example, it can be the safe area's name. This application does not limit this.
By way of example, assume the safe area set by the user is a square area with a side length of 1 meter at the center of conference room 1, the safe area's identifier is "safe area 1", and the first point cloud map is the point cloud map of that safe area. When the handle needs to store the first point cloud map in the target database, it can then also store the identifier "safe area 1".
It should be understood that the way the head-mounted display stores a point cloud map is similar to the way the handle stores the first point cloud map, and is not repeated here.
In some implementations, the target database can be a database shared by the handle and the head-mounted display; the shared database can be local to the handle, local to the head-mounted display, or a cloud database, and this application does not limit this.
在一些可实现方式中,手柄在确定出6自由度数据之后,可以将6自由度数据发送给头戴显示设备。头戴显示设备接收到手柄发送的6自由度数据之后,可以根据6自由度数据相应调整头戴显示设备上显示的图像。
示例性的,假设用户体验的VR场景中需要用户有一个打开窗户的动作,那么当用户通过手做出打开窗户的动作时,手柄就可以使用鱼眼摄像头采集此时的环境的点云数据,通过惯性测量单元获取到手柄的惯性测量单元数据,并结合第一点云地图,然后通过SLAM技术确定出手柄的6自由度数据,并将其发送给头戴显示设备。头戴显示设备接收到手柄发送的6自由度数据之后,可以调整头戴显示设备上显示的图像为窗户被打开了。
In some implementations, after obtaining the IMU data, the handle may send the IMU data to the head-mounted display. On receiving it, the head-mounted display can predict the handle's motion and thus pre-adjust the displayed image before the handle's 6-degree-of-freedom data arrives. This mitigates the delay caused by the handle determining the 6-degree-of-freedom data and sending it to the head-mounted display, and hence the resulting mismatch between the image changes on the head-mounted display and the handle's motion.
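As a non-limiting illustration, one simple form of such prediction is constant-velocity extrapolation over the expected link latency, sketched below in Python (a small-angle Euler update is assumed; values are illustrative).

```python
import numpy as np

def predict_pose(pos, rot, velocity, angular_rate, latency):
    # Constant-velocity extrapolation over the expected latency, applied
    # until the handle's next 6-DoF packet arrives.
    return pos + velocity * latency, rot + angular_rate * latency

pos, rot = predict_pose(np.zeros(3), np.zeros(3),
                        velocity=np.array([0.1, 0.0, 0.0]),
                        angular_rate=np.array([0.0, 0.0, 0.5]),
                        latency=0.02)
```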
In some implementations, the handle may send a time synchronization message to the head-mounted display. After receiving the message, the head-mounted display can determine from it the time difference between the handle's time system and the head-mounted display's time system, so that the two time systems stay consistent during data exchange between the handle and the head-mounted display. The time synchronization message contains the time in the handle's time system.
In other implementations, the head-mounted display may send a time synchronization message to the handle; the handle can receive the message and determine from it the time difference between the head-mounted display's time system and the handle's time system, so that the two time systems stay consistent during data exchange. The time synchronization message contains the time in the head-mounted display's time system.
Illustratively, suppose the head-mounted display powers on at time T1 and the handle powers on at time T2, with T2 later than T1. After both power on, the head-mounted display can send a time synchronization message containing time T1 to the handle; on receiving it, the handle can determine from times T1 and T2 the time difference between the head-mounted display's time system and its own, so that the handle's time system and the head-mounted display's time system can be kept consistent.
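As a non-limiting illustration, the offset computation reduces to a subtraction, sketched below in Python. The sketch assumes negligible transmission delay; a real protocol would typically also estimate the link delay, e.g. from a round-trip exchange.

```python
def clock_offset(remote_time, local_time):
    # Offset such that: remote_timestamp + offset == local_timestamp.
    return local_time - remote_time

# HMD reports its clock (e.g. 7.5 s since boot); handle's clock reads 10.0 s.
offset = clock_offset(remote_time=7.5, local_time=10.0)

def to_local(remote_timestamp):
    return remote_timestamp + offset

assert to_local(7.5) == 10.0
```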
In some implementations, before S201, the handle's mode may be switched from the initial mode to the self-tracking mode in any of the following ways, without being limited to them:
Way 1: before the handle collects the first point cloud data, the handle may obtain a mode switching instruction and, in response to the mode switching instruction, switch the handle's mode from the initial mode to the self-tracking mode. The initial mode may be a mode in which the handle is operated through an arm model, and the self-tracking mode is the mode in which the handle performs S201 to S204 above.
Illustratively, the mode switching instruction may be the user pressing a button on the handle, or a tap, double-tap, long press, swipe, hovering touch gesture or similar operation on an area of the handle's screen; it may also be the user pressing a button on the head-mounted display, or a tap, double-tap, long press, swipe, hovering touch gesture or similar operation on an area of the head-mounted display's screen. This application does not limit this.
Way 2: after the initial mode ends, the handle's mode is automatically switched from the initial mode to the self-tracking mode. The handle's initial mode is described in detail in the embodiments below.
It should be understood that this application does not limit the way in which the handle's mode is switched from the initial mode to the self-tracking mode.
In some implementations, after the handle and the head-mounted display power on, the handle enters the initial mode, in which menu selection and safe area setup need to be completed.
It should be understood that before the safe area is set, the point cloud data collected by the handle and by the head-mounted display do not share a unified coordinate system. The handle can send its IMU data to the head-mounted display, and the head-mounted display can determine from it the handle's 3-degree-of-freedom data, i.e. the pitch, roll and yaw rotational degrees of freedom, which is the handle's relative attitude. Without a unified coordinate system, however, the handle's position in the same coordinate system as the head-mounted display cannot yet be determined. The head-mounted display can therefore determine the handle's position by combining its built-in arm model, or it can photograph the handle with its camera to determine the relative position and attitude between the handle and the head-mounted display and compute the handle's position through a transformation. Only once the head-mounted display has obtained the handle's position and attitude can the safe area setup and menu selection be completed; and because the coordinate system is built on the point cloud data in the safe area, once the safe area setup is complete the handle and the head-mounted display are unified into the same coordinate system.
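As a non-limiting illustration, the "compute the handle's position through a transformation" step amounts to chaining rigid-body transforms, sketched below in Python with 4×4 homogeneous matrices (the example poses are made up for illustration).

```python
import numpy as np

def handle_world_pose(T_world_hmd, T_hmd_handle):
    # The HMD camera observes the handle's pose relative to the HMD;
    # composing it with the HMD's own pose yields the handle's pose
    # in the shared frame.
    return T_world_hmd @ T_hmd_handle

T_world_hmd = np.eye(4)
T_world_hmd[:3, 3] = [0.0, 1.6, 0.0]      # HMD 1.6 m above the origin
T_hmd_handle = np.eye(4)
T_hmd_handle[:3, 3] = [0.2, -0.4, -0.3]   # handle seen in front of the HMD
print(handle_world_pose(T_world_hmd, T_hmd_handle)[:3, 3])  # [ 0.2  1.2 -0.3]
```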
In some implementations, menu selection in the initial mode may include: selection of the VR scene, selection of the network to connect to such as Wireless Fidelity (WiFi), and so on. The safe area is the area in which the user can move, set before the user experiences the VR scene; it may, for example, be a square area with a 1-meter side length in the user's environment, or a rectangular area 1 meter long and 0.5 meters wide in that environment. This application does not limit the specific content of the menu selection, the extent of the safe area, and the like.
In some implementations, when the handle is in the initial mode, the user may press a button on the handle, and the head-mounted display may complete the menu selection or safe area setup of the initial mode in response to the pressing operation; the user may also rotate the handle, for example using the handle to trace out the extent of the safe area to be set, and the head-mounted display may complete the menu selection or safe area setup of the initial mode in response to the rotation operation. This application does not limit the methods of menu selection, safe area setup, and the like.
In some implementations, when setting the safe area, the head-mounted display may collect point cloud data of its environment at time T2 and look up whether the target database stores this point cloud data. If the target database stores it, i.e. a point cloud map stored in the target database includes the point cloud data collected at time T2, the head-mounted display may directly set the safe area corresponding to that point cloud map as the safe area.
In other implementations, when setting the safe area, the head-mounted display may collect point cloud data of its environment at time T2 and look up whether the target database stores this point cloud data. If the target database does not store it, i.e. no point cloud map stored in the target database includes the point cloud data collected at time T2, the head-mounted display may set up a new safe area in cooperation with the handle.
In still other implementations, when setting the safe area, the head-mounted display may collect point cloud data of its environment at time T2 and look up whether the target database stores this point cloud data. If the target database stores it, i.e. a point cloud map stored in the target database includes the point cloud data collected at time T2, the head-mounted display may ask the user whether to use the safe area corresponding to that point cloud map. If the user confirms that it should be used, the head-mounted display may set the safe area corresponding to that point cloud map as the safe area; if the user confirms that it should not be used, the head-mounted display may set up a new safe area in cooperation with the handle.
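As a non-limiting illustration, the three alternatives above can be summarized as a single decision flow; in the Python sketch below, `contains` and `ask_user` are hypothetical injected callables standing in for the map lookup and the user prompt.

```python
def setup_new_safe_area(pc_t2):
    # Placeholder: in the embodiment, the HMD and handle build a new
    # point cloud map together for the newly set safe area.
    return "new_safe_area"

def choose_safe_area(db, pc_t2, contains, ask_user):
    # Reuse a stored safe area whose map contains the T2 point cloud,
    # optionally after confirming with the user; otherwise rebuild.
    for area_id, pc_map in db.items():
        if contains(pc_map, pc_t2):
            if ask_user(area_id):
                return area_id
            return setup_new_safe_area(pc_t2)
    return setup_new_safe_area(pc_t2)
```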
In some implementations, when the safe area needs to be set up anew as described above, the head-mounted display may send the point cloud data of the environment collected at time T2 to the handle; that is, the handle may receive the historical point cloud data, relative to time T2, collected from the external environment by the head-mounted display. Both the handle and the head-mounted display can then construct a second point cloud map from this historical point cloud data; the second point cloud map is the point cloud map corresponding to the newly set safe area, and both the handle and the head-mounted display can use this second point cloud map to determine their own 6-degree-of-freedom data in the self-tracking mode.
In other implementations, when the safe area does not need to be set up anew, i.e. a point cloud map stored in the target database includes the point cloud data collected at time T2, the head-mounted display may send the difference part between the point cloud data collected at time T2 and that point cloud map to the handle and update the difference part into that point cloud map; after receiving the difference part between the point cloud data collected at time T2 and the point cloud map sent by the head-mounted display, the handle may update the difference part into its own stored copy of that point cloud map.
In summary, the technical solutions provided by the above embodiments bring at least the following beneficial effects. The first control device can use its installed camera to collect the external environment in which the first control device is located at time T1 to obtain the first point cloud data, and combine the obtained inertial measurement unit data of the first control device at time T1 with the first point cloud map of the external environment to determine the first control device's 6-degree-of-freedom data, where the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and the second control device. In this process, when the first control device is a handle, there is no need to adopt the prior-art method of locating the handle from the HMD's 6-degree-of-freedom data combined with the relative positional relationship between the handle and the HMD; instead, the handle's 6-degree-of-freedom data is determined from the current-moment point cloud data of the handle's environment, the handle's IMU data, and the point cloud map of the environment, achieving self-tracking of the handle. The causes of low positioning accuracy are thus avoided. For example, in the above ultrasonic positioning scheme for determining the relative positional relationship between the handle and the HMD, if another object is inserted between the acoustic receiver installed on the HMD and the acoustic transmitter installed on the handle, it blocks or reflects the sound waves transmitted between them, degrading the handle's positioning accuracy. Moreover, the point cloud map used in determining the handle's 6-degree-of-freedom data includes historical point cloud data, relative to the current moment, collected from the environment by the second control device such as the HMD and by the first control device such as the handle, so the point cloud data contained in the map is relatively rich, which further improves positioning accuracy. This application thus solves the prior-art problem of low positioning accuracy of control devices such as handles, and improves the positioning accuracy of the control device.
Further, the handle may, when a target condition is met, update the difference part between the first point cloud data and the first point cloud map into the first point cloud map and send that difference part to the head-mounted display. For example, the target condition may include at least one of the following: a preset duration has elapsed since time T1; the cumulative size of the difference part between the first point cloud data and the first point cloud map, plus the difference parts between the first point cloud map and point cloud data collected by the handle's camera from the external environment after time T1, reaches a preset size. In this way, the number of times the handle updates or sends difference parts can be reduced, thereby lowering power consumption.
Still further, in the initial mode the handle can receive point cloud data sent by the head-mounted display and can, based on that point cloud data, update or construct the same point cloud map as the head-mounted display's. This guarantees that both the handle and the head-mounted display can determine their respective 6-degree-of-freedom data from this same point cloud map in the handle's self-tracking mode, and thus that the handle's motion in the real world and the head-mounted display's motion in the real world are displayed consistently when each is mapped into the virtual space.
Fig. 3 is a schematic diagram of a positioning apparatus provided by an embodiment of this application. The positioning apparatus may be the handle 110 shown in Fig. 1, and the positioning apparatus includes:
a collection module 301, configured to collect, through a camera, an external environment in which a first control device is located at time T1 to obtain first point cloud data;
an acquisition module 302, configured to acquire inertial measurement unit data of the first control device at time T1 and a first point cloud map of the external environment;
a first determination module 303, configured to determine 6-degree-of-freedom data of the first control device according to the first point cloud data, the inertial measurement unit data and the first point cloud map;
wherein the first point cloud map includes historical point cloud data, relative to time T1, collected from the external environment by the first control device and a second control device.
In some implementations, the positioning apparatus further includes a comparison module 304 and an update-and-send module 305. The comparison module 304 is configured to compare the first point cloud data with the first point cloud map to see whether a difference exists; the update-and-send module 305 is configured to, if a difference exists between the first point cloud data and the first point cloud map, update the difference part between the first point cloud data and the first point cloud map into the first point cloud map and send that difference part to the second control device.
In some implementations, the update-and-send module 305 is specifically configured to, when a target condition is met, update the difference part between the first point cloud data and the first point cloud map into the first point cloud map and send that difference part to the second control device.
In some implementations, the target condition includes at least one of the following: a preset duration has elapsed since time T1; the cumulative size of the difference part between the first point cloud data and the first point cloud map, plus the difference parts between the first point cloud map and point cloud data collected from the external environment through the camera after time T1, reaches a preset size.
In some implementations, the positioning apparatus further includes a first receiving module 306 and an update module 307. The first receiving module 306 is configured to receive the difference part between second point cloud data and the first point cloud map sent by the second control device; the update module 307 is configured to update that difference part into the first point cloud map, where the second point cloud data is point cloud data collected from the external environment at time T1 by the second control device through its own camera.
In some implementations, the positioning apparatus further includes a switching module 308, configured to switch the mode of the first control device from an initial mode to a self-tracking mode, where the initial mode is a mode in which the first control device is operated through an arm model.
In some implementations, the positioning apparatus further includes a second receiving module 309 and a second determination module 310. The second receiving module 309 is configured to receive, when the first control device's mode is the initial mode, historical point cloud data, relative to time T2, collected from the external environment by the second control device; the second determination module 310 is configured to obtain a second point cloud map from the historical point cloud data, relative to time T2, collected from the external environment by the second control device.
In some implementations, the first point cloud map is stored in a target database; the target database is a database shared by the first control device and the second control device.
It should be understood that this apparatus embodiment and the method embodiment on the control device side correspond to each other, and similar descriptions can refer to the method embodiment corresponding to the control device. To avoid repetition, they are not repeated here.
Specifically, the positioning apparatus shown in Fig. 3 can execute the above method embodiment on the control device side, and the foregoing and other operations and/or functions of the modules in the positioning apparatus shown in Fig. 3 respectively implement the corresponding flows of that method embodiment; for brevity, they are not repeated here.
The above method embodiment on the control device side has been described above from the perspective of functional modules with reference to the accompanying drawings. It should be understood that the functional modules may be implemented in hardware, through instructions in software form, or through a combination of hardware and software modules. Specifically, the steps of the method embodiments in the embodiments of this application can be completed by integrated logic circuits of hardware in a processor and/or by instructions in software form; the steps of the methods disclosed in the embodiments of this application can be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. Optionally, the software module may be located in a storage medium mature in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory or a register. The storage medium is located in the memory; the processor reads the information in the memory and completes the steps of the above method embodiments in combination with its hardware.
Fig. 4 is a schematic block diagram of a first control device 400 provided by an embodiment of this application. The first control device 400 may be the handle that executes the above method embodiments.
As shown in Fig. 4, the first control device 400 may include one or more of the following components: an IMU sensor module 401, a camera module 402, a linear motor module 403, a wireless communication module 404, an ANT 405, a touch-button input module 406, a user input module 407, and a processor 408.
The IMU sensor module 401 is configured to detect inertial measurement unit data of the first control device 400, the inertial measurement unit data including the angular velocity and acceleration of the first control device 400 in three-dimensional space. The IMU sensor module 401 may include three single-axis accelerometers, three single-axis gyroscopes and three single-axis magnetometers, and can measure and report velocity, direction, gravity and the like through the combination of accelerometer, gyroscope and magnetometer sensors.
The camera module 402 is configured to collect point cloud data of the environment in which the first control device 400 is located. For example, the camera module 402 may include multiple fisheye cameras; when the first control device 400 is in the self-tracking mode, the fisheye cameras are configured to collect point cloud data of that environment.
The linear motor module 403 is configured to provide the user with interactive feedback such as vibration. For example, when the user presses a button on the first control device 400, the linear motor module 403 is configured to produce a vibration as feedback; or, when the motion range of the first control device or the second control device exceeds the safe area, the linear motor module 403 is configured to produce a vibration as a reminder, giving the user a better experience.
The wireless communication module 404 is configured for communication between the first control device and the second control device. For example, the wireless communication module 404 may send the first control device's 6-degree-of-freedom data, inertial measurement unit data, button data, time synchronization messages, point cloud data and the like to the second control device, and may also receive point cloud data, time synchronization messages, control information and the like sent by the second control device to the first control device. The wireless communication module 404 may include a wireless chip, and may further include the ANT 405.
The touch-button input module 406 is configured to provide the user with button or touch operations on the first control device.
The user input module 407 is configured to provide the user with input operations on the first control device.
The processor 408 is configured to execute the above method embodiments according to instructions in the computer program, for example handling the data exchange between the first control device and the second control device, and determining 6-degree-of-freedom data from point cloud data and inertial measurement unit data.
In some embodiments of this application, the processor 408 may include, but is not limited to: a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on.
It should be understood that the components in the first control device 400 are connected by a bus system which, in addition to a data bus, includes a power bus, a control bus, a status signal bus and the like.
Fig. 5 is a schematic block diagram of a second control device 500 provided by an embodiment of this application. The second control device 500 may be the head-mounted display in the above embodiments.
As shown in Fig. 5, the second control device 500 may include one or more of the following components: a button input and LED display module 501, a camera module 502, a handle wireless communication module 503, an IPD detection module 504, a USB 3.0 interface module 505, a display module 506, an audio input/output module 507, a power management module 508, a WIFI/BT module 509, a memory module 510, a distance sensor detection module 511, an IMU sensor module 512, a processor 513, and an ANT 514.
The button input and LED display module 501 is configured to provide the user with button or input operations on the second control device, and to display indicator lights for power-on and the like.
The camera module 502 is configured to collect point cloud data of the environment in which the second control device 500 is located. For example, the camera module 502 may include multiple fisheye cameras, and the fisheye cameras are configured to collect point cloud data of that environment.
The handle wireless communication module 503 is configured for communication between the first control device and the second control device. For example, the handle wireless communication module 503 may send the second control device's 6DOF data, inertial measurement unit data, button data, time synchronization messages, point cloud data and the like to the first control device, and may also receive point cloud data, time synchronization messages, control information and the like sent by the first control device to the second control device. The handle wireless communication module 503 may include a wireless chip, for example a Bluetooth chip, a WiFi chip, an ad chip, or an ultra-wideband (Ultra Wide Band, UWB) chip. The handle wireless communication module 503 may also include the ANT 514. Further, the handle wireless communication module 503 may also include a processor.
The IPD detection module 504 is configured to detect the user's interpupillary distance.
The USB 3.0 interface module 505 is configured to connect external devices.
The display module 506 is configured to display images in the second control device. The display module 506 can adjust the images displayed in the second control device in real time according to the 6DOF data of the handle and the HMD.
The audio input/output module 507 is configured to input or output audio data. The audio input/output module 507 can receive voice data input by the user and can also output audio data to the user, and it may include speakers, a microphone, loudspeakers and the like.
The power management module 508 is configured to distribute and supply electric power to the components of the second control device 500.
The WIFI/BT module 509 is configured for communication between the first control device and the second control device. For example, the WIFI/BT module 509 may send the second control device's 6-degree-of-freedom data, inertial measurement unit data, button data, time synchronization messages, point cloud data and the like to the first control device, and may also receive point cloud data, time synchronization messages, control information and the like sent by the handle.
The memory module 510 is configured to store computer programs, point cloud data and the like.
In some embodiments of this application, the memory module 510 includes, but is not limited to, volatile memory and/or non-volatile memory. The non-volatile memory may be read-only memory (Read-Only Memory, ROM), programmable read-only memory (Programmable ROM, PROM), erasable programmable read-only memory (Erasable PROM, EPROM), electrically erasable programmable read-only memory (Electrically EPROM, EEPROM) or flash memory. The volatile memory may be random access memory (Random Access Memory, RAM), used as an external cache. By way of illustration rather than limitation, many forms of RAM are available, such as static RAM (Static RAM, SRAM), dynamic RAM (Dynamic RAM, DRAM), synchronous DRAM (Synchronous DRAM, SDRAM), double data rate SDRAM (Double Data Rate SDRAM, DDR SDRAM), enhanced SDRAM (Enhanced SDRAM, ESDRAM), synch-link DRAM (synch link DRAM, SLDRAM) and direct Rambus RAM (Direct Rambus RAM, DR RAM).
The distance sensor detection module 511 is configured to detect the motion range of the HMD and the handle, keeping the motion ranges of the first control device and the second control device within the safe area.
The IMU sensor module 512 is configured to detect IMU data of the second control device 500, the IMU data including the angular velocity and acceleration of the second control device 500 in three-dimensional space. The IMU sensor module 512 may include three single-axis accelerometers, three single-axis gyroscopes and three single-axis magnetometers, and can measure and report velocity, direction, gravity and the like through the combination of accelerometer, gyroscope and magnetometer sensors.
The processor 513 is configured to execute the above method embodiments according to instructions in the computer program, for example handling the data exchange between the first control device and the second control device, and determining 6-degree-of-freedom data from point cloud data and inertial measurement unit data.
In some embodiments of this application, the processor 513 may include, but is not limited to: a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on.
It should be understood that the components in the second control device 500 are connected by a bus system which, in addition to a data bus, includes a power bus, a control bus, a status signal bus and the like.
This application also provides a computer storage medium on which a computer program is stored; when the computer program is executed by a computer, the computer is enabled to perform the methods of the above method embodiments.
Embodiments of this application also provide a computer program product containing instructions which, when executed by a computer, cause the computer to perform the methods of the above method embodiments.
When implemented in software, the above may be implemented wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the flows or functions according to the embodiments of this application are produced wholly or partly. The computer may be a general-purpose computer, a special-purpose computer, a computer network or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transferred from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center by wire (such as coaxial cable, optical fiber or digital subscriber line (digital subscriber line, DSL)) or wirelessly (such as infrared, radio or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (for example a floppy disk, hard disk or magnetic tape), an optical medium (for example a digital video disc (digital video disc, DVD)), or a semiconductor medium (for example a solid state disk (solid state disk, SSD)), and so on.
The above are only specific implementations of this application, but the protection scope of this application is not limited thereto; any person skilled in the art can easily think of changes or substitutions within the technical scope disclosed in this application, and they should all be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (12)

  1. A positioning method of a control device, characterized in that the method is applied to a first control device on which a camera is installed, and the method comprises:
    collecting, through the camera, an external environment in which the first control device is located at time T1 to obtain first point cloud data;
    acquiring inertial measurement unit data of the first control device at the time T1 and a first point cloud map of the external environment; and
    determining 6-degree-of-freedom data of the first control device according to the first point cloud data, the inertial measurement unit data and the first point cloud map;
    wherein the first point cloud map comprises historical point cloud data, relative to the time T1, collected from the external environment by the first control device and a second control device.
  2. The method according to claim 1, characterized by further comprising:
    comparing the first point cloud data with the first point cloud map to determine whether a difference exists; and
    if a difference exists between the first point cloud data and the first point cloud map, updating the difference part between the first point cloud data and the first point cloud map into the first point cloud map, and sending the difference part between the first point cloud data and the first point cloud map to the second control device.
  3. The method according to claim 2, characterized in that the updating the difference part between the first point cloud data and the first point cloud map into the first point cloud map, and sending the difference part between the first point cloud data and the first point cloud map to the second control device, comprises:
    when a target condition is met, updating the difference part between the first point cloud data and the first point cloud map into the first point cloud map, and sending the difference part between the first point cloud data and the first point cloud map to the second control device.
  4. The method according to claim 3, characterized in that the target condition comprises at least one of the following:
    a preset duration has elapsed since the time T1;
    a cumulative size of the difference part between the first point cloud data and the first point cloud map, together with difference parts between the first point cloud map and point cloud data collected from the external environment through the camera after the time T1, reaches a preset size.
  5. The method according to claim 4, characterized by further comprising:
    receiving a difference part between second point cloud data and the first point cloud map sent by the second control device; and
    updating the difference part between the second point cloud data and the first point cloud map into the first point cloud map;
    wherein the second point cloud data is point cloud data obtained by the second control device collecting the external environment at the time T1 through its own camera.
  6. The method according to any one of claims 1 to 5, characterized in that, before the collecting, through the camera, the external environment in which the first control device is located at the time T1 to obtain the first point cloud data, the method further comprises:
    switching a mode of the first control device from an initial mode to a self-tracking mode;
    wherein the initial mode is a mode in which the first control device is operated through an arm model.
  7. The method according to claim 6, characterized by further comprising:
    when the mode of the first control device is the initial mode, receiving historical point cloud data, relative to time T2, collected from the external environment by the second control device; and
    determining a second point cloud map according to the historical point cloud data, relative to the time T2, collected from the external environment by the second control device.
  8. The method according to any one of claims 1 to 5, characterized in that the first point cloud map is stored in a target database;
    the target database is a database shared by the first control device and the second control device.
  9. A positioning apparatus, characterized by comprising:
    a collection module, configured to collect, through a camera, an external environment in which a first control device is located at time T1 to obtain first point cloud data;
    an acquisition module, configured to acquire inertial measurement unit data of the first control device at the time T1 and a first point cloud map of the external environment; and
    a first determination module, configured to determine 6-degree-of-freedom data of the first control device according to the first point cloud data, the inertial measurement unit data and the first point cloud map;
    wherein the first point cloud map comprises historical point cloud data, relative to the time T1, collected from the external environment by the first control device and a second control device.
  10. A control device, characterized by comprising:
    a processor; and
    a memory for storing executable instructions of the processor;
    wherein the processor is configured to perform the positioning method of a control device according to any one of claims 1 to 8 by executing the executable instructions.
  11. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the positioning method of a control device according to any one of claims 1 to 8.
  12. A computer program product containing instructions, characterized in that, when the computer program product runs on an electronic device, the electronic device is caused to perform the positioning method of a control device according to any one of claims 1 to 8.
PCT/CN2023/080420 2022-03-21 2023-03-09 Control device positioning method and apparatus, device, storage medium and computer program product WO2023179369A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210304608.8A CN116823928A (zh) 2022-03-21 2022-03-21 Positioning of control device, and apparatus, device, storage medium and computer program product
CN202210304608.8 2022-03-21

Publications (1)

Publication Number Publication Date
WO2023179369A1 true WO2023179369A1 (zh) 2023-09-28

Family

ID=88099805

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/080420 WO2023179369A1 (zh) 2022-03-21 2023-09-28 Control device positioning method and apparatus, device, storage medium and computer program product

Country Status (2)

Country Link
CN (1) CN116823928A (zh)
WO (1) WO2023179369A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206096621U (zh) * 2016-07-30 2017-04-12 广州数娱信息科技有限公司 An enhanced virtual reality perception device
CN109358754A (zh) * 2018-11-02 2019-02-19 北京盈迪曼德科技有限公司 A mixed reality head-mounted display system
US20190098255A1 (en) * 2017-09-22 2019-03-28 Faro Technologies, Inc. Collaborative virtual reality online meeting platform
CN110915208A (zh) * 2017-07-31 2020-03-24 谷歌有限责任公司 Virtual reality environment boundaries using depth sensors
CN112822480A (zh) * 2020-12-31 2021-05-18 青岛小鸟看看科技有限公司 VR system and positioning and tracking method thereof
CN113632030A (zh) * 2018-12-27 2021-11-09 奇跃公司 Systems and methods for virtual reality and augmented reality
CN113689496A (zh) * 2021-08-06 2021-11-23 西南科技大学 A VR-based nuclear radiation environment scene construction and human-computer interaction method

Also Published As

Publication number Publication date
CN116823928A (zh) 2023-09-29

Similar Documents

Publication Publication Date Title
JP7095602B2 (ja) Information processing device, information processing method, and recording medium
AU2019279990B2 (en) Digital camera with audio, visual and motion analysis
JP2020534592A (ja) System and method for controlling a virtual camera
CN111373347B (zh) Apparatus, method and computer program for provision of virtual reality content
US20210142568A1 (en) Web-based remote assistance system with context & content-aware 3d hand gesture visualization
JP6939801B2 (ja) Information processing device, information processing method, and program
JP7126008B2 (ja) Multi-point SLAM capture
CN112907652B (zh) Camera pose acquisition method, video processing method, display device, and storage medium
WO2018160381A1 (en) 3d depth map
US20240094970A1 (en) Electronic system for producing a coordinated output using wireless localization of multiple portable electronic devices
EP4252195A1 (en) Real world beacons indicating virtual locations
JP2014153802A (ja) Information processing program, information processing device, information processing system, and information processing method
WO2023179369A1 (zh) Control device positioning method and apparatus, device, storage medium and computer program product
WO2019054037A1 (ja) Information processing device, information processing method, and program
JP7196856B2 (ja) Information processing device, information processing method, and program
US20240078757A1 (en) Shared viewing of video with prevention of cyclical following among users
KR102614102B1 (ko) Automated calibration system for precise tracking of a real object, calibration method, and method of tracking a real object in an image and augmenting a virtual model onto the real object based on the calibration method
US20230405475A1 (en) Shooting method, apparatus, device and medium based on virtual reality space
WO2023219615A1 (en) Tracking of multiple extended reality devices
JP2023550773A (ja) Image-based finger tracking and controller tracking
CN113039508A (zh) Assessing alignment of inputs and outputs of a virtual environment
JP2017224358A (ja) Information processing program, information processing device, information processing system, and information processing method

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23773618

Country of ref document: EP

Kind code of ref document: A1