CN117740025A - Positioning accuracy determining method, device, equipment and medium for positioning device - Google Patents

Positioning accuracy determining method, device, equipment and medium for positioning device

Info

Publication number
CN117740025A
CN117740025A (Application CN202211117672.1A)
Authority
CN
China
Prior art keywords
data
motion
motion trail
positioning
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211117672.1A
Other languages
Chinese (zh)
Inventor
闵豪
山君良
李汉振
陈庆林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202211117672.1A priority Critical patent/CN117740025A/en
Publication of CN117740025A publication Critical patent/CN117740025A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The application provides a method, a device, equipment and a medium for determining positioning accuracy of a positioning device, wherein the method comprises the following steps: acquiring first motion trail data of a positioning device and second motion trail data of a positioning rigid body; determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data; converting the first motion trail data into a coordinate system of the second motion trail data according to the coordinate system conversion relation; and determining the positioning precision of the positioning device according to the converted first motion trail data and the second motion trail data. The positioning accuracy of the positioning device can be determined.

Description

Positioning accuracy determining method, device, equipment and medium for positioning device
Technical Field
The embodiment of the application relates to the technical field of positioning, in particular to a method, a device, equipment and a medium for determining positioning accuracy of a positioning device.
Background
With the development of virtual technologies such as Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), users can experience an immersive, seamless transition between the virtual world and the real world when watching videos or live streams on electronic devices such as VR, AR, MR, or XR devices.
In actual use, the electronic device acquires information such as the user's motion or position in real time through a spatial positioning function, and adjusts the displayed image accordingly. However, if the motion or position information obtained by spatial positioning is inaccurate, the user's movement will not be synchronized with the displayed image, resulting in a poor user experience. It is therefore necessary to determine the spatial positioning accuracy of the electronic device to ensure that the acquired motion or position information is accurate and usable.
Disclosure of Invention
The application provides a method, a device, equipment and a medium for determining positioning accuracy of a positioning device, which can determine the positioning accuracy of the positioning device.
In a first aspect, the present application provides a positioning accuracy determining method of a positioning device, including:
acquiring first motion trail data of the positioning device and second motion trail data of a positioning rigid body;
determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data;
converting the first motion trail data into a coordinate system of the second motion trail data according to the coordinate system conversion relation;
and determining the positioning accuracy of the positioning device according to the converted first motion trail data and the second motion trail data.
In a second aspect, an embodiment of the present application provides a positioning accuracy determining device of a positioning device, including:
the track acquisition module is used for acquiring first motion track data of the positioning device and second motion track data of the positioning rigid body;
the relation determining module is used for determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data;
the track conversion module is used for converting the first motion track data into a coordinate system of the second motion track data according to the coordinate system conversion relation;
and the precision determining module is used for determining the positioning precision of the positioning device according to the converted first motion trail data and the second motion trail data.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a processor and a memory, wherein the memory is configured to store a computer program, and the processor is configured to call and run the computer program stored in the memory, so as to perform the positioning accuracy determining method of the positioning device described in the first aspect or its implementations.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, where the computer program causes a computer to execute the positioning accuracy determining method of the positioning device described in the first aspect embodiment or each implementation manner thereof.
In a fifth aspect, embodiments of the present application provide a computer program product comprising program instructions which, when run on an electronic device, cause the electronic device to perform the positioning accuracy determining method of the positioning device described in the embodiments of the first aspect or implementations thereof.
The technical scheme disclosed by the embodiment of the application has at least the following beneficial effects:
the method comprises the steps of obtaining first motion track data of a positioning device and second motion track data of a positioning rigid body, determining a coordinate system conversion relation according to the first motion track data and the second motion track data, converting the first motion track data into a coordinate system of the second motion track data according to the coordinate system conversion relation, and determining positioning accuracy of the positioning device according to the converted first motion track data and second motion track data. According to the method and the device, the motion track data of the positioning device are converted into the coordinate system of the motion track data of the positioning rigid body, so that the positioning accuracy of the positioning device is determined by utilizing the two motion track data in the same coordinate system, the checking accuracy of the positioning device is improved, and the checking cost of the positioning accuracy of the positioning device is reduced.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a positioning accuracy determining method of a positioning device according to an embodiment of the present application;
FIG. 2 is a schematic view of a positioning rigid body provided in an embodiment of the present application mounted on a positioning device;
FIG. 3 is a schematic diagram of an external tracking system provided by an embodiment of the present application;
fig. 4 is a schematic diagram of motion data of each coordinate axis after converting first motion trajectory data into a coordinate system of second motion trajectory data according to the embodiment of the present application;
fig. 5 is a flowchart of another positioning accuracy determining method of a positioning device according to an embodiment of the present application;
fig. 6 is a schematic diagram of a time synchronization process for the first motion trajectory data and the second motion trajectory data according to the embodiment of the present application;
fig. 7 is a flowchart of a positioning accuracy determining method of a positioning device according to another embodiment of the present application;
FIG. 8 is a schematic block diagram of a positioning accuracy determining apparatus of a positioning apparatus according to an embodiment of the present application;
FIG. 9 is a schematic block diagram of an electronic device provided by an embodiment of the present application;
fig. 10 is a schematic block diagram of an electronic device provided in an embodiment of the present application as an HMD.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present application based on the embodiments herein.
It should be noted that the terms "first," "second," and the like in the description, claims, and figures of the present application are used to distinguish similar objects and are not necessarily used to describe a particular sequence or chronological order. It is to be understood that the data so used may be interchanged where appropriate, so that the embodiments of the present application described herein may be implemented in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or device that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, system, article, or device.
The present application is applicable to scenarios in which the positioning accuracy of a positioning device needs to be determined. When a user wears an electronic device such as an XR device to watch videos or live streams, the electronic device acquires information such as the user's motion or position in real time through a spatial positioning function and adjusts the displayed image accordingly. However, if the motion or position information obtained by spatial positioning is inaccurate, the user's movement will not match the displayed image, resulting in a poor user experience. Therefore, the present application designs a positioning accuracy determining method to determine the positioning accuracy of the positioning device, so that whether the user motion or position information obtained by the positioning device is accurate and usable can be determined based on that positioning accuracy.
In order to facilitate understanding of embodiments of the present application, before describing various embodiments of the present application, some concepts related to all embodiments of the present application are first appropriately explained, specifically as follows:
1) Virtual Reality (VR) is a technology for creating and experiencing a virtual world. It generates a simulated, multi-source-information virtual environment (the virtual reality discussed herein includes at least visual perception, and may also include auditory, tactile, and motion perception, and even gustatory and olfactory perception), realizing a fused, interactive three-dimensional dynamic view of the virtual environment with simulation of physical behavior, and immersing the user in the simulated virtual reality environment. VR has applications in fields such as maps, games, video, education, medical treatment, simulation, collaborative training, sales, manufacturing assistance, and maintenance and repair.
2) A virtual reality device (VR device) may be provided in the form of glasses, a head mounted display (Head Mount Display, abbreviated as HMD), or a contact lens for realizing visual perception and other forms of perception, but the form of the virtual reality device is not limited thereto, and may be further miniaturized or enlarged according to actual needs.
Optionally, the virtual reality device described in the embodiments of the present application may include, but is not limited to, the following types:
2.1) PC-based virtual reality (PCVR) devices, which use a PC to perform the computation and data output related to the virtual reality function; the external PCVR device uses the data output by the PC to present the virtual reality effect.
2.2) Mobile virtual reality devices, which support mounting a mobile terminal (e.g., a smartphone) in various ways (e.g., a head-mounted display with a dedicated card slot). Connected to the mobile terminal by wire or wirelessly, the mobile terminal performs the computation related to the virtual reality function and outputs the data to the mobile virtual reality device; for example, virtual reality video may be viewed through an app on the mobile terminal.
2.3) All-in-one virtual reality devices, which have a processor for performing the computation related to the virtual reality function and therefore have independent virtual reality input and output capabilities. They need no connection to a PC or mobile terminal and offer a high degree of freedom of use.
3) Augmented Reality (AR): a technique for calculating the camera pose parameters of a camera in the real world (or three-dimensional world) in real time during image acquisition, and adding virtual elements to the image acquired by the camera according to those camera pose parameters. Virtual elements include, but are not limited to: images, videos, and three-dimensional models. The goal of AR technology is to overlay the virtual world on the real world on the screen for interaction.
4) Mixed Reality (MR): creating new environments and visualizations that combine the real and virtual worlds, in which physical entities and digital objects coexist and interact in real time to simulate real objects. MR mixes reality, augmented reality, augmented virtuality, and virtual reality; it can be viewed as a combination of Virtual Reality (VR) and Augmented Reality (AR) and an extension of VR technology, increasing the realism of the user experience by presenting virtual scenes within real scenes. The MR field draws on computer vision, the science of making machines "see": cameras and computers replace human eyes to recognize, track, and measure targets, and the resulting images are further processed by the computer into forms better suited for human observation or for transmission to an instrument for detection.
That is, MR is a simulated scenery that integrates computer-created sensory input (e.g., virtual objects) with sensory input from a physical scenery or a representation thereof. In some MR sceneries, the computer-created sensory input may adapt to changes in the sensory input from the physical scenery. In addition, some electronic systems for rendering MR scenes may monitor orientation and/or position relative to the physical scene so that virtual objects can interact with real objects (i.e., physical elements from the physical scene or representations thereof). For example, the system may monitor movement so that a virtual plant appears stationary relative to a physical building.
5) Extended Reality (XR) refers to all real and virtual combined environments and human-machine interactions generated by computer technology and wearable devices, and includes multiple forms of Virtual Reality (VR), augmented Reality (AR), and Mixed Reality (MR).
Having described some of the concepts to which the present application relates, a method for determining positioning accuracy of a positioning device according to an embodiment of the present application will be specifically described below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a positioning accuracy determining method of a positioning device according to an embodiment of the present application. The embodiment of the application can be suitable for determining the positioning precision scene of the positioning device, and the positioning precision determining method of the positioning device can be executed by the positioning precision determining device of the positioning device so as to realize management or control of the positioning precision determining process of the positioning device. The positioning accuracy determining means of the positioning device may consist of hardware and/or software and may be integrated in the electronic device. The electronic device may be any hardware device having a data processing function, such as a personal computer, a palm computer, a smart phone, and the like.
As shown in fig. 1, the positioning accuracy determining method of the positioning device may include the steps of:
s101, acquiring first motion trail data of a positioning device and second motion trail data of a positioning rigid body.
In the embodiment of the present application, the positioning device may be any hardware device having a spatial positioning function, such as an XR device. The XR device may be an AR (Augmented Reality) device or a VR (Virtual Reality) device, among others.
The positioning rigid body is used for assisting the electronic equipment in determining the positioning precision of the positioning device. The positioning rigid body comprises at least four non-coplanar marking points, and is fixedly arranged on the positioning device. Alternatively, as shown in fig. 2, the positioning rigid body is composed of 4 marking points, which are respectively: mark point 1, mark point 2, mark point 3, and mark point 4.
The acquisition of the motion trajectory data of the positioning rigid body, i.e., the second motion trail data, is implemented by means of an external tracking system, which may be composed of a plurality of optical cameras, as shown in fig. 3. The coordinate system of the external tracking system is a world coordinate system, so the second motion trail data acquired in this embodiment is composed of a plurality of pose data of the positioning rigid body in the world coordinate system corresponding to the external tracking system. The external tracking system may be an OptiTrack full-body motion capture system. It should be noted that, in this embodiment, the external tracking system may also be a laser tracking system instead of the OptiTrack system. When the external tracking system is a laser tracking system, the positioning rigid body is composed of a plurality of laser receivers, which is not particularly limited herein.
Specifically, before S101 is performed, the positioning rigid body including at least four marking points may be fixedly mounted on the positioning device; the mounting position may be any position on the positioning device and is not limited herein. The positioning rigid body is mounted on the positioning device to ensure that the two remain relatively static. A technician can then carry or wear the positioning device and perform a preset action within the tracking range of the external tracking system, so that the positioning device can position itself to acquire the first motion trail data while the external tracking system acquires the second motion trail data of the positioning rigid body. The preset action can be set in advance according to the positioning accuracy test of the positioning device. For example, the preset action may be a turning action within the tracking range of the external tracking system, or a raising action, a lowering action, or the like, which is not particularly limited herein.
In the present application, when the positioning device positions itself to obtain the first motion trail data, it specifically uses a positioning system with 6 degrees of freedom; for example, the positioning device may optionally use Simultaneous Localization and Mapping (SLAM) technology to obtain the first motion trail data. The specific acquisition process is conventional and is not described in detail here.
In addition, when the external tracking system is used to acquire the second motion trail data of the positioning rigid body, an optical signal can be emitted by the optical signal capturer in the external tracking system, and the optical signals reflected by each marking point are then acquired. After acquiring the optical signal reflected by each marker point, the external tracking system may determine the pose data of each marker point based on its reflected optical signal. Then, the pose data of the positioning rigid body is determined according to the pose data of each marker point; further, the second motion trail data of the positioning rigid body is constructed from the pose data of the positioning rigid body.
The pose data of each marker point includes position data and attitude data. The position data can be determined by the external tracking system directly from the optical signal reflected by each marker point, whereas for the attitude data, the external tracking system must jointly determine a single attitude from the optical signals reflected by the four marker points; this attitude is then taken as the attitude data of each marker point. In the embodiment of the present application, the attitude data of each marker point is therefore the same.
Wherein, determining an attitude data based on the optical signals reflected by the four mark points is a conventional technology in the art, and will not be repeated here.
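The marker-based pose determination described above can be sketched in code. The following is an illustrative reconstruction, not the patent's actual algorithm: it assumes the marker layout is known in the rigid body's own frame and recovers the shared attitude with a Kabsch (SVD) fit, taking the marker centroid as the position.

```python
import numpy as np

def rigid_body_pose(ref_markers, obs_markers):
    """Estimate a rigid body's pose from >= 4 non-coplanar markers.

    ref_markers: (N, 3) marker coordinates in the rigid body's own frame.
    obs_markers: (N, 3) the same markers as measured by the tracking system.
    Returns (R, t) such that obs ~= ref @ R.T + t (Kabsch / SVD alignment).
    """
    ref_c = ref_markers.mean(axis=0)           # marker centroids
    obs_c = obs_markers.mean(axis=0)
    H = (ref_markers - ref_c).T @ (obs_markers - obs_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = obs_c - R @ ref_c                      # body position in tracker frame
    return R, t
```

At least four non-coplanar markers keep this fit well-conditioned; with coplanar or collinear markers the solution degenerates, which is consistent with the patent's requirement of at least four non-coplanar marking points.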
S102, determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data.
The coordinate system of the first motion trail data of the positioning device and the coordinate system of the second motion trail data of the positioning rigid body are not the same. Therefore, if the positioning accuracy of the positioning device were determined directly from the first motion trail data and the second motion trail data, the result would suffer from large errors and low accuracy. Consequently, after obtaining the first motion trail data sent by the positioning device and the second motion trail data of the positioning rigid body sent by the external tracking system, the two sets of data are optionally subjected to coordinate system unification processing so that they are expressed in the same coordinate system. The positioning accuracy of the positioning device is then determined based on the two motion trail data in the same coordinate system, ensuring that the determined positioning accuracy has a small error and higher precision.
In the embodiment of the application, when the first motion trail data and the second motion trail data are unified to the same coordinate system, optionally, a coordinate system conversion relationship between the coordinate system corresponding to the first motion trail data and the coordinate system corresponding to the second motion trail data is determined according to the first motion trail data and the second motion trail data. Furthermore, based on the coordinate system conversion relationship, the first motion trajectory data and the second motion trajectory data can be unified into the same coordinate system.
As an optional implementation scheme, when determining a coordinate system conversion relation between a coordinate system corresponding to first motion track data and a coordinate system corresponding to second motion track data, the method can optionally perform time synchronization on the first motion track data and the second motion track data; and performing coordinate system alignment operation based on the first motion track data after time synchronization and the second motion track data after time synchronization to obtain a coordinate system conversion relation.
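The coordinate-system alignment operation of S102 can be illustrated as a least-squares rigid fit between the two time-synchronized position trajectories. This is a hedged sketch: the function name, the 4x4 homogeneous-matrix convention, and the closed-form SVD solution are assumptions made for illustration, not details given in the patent.

```python
import numpy as np

def estimate_conversion(device_pos, tracker_pos):
    """Least-squares rigid transform (device body frame -> world frame)
    between two time-synchronized position trajectories, as a 4x4 matrix.

    device_pos, tracker_pos: (N, 3) matched samples, N >= 3, not collinear.
    """
    dc = device_pos.mean(axis=0)               # trajectory centroids
    tc = tracker_pos.mean(axis=0)
    H = (device_pos - dc).T @ (tracker_pos - tc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # keep a proper rotation
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tc - R @ dc
    T = np.eye(4)                              # pack into homogeneous form
    T[:3, :3], T[:3, 3] = R, t
    return T
```

The returned matrix `T` plays the role of the "coordinate system conversion relation" in S102; it maps points in the device's coordinate system into the tracker's world coordinate system.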
S103, converting the first motion trail data into a coordinate system of the second motion trail data according to the coordinate system conversion relation.
The second motion trail data is output based on the world coordinate system corresponding to the external tracking system, while the first motion trail data of the positioning device is output based on the body coordinate system of the positioning device.
That is, the coordinate system corresponding to the first motion trajectory data of the positioning device is a body coordinate system, and the coordinate system corresponding to the second motion trajectory data is a world coordinate system.
Moreover, when the positioning device is in use, information such as the position of the space in which the user is located is based on the world coordinate system. Therefore, after the coordinate system conversion relation is obtained, the first motion trail data of the positioning device is optionally converted from the body coordinate system into the world coordinate system corresponding to the second motion trail data, so that the two motion trail data are in the same coordinate system (the world coordinate system). This lays a foundation for determining the positioning accuracy of the positioning device based on the two motion trail data in the same coordinate system.
For example, fig. 4 schematically shows the motion data on each coordinate axis of the first motion trail data after it has been converted into the coordinate system of the second motion trail data.
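Applying the conversion relation of S103 then amounts to multiplying every pose sample by the estimated body-to-world transform. A minimal sketch, assuming the relation is a 4x4 matrix `T` and orientations are stored as rotation matrices (both illustrative choices, not specified by the patent):

```python
import numpy as np

def transform_trajectory(T, positions, rotations):
    """Apply a 4x4 body-to-world transform T to a pose trajectory.

    positions: (N, 3) positions expressed in the body coordinate system.
    rotations: (N, 3, 3) orientations expressed in the body coordinate system.
    Returns the same trajectory expressed in the world coordinate system.
    """
    R, t = T[:3, :3], T[:3, 3]
    world_positions = positions @ R.T + t        # p_w = R p_b + t
    world_rotations = R[None, :, :] @ rotations  # R_w = R R_b per sample
    return world_positions, world_rotations
```

After this step both trajectories live in the tracker's world coordinate system, so they can be compared sample by sample in S104.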
S104, determining the positioning accuracy of the positioning device according to the converted first motion trail data and the second motion trail data.
Since the converted first motion trail data coincides with the second motion trail data of the positioning rigid body, the positioning accuracy of the positioning device can be determined based on the two coincident motion trail data, thereby achieving the purpose of determining the positioning accuracy of the positioning device.
Alternatively, the position error and the attitude (rotation) error of the positioning device may be calculated from the converted first motion trajectory data and second motion trajectory data. Further, the position error and the posture error of the positioning device are output and displayed to a technician, so that the technician can determine the positioning accuracy of the positioning device according to the displayed position error and posture error.
Further, the method can also calculate the position average error and the posture average error of the positioning device based on the calculated position error and the posture error. And then, outputting and displaying the position average error and the posture average error of the positioning device to a technician, so that the technician can determine the positioning precision of the positioning device according to the displayed position average error and the displayed posture average error.
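The position and attitude errors of S104, and their averages, can be sketched as per-sample metrics over the aligned trajectories. Representing the attitude error as the relative rotation angle is one common convention, assumed here for illustration; the patent does not fix a particular error formula.

```python
import numpy as np

def trajectory_errors(est_pos, ref_pos, est_rot, ref_rot):
    """Mean position error and mean attitude error between two trajectories.

    est_pos, ref_pos: (N, 3) positions in the same (world) coordinate system.
    est_rot, ref_rot: (N, 3, 3) orientations as rotation matrices.
    Returns (mean position error, mean attitude error in degrees).
    """
    pos_err = np.linalg.norm(est_pos - ref_pos, axis=1)
    # Relative rotation per sample: R_rel = R_ref^T @ R_est; its angle is
    # acos((trace(R_rel) - 1) / 2).
    rel = np.einsum('nij,nik->njk', ref_rot, est_rot)
    cos_a = np.clip((np.trace(rel, axis1=1, axis2=2) - 1.0) / 2.0, -1.0, 1.0)
    rot_err = np.degrees(np.arccos(cos_a))
    return pos_err.mean(), rot_err.mean()
```

The per-sample arrays `pos_err` and `rot_err` correspond to the position and attitude (rotation) errors displayed to the technician, and their means correspond to the position average error and attitude average error described above.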
It can be understood that the coordinate system conversion relation is determined according to the motion track data of the positioning device and the motion track data of the positioning rigid body arranged on the positioning device; then, converting the motion trail data of the positioning device into a world coordinate system in which the motion trail data of the positioning rigid body is positioned; and then, based on two motion trail data in the same world coordinate system, determining the positioning accuracy of the positioning device, thereby realizing the determination of the positioning accuracy of the positioning device on the basis of low cost and high accuracy and effectively improving the use experience of users.
According to the positioning accuracy determining method of the positioning device, the first motion track data of the positioning device and the second motion track data of the positioning rigid body are obtained, so that the coordinate system conversion relation is determined according to the first motion track data and the second motion track data, the first motion track data are converted into the coordinate system of the second motion track data according to the coordinate system conversion relation, and then the positioning accuracy of the positioning device is determined according to the converted first motion track data and the second motion track data. According to the method and the device, the motion track data of the positioning device are converted into the coordinate system of the motion track data of the positioning rigid body, so that the positioning accuracy of the positioning device is determined by utilizing the two motion track data in the same coordinate system, the checking accuracy of the positioning device is improved, and the checking cost of the positioning accuracy of the positioning device is reduced.
As can be seen from the above description, the present application acquires the first motion trail data of the positioning device and the second motion trail data of the positioning rigid body, and then uses the two sets of data together to determine the positioning accuracy of the positioning device.
On the basis of the above embodiments, the embodiments of the present application further explain determining a coordinate system conversion relationship according to the first motion trajectory data and the second motion trajectory data. The above-mentioned optimization procedure provided in the embodiment of the present application is specifically described below with reference to fig. 5.
As shown in fig. 5, the method may include the steps of:
s201, acquiring first motion trail data of a positioning device and second motion trail data of a positioning rigid body.
S202, performing time synchronization processing on the first motion trail data and the second motion trail data to obtain synchronized first motion trail data and synchronized second motion trail data.
Because the positioning system corresponding to the positioning device and the external tracking system corresponding to the positioning rigid body are two different systems, their clocks are not aligned. Therefore, time synchronization processing can be performed on the first motion trail data and the second motion trail data so that they share the same time reference, which facilitates the subsequent determination of the positioning accuracy of the positioning device.
Optionally, when the time synchronization processing is performed on the first motion trail data and the second motion trail data, the method can be implemented through the following steps:
s11, determining a target timestamp difference value according to the first motion trail data and the second motion trail data.
It is contemplated that, when the technician carries or wears the positioning device and moves within the tracking range of the external tracking system, the movement has a start time and an end time. Therefore, the present application can acquire the motion start time and the motion end time from the first motion trail data and the second motion trail data respectively, and then determine a timestamp difference value according to the motion start time and the motion end time.
In some optional implementations, when determining the timestamp difference, the present application may acquire first start timestamp information and first end timestamp information from the first motion trajectory data, and acquire second start timestamp information and second end timestamp information from the second motion trajectory data; then, determining a starting time stamp difference value according to the first starting time stamp information and the second starting time stamp information; determining an end timestamp difference value according to the first end timestamp information and the second end timestamp information; further, a target timestamp difference is determined from the start timestamp difference and the end timestamp difference.
The starting timestamp difference and the ending timestamp difference may be determined in any order: the starting timestamp difference first and then the ending timestamp difference; the ending timestamp difference first and then the starting timestamp difference; or both simultaneously. This is not particularly limited in the present application.
Optionally, determining the starting timestamp difference according to the first starting timestamp information and the second starting timestamp information may be implemented by the following formula (1);
delta[0] = XR[0] - OP[0]    (1)
where delta[0] is the start timestamp difference, XR[0] is the first start timestamp information, and OP[0] is the second start timestamp information.
In addition, determining the end timestamp difference according to the first end timestamp information and the second end timestamp information may be achieved by the following formula (2);
delta[1] = XR[1] - OP[1]    (2)
where delta[1] is the end timestamp difference, XR[1] is the first end timestamp information, and OP[1] is the second end timestamp information.
After the starting timestamp difference and the ending timestamp difference are obtained, the target timestamp difference can be determined from them by the following formula (3):
delta_all = (delta[0] + delta[1]) / 2    (3)
where delta_all is the target timestamp difference, delta[0] is the start timestamp difference, and delta[1] is the end timestamp difference.
And S12, performing time synchronization processing on the first motion trail data and the second motion trail data according to the target time stamp difference value to obtain synchronized first motion trail data and synchronized second motion trail data.
After the target timestamp difference is obtained, it can optionally be added to the timestamps of the external tracking system corresponding to the second motion trail data, so that the external tracking system is aligned in time with the positioning system of the positioning device. That is, the first motion trajectory data and the second motion trajectory data are made to share the same timestamp reference.
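Steps S11 and S12 can be sketched as follows. This is a minimal illustrative sketch, not the patent's code: it assumes the target timestamp difference of formula (3) is the mean of the start and end differences, and that each trajectory is a list of (timestamp, pose) samples; all names are illustrative.

```python
def target_timestamp_difference(xr_times, op_times):
    """Offset to add to the external tracker's clock (formulas (1)-(3))."""
    delta0 = xr_times[0] - op_times[0]    # start timestamp difference, formula (1)
    delta1 = xr_times[-1] - op_times[-1]  # end timestamp difference, formula (2)
    return (delta0 + delta1) / 2.0        # assumed form of formula (3)

def synchronize(op_samples, delta_all):
    """Shift every external-tracker sample by the target timestamp difference."""
    return [(t + delta_all, pose) for t, pose in op_samples]
```

With start times 100.0 vs 98.0 and end times 110.0 vs 107.0, the offset is (2.0 + 3.0) / 2 = 2.5, and every second-trajectory timestamp is shifted forward by that amount.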
Exemplary effects of time synchronization processing on the first motion profile data and the second motion profile data are shown in fig. 6. Wherein, the track 1 is the first motion track data of the positioning device, and the track 2 is the second motion track data of the positioning rigid body.
And S203, interpolation processing is carried out on the synchronized first motion trail data and the synchronized second motion trail data, so that first motion trail data and second motion trail data with the same frequency are obtained.
Wherein the same frequency specifically refers to the same frame rate.
Since the positioning frequency of the positioning device is typically higher than that of the external tracking system, the positioning device may produce more motion data than the positioning rigid body. In addition, the time synchronization processing of the first motion trail data and the second motion trail data specifically adds the target timestamp difference to the second motion trail data, so each item of motion data (each item of pose data) in the second motion trail data is shifted as a whole by one target timestamp difference compared with the unsynchronized data. As a result, the motion data in the synchronized second motion trail data no longer correspond one-to-one to the motion data in the first motion trail data; that is, some timestamp information has no corresponding position data and attitude data.
Based on the method, interpolation processing can be performed on the synchronized first motion track data and the synchronized second motion track data, so that pose data on the second motion track data after interpolation processing can be in one-to-one correspondence with pose data on the first motion track data.
As some optional implementations, the application may perform interpolation processing on the synchronized first motion trail data and the synchronized second motion trail data by the following steps:
s21, sequentially extracting time stamp information from the synchronized second motion trail data.
S22, determining, from the timestamp information of the synchronized first motion trail data, a target timestamp interval that includes the extracted timestamp information.
And S23, carrying out interpolation processing on the synchronized second motion trail data according to the timestamp information and the target timestamp interval to obtain first motion trail data and second motion trail data with the same frequency.
Determining, from the timestamp information of the synchronized first motion trail data, a target timestamp interval that includes the extracted timestamp information may specifically be implemented by traversing the timestamp information of the first motion trail data to find the target timestamp interval containing the extracted timestamp information.
After the target timestamp interval is determined, the minimum value in the target timestamp interval can be determined as the target start value, and the maximum value as the target end value. Then, the synchronized second motion trail data is interpolated according to the timestamp information, the target start value, the target end value, the pose data corresponding to the target start value and the pose data corresponding to the target end value, so that the motion data on the interpolated second motion trail data correspond one-to-one to the motion data on the first motion trail data.
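Locating the target timestamp interval amounts to finding, in the sorted device timestamps, the pair that brackets the extracted timestamp. A sketch using binary search in place of the linear traversal described above; it assumes the interval's lower bound serves as the start value and the upper bound as the end value, as required for the nonnegative differences in formulas (4) and (5):

```python
import bisect

def bracketing_interval(xr_times, t):
    """Return the (starttime, endtime) pair in xr_times that brackets t.

    xr_times must be sorted ascending; t outside the range is clamped to the
    first or last interval.
    """
    j = bisect.bisect_right(xr_times, t)
    j = min(max(j, 1), len(xr_times) - 1)
    return xr_times[j - 1], xr_times[j]   # minimum -> start, maximum -> end
```

Binary search costs O(log n) per lookup versus O(n) for a traversal, which matters when the device trajectory is long.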
Interpolating the synchronized second motion trail data according to the timestamp information, the target start value, the target end value, the pose data corresponding to the target start value and the pose data corresponding to the target end value may specifically include the following steps:
s231, calculating a first time stamp difference value according to the time stamp information and the target end value, and calculating a second time stamp difference value according to the time stamp information and the target start value.
Specifically, the first timestamp difference value and the second timestamp difference value may be calculated using the following equation (4) and equation (5):
C1 = endtime - curtime_op_t_i    (4)
where C1 is the first timestamp difference, endtime is the target end value, and curtime_op_t_i is the i-th timestamp information, i being an integer greater than or equal to 0.
C2 = curtime_op_t_i - starttime    (5)
where C2 is the second timestamp difference, curtime_op_t_i is the i-th timestamp information, and starttime is the target start value, i being an integer greater than or equal to 0.
S232, calculating the sum value of the time stamp difference values according to the first time stamp difference value and the second time stamp difference value.
Specifically, the sum of the timestamp differences can be calculated using the following equation (6):
sum = C1 + C2 = (endtime - curtime_op_t_i) + (curtime_op_t_i - starttime)    (6)
where sum is the sum of the timestamp differences, C1 is the first timestamp difference, C2 is the second timestamp difference, endtime is the target end value, curtime_op_t_i is the i-th timestamp information, and starttime is the target start value, i being an integer greater than or equal to 0.
S233, calculating a first specific gravity, namely the proportion of the first timestamp difference to the sum of the timestamp differences, and a second specific gravity, namely the proportion of the second timestamp difference to the sum of the timestamp differences.
Specifically, the first specific gravity and the second specific gravity may be calculated using the following formulas (7) and (8):
weight_C1 = C1 / sum = (endtime - curtime_op_t_i) / sum    (7)
where weight_C1 is the first specific gravity of the first timestamp difference in the sum of the timestamp differences, C1 is the first timestamp difference, sum is the sum of the timestamp differences, endtime is the target end value, and curtime_op_t_i is the i-th timestamp information, i being an integer greater than or equal to 0.
weight_C2 = C2 / sum = (curtime_op_t_i - starttime) / sum    (8)
where weight_C2 is the second specific gravity of the second timestamp difference in the sum of the timestamp differences, C2 is the second timestamp difference, curtime_op_t_i is the i-th timestamp information, and starttime is the target start value, i being an integer greater than or equal to 0.
S234, calculating the target position data corresponding to the time stamp information based on the first specific gravity, the position data corresponding to the target end value, the second specific gravity, and the position data corresponding to the target start value.
Specifically, the target position data corresponding to the time stamp information may be calculated by the following formula (9):
P_op_t_i = weight_C1 * endtime_P + weight_C2 * starttime_P    (9)
where P_op_t_i is the target position data corresponding to the i-th timestamp information, weight_C1 is the first specific gravity, endtime_P is the position data corresponding to the target end value, weight_C2 is the second specific gravity, starttime_P is the position data corresponding to the target start value, and i is an integer greater than or equal to 0.
S235, calculating target posture data corresponding to the time stamp information according to the first specific gravity, the posture data corresponding to the target ending value, the second specific gravity and the posture data corresponding to the target starting value.
Specifically, the target pose data corresponding to the timestamp information may be calculated by the following formula (10):
Q_op_t_i = starttime_Q.slerp(weight_C1, endtime_Q)    (10)
where Q_op_t_i is the target attitude data corresponding to the i-th timestamp information, starttime_Q is the attitude data corresponding to the target start value, slerp denotes spherical linear interpolation of quaternions, weight_C1 is the first specific gravity, endtime_Q is the attitude data corresponding to the target end value, and i is an integer greater than or equal to 0.
S236, carrying out pose data interpolation processing on the timestamp information according to the target position data and the target pose data.
Specifically, the target position data and the target posture data are used as the posture data of the time stamp information, so that the interpolation processing of the second motion trail data is completed.
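Steps S231 to S236 can be sketched together as follows. This is an illustrative implementation, not the patent's code: quaternions are taken in (w, x, y, z) order, and the linear-interpolation weight pairing and the slerp parameter follow the standard convention in which each endpoint is weighted by the opposite timestamp difference.

```python
import numpy as np

def slerp(q0, q1, u):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    q0 = np.asarray(q0, dtype=float)
    q1 = np.asarray(q1, dtype=float)
    dot = float(np.dot(q0, q1))
    if dot < 0.0:                      # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                   # nearly parallel: normalized lerp
        q = q0 + u * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1.0 - u) * theta) * q0 + np.sin(u * theta) * q1) / np.sin(theta)

def interpolate_pose(t, start, end):
    """Interpolate the pose at time t between two stamped (t, position, quat) samples."""
    t0, p0, q0 = start
    t1, p1, q1 = end
    c1 = t1 - t                        # formula (4): distance to the end value
    c2 = t - t0                        # formula (5): distance to the start value
    s = c1 + c2                        # formula (6)
    w1, w2 = c1 / s, c2 / s            # formulas (7)-(8)
    # Standard pairing: the start sample is weighted by its distance to the end.
    p = w1 * np.asarray(p0, float) + w2 * np.asarray(p1, float)
    q = slerp(q0, q1, c2 / s)          # slerp parameter convention assumed
    return p, q
```

For t midway between the samples the weights are both 0.5, so the position is the midpoint and the attitude is the halfway rotation on the great arc between the two quaternions.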
S204, determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data with the same frequency.
After the interpolation processing is performed on the first motion trail data and the second motion trail data, the pose data on the processed first motion trail data and second motion trail data correspond one-to-one. Therefore, the coordinate system conversion relation can be determined based on the first motion trail data and the second motion trail data of the same frequency. The coordinate system conversion relation is specifically the conversion relation between the body coordinate system of the positioning device and the world coordinate system corresponding to the external tracking system, and is used for converting the first motion trail data from the body coordinate system of the positioning device into the world coordinate system corresponding to the external tracking system, so as to obtain the first motion trail data in the world coordinate system.
Optionally, the pose data sets corresponding to the same time stamp information can be obtained from the first motion track data with the same frequency and the second motion track data with the same frequency, and the number of the pose data sets is multiple; and then, according to the acquired pose data set, determining a coordinate system conversion relation.
It should be noted that, in the embodiment of the present application, the coordinate system conversion relationship is specifically a pose conversion relationship, which can be represented by a rotation matrix and a translation vector, namely T = [R, t], where R is the rotation matrix and t is the translation vector.
In some optional implementation schemes, after acquiring a plurality of pose data sets, the application can determine an initial pose conversion relationship according to the pose data set corresponding to the initial timestamp information, specifically: T_XR_OP_t0 = T_XR_t0 * T_OP_t0, where T_XR_OP_t0 is the initial pose conversion relationship, T_XR_t0 is the pose data in the first motion trail data at the initial time t0, and T_OP_t0 is the pose data in the second motion trail data at the initial time t0. Other pose conversion relationships are determined according to the pose data sets corresponding to timestamp information other than the initial timestamp information, specifically: T_XR_OP_t_j = T_XR_t_j * T_OP_t_j, where T_XR_OP_t_j is the pose conversion relationship corresponding to the j-th timestamp information other than the initial time t0, T_XR_t_j is the pose data in the first motion trail data other than the initial time t0, T_OP_t_j is the pose data in the second motion trail data other than the initial time t0, and j is an integer greater than 0. Further, the pose conversion relationship is calculated from the initial pose conversion relationship and the other pose conversion relationships.
As an optional implementation manner, when calculating the pose conversion relationship according to the initial pose conversion relationship and the other pose conversion relationships, a residual can be calculated from the pose data corresponding to each piece of timestamp information, and the overall residual is then obtained by summing the residuals of all pose conversion relation terms, as in the following formula (11):
e_Tsum = Σ_{i=1}^{N} e_T_i    (11)
where e_Tsum represents the overall residual, i represents the i-th timestamp information, T_N represents the pose conversion relation corresponding to the N-th timestamp information, e_T_i represents the residual of the i-th pose conversion relation term, and N is an integer greater than 1.
In the embodiments of the present application, e_T_i = [(T_XR_t_i)^(-1) ⊗ T_OP_XR ⊗ T_OP_t_i]_xyz, where e_T_i is the residual of the pose conversion relation term corresponding to the i-th timestamp information, T_XR_t_i is the pose data of the i-th timestamp information in the first motion trail data, T_OP_t_i is the pose data of the i-th timestamp information in the second motion trail data, ⊗ is quaternion multiplication, -1 is the inversion operation, and xyz denotes the first three terms of the quaternion.
Further, the Jacobian matrix of the overall residual corresponding to the above formula (11) is given by formula (12):
J = ∂e_T / ∂T_OP_XR    (12)
where ∂ represents the partial derivative, R and L represent the right- and left-multiplication linear transfer functions in the Jacobian matrix, e_T is the residual of the pose conversion relation term corresponding to each piece of timestamp information, T_XR is the pose data of each piece of timestamp information in the first motion trail data, T_OP is the pose data of each piece of timestamp information in the second motion trail data, ⊗ is quaternion multiplication, -1 is the inversion operation, and T_OP_XR is the pose conversion relation between the first motion trail data and the second motion trail data.
Furthermore, the Jacobian matrix is iteratively optimized using the Gauss-Newton method, and when the overall residual is minimized, the final pose conversion relation (coordinate system conversion relation) is obtained, namely: T_OP_XR = arg min e_Tsum.
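The patent minimizes the overall quaternion residual with Gauss-Newton iterations. As a simplified, closed-form stand-in that operates on the position part only, the rigid alignment between matched trajectory samples can be sketched with the Kabsch algorithm (all names illustrative; this is not the patent's optimizer):

```python
import numpy as np

def align_trajectories(p_xr, p_op):
    """Closed-form rigid alignment of matched position samples (Kabsch).

    Returns R, t such that R @ p_xr[i] + t best matches p_op[i] in the
    least-squares sense -- the position-only analogue of the pose
    conversion T = [R, t].
    """
    p_xr = np.asarray(p_xr, dtype=float)
    p_op = np.asarray(p_op, dtype=float)
    mu_xr, mu_op = p_xr.mean(axis=0), p_op.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (p_xr - mu_xr).T @ (p_op - mu_op)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det = +1, no reflection).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_op - R @ mu_xr
    return R, t
```

Because the solution is closed-form it needs no iteration or Jacobian, at the cost of ignoring the attitude residuals that the Gauss-Newton formulation also minimizes.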
Further, it should be considered that, when the positioning accuracy of the positioning device is actually determined, the respective body coordinate systems of the positioning device and the positioning rigid body do not completely coincide. Therefore, the present application also needs to determine the body coordinate system conversion relationship between the body coordinate system of the positioning device and the body coordinate system of the positioning rigid body, and then determine the coordinate system conversion relationship based on this and the conversion relationship between the body coordinate system of the positioning device and the world coordinate system of the external tracking system.
When determining the conversion relationship between the body coordinate system of the positioning device and the body coordinate system of the positioning rigid body, the position data in the pose data on the second motion trail data of the positioning rigid body can be adjusted to the identity matrix I. Then, the body coordinate system conversion relationship is determined based on the adjusted second motion trail data and the converted first motion trail data.
It should be noted that, in the present application, determining the body coordinate system conversion relationship based on the adjusted second motion trail data and the converted first motion trail data is similar or identical in implementation principle to determining the conversion relationship between the body coordinate system of the positioning device and the world coordinate system corresponding to the external tracking system; reference may be made to that description, and details are not repeated here.
In an embodiment of the present application, a body coordinate system of the positioning device may be determined based on the first motion trajectory data of the positioning device.
Specifically, determining the coordinate system conversion relationship according to the body coordinate system conversion relationship and the conversion relationship between the body coordinate system of the positioning device and the world coordinate system of the external tracking system can be implemented by the following formula (13):
T_aligned = T2 * T_XR * T1    (13)
where T_aligned is the converted first motion trail data, T2 is the conversion relationship between the body coordinate system of the positioning device and the world coordinate system of the external tracking system, T_XR is the pose data in the body coordinate system of the positioning device, and T1 is the body coordinate system conversion relationship.
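Formula (13) is a chain of rigid transforms; with 4x4 homogeneous matrices it can be sketched as follows (helper names are illustrative):

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def compose_aligned(T2, T_XR, T1):
    """Formula (13): T_aligned = T2 * T_XR * T1 as 4x4 homogeneous transforms."""
    return T2 @ T_XR @ T1
```

Matrix order matters: T1 (body-to-body correction) is applied first, then the device pose T_XR, then the world-frame conversion T2.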
S205, converting the first motion trail data into a coordinate system of the second motion trail data according to the coordinate system conversion relation.
S206, determining the positioning precision of the positioning device according to the converted first movement track data and second movement track data.
According to the positioning accuracy determining method of the positioning device, the first motion track data of the positioning device and the second motion track data of the positioning rigid body are obtained, so that the coordinate system conversion relation is determined according to the first motion track data and the second motion track data, the first motion track data are converted into the coordinate system of the second motion track data according to the coordinate system conversion relation, and then the positioning accuracy of the positioning device is determined according to the converted first motion track data and the second motion track data. According to the method and the device, the motion track data of the positioning device are converted into the coordinate system of the motion track data of the positioning rigid body, so that the positioning accuracy of the positioning device is determined by utilizing the two motion track data in the same coordinate system, the checking accuracy of the positioning device is improved, and the checking cost of the positioning accuracy of the positioning device is reduced.
On the basis of the above embodiments, the embodiments of the present application further explain determining the positioning accuracy of the positioning device according to the converted first motion trajectory data and the second motion trajectory data. The above-described optimization process is specifically described below with reference to fig. 7.
As shown in fig. 7, the method may include the steps of:
s301, acquiring first motion trail data of a positioning device and second motion trail data of a positioning rigid body.
S302, determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data.
S303, converting the first motion trail data into a coordinate system of the second motion trail data according to the coordinate system conversion relation.
S304, determining the position error and the posture error of the converted first motion track data according to the converted first motion track data and the second motion track data.
S305, determining the positioning accuracy of the positioning device according to the position error and the attitude error of the converted first motion trail data.
Optionally, determining the position error and the attitude error of the converted first motion trail data according to the converted first motion trail data and the second motion trail data may be implemented by the following formulas (14) and (15):
P_error = P_aligned - P_op    (14)
where P_error is the position error, P_aligned is the position data in the converted first motion trail data, and P_op is the position data in the second motion trail data.
Q_error = Q_aligned ⊗ Q_OP^(-1)    (15)
where Q_error is the attitude error, Q_aligned is the attitude data in the converted first motion trail data, Q_OP is the attitude data in the second motion trail data, ⊗ is quaternion multiplication, and -1 is the inversion operation.
Further, considering that each item of pose data in the motion trail data is three-dimensional, after the position error and the attitude error of the converted first motion trail data are obtained, the position average error and the attitude average error can be calculated, which can be specifically realized by the following formulas (16) to (18):
vec_P_X = (1/N) * Σ P_error_X,  vec_P_Y = (1/N) * Σ P_error_Y,  vec_P_Z = (1/N) * Σ P_error_Z    (16)
where vec_P_X is the position average error of the X axis, vec_P_Y is the position average error of the Y axis, vec_P_Z is the position average error of the Z axis, N is the number of position data in the converted first motion trail data, N being an integer greater than 1, P_error_X is the position error of the X axis, P_error_Y is the position error of the Y axis, and P_error_Z is the position error of the Z axis.
Since the attitude error in quaternion form corresponds to a Yaw angle, a Pitch angle and a Roll angle, the relationship between the quaternion attitude error and the Euler angles is shown in the following formula (17):
[Yaw_error, Pitch_error, Roll_error] = Qust2Angle(Q_error)    (17)
where Yaw_error is the yaw angle error, Pitch_error is the pitch angle error, Roll_error is the roll angle error, and Qust2Angle is the conversion from attitude data in quaternion form to Euler angles.
Further, based on the above formula (17), the attitude average error can be calculated using formula (18):
vec_yaw = (1/N) * Σ Yaw_error,  vec_pitch = (1/N) * Σ Pitch_error,  vec_roll = (1/N) * Σ Roll_error    (18)
where vec_yaw is the average yaw angle error, vec_pitch is the average pitch angle error, vec_roll is the average roll angle error, N is the number of pose data in the converted first motion trail data, N being an integer greater than 1, Yaw_error is the yaw angle error, Pitch_error is the pitch angle error, and Roll_error is the roll angle error.
According to the positioning accuracy determining method of the positioning device, the first motion track data of the positioning device and the second motion track data of the positioning rigid body are obtained, so that the coordinate system conversion relation is determined according to the first motion track data and the second motion track data, the first motion track data are converted into the coordinate system of the second motion track data according to the coordinate system conversion relation, and then the positioning accuracy of the positioning device is determined according to the converted first motion track data and the second motion track data. According to the method and the device, the motion track data of the positioning device are converted into the coordinate system of the motion track data of the positioning rigid body, so that the positioning accuracy of the positioning device is determined by utilizing the two motion track data in the same coordinate system, the checking accuracy of the positioning device is improved, and the checking cost of the positioning accuracy of the positioning device is reduced.
A positioning accuracy determining apparatus of a positioning apparatus according to an embodiment of the present application will be described below with reference to fig. 8. Fig. 8 is a schematic block diagram of a positioning accuracy determining apparatus of a positioning apparatus according to an embodiment of the present application.
The positioning accuracy determining device 400 of the positioning device includes: a track acquisition module 410, a relationship determination module 420, a track conversion module 430, and an accuracy determination module 440.
The track acquisition module 410 is configured to acquire first motion track data of the positioning device and second motion track data of a positioning rigid body;
a relationship determining module 420, configured to determine a coordinate system conversion relationship according to the first motion trajectory data and the second motion trajectory data;
the track conversion module 430 is configured to convert the first motion track data into a coordinate system of the second motion track data according to the coordinate system conversion relationship;
the accuracy determining module 440 is configured to determine positioning accuracy of the positioning device according to the converted first motion trajectory data and the second motion trajectory data.
In an optional implementation manner of the embodiment of the present application, the relationship determining module 420 includes:
The synchronization unit is used for carrying out time synchronization processing on the first motion trail data and the second motion trail data to obtain synchronized first motion trail data and synchronized second motion trail data;
the interpolation unit is used for carrying out interpolation processing on the synchronized first motion trail data and the synchronized second motion trail data to obtain first motion trail data and second motion trail data with the same frequency;
and the first determining unit is used for determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data with the same frequency.
In an optional implementation manner of the embodiment of the present application, the synchronization unit is specifically configured to:
determining a target timestamp difference value according to the first motion trail data and the second motion trail data;
and carrying out time synchronization processing on the first motion trail data and the second motion trail data according to the target time stamp difference value to obtain synchronized first motion trail data and synchronized second motion trail data.
In an optional implementation manner of the embodiment of the present application, the synchronization unit is further configured to:
acquiring first starting time stamp information and first ending time stamp information from the first motion trail data, and acquiring second starting time stamp information and second ending time stamp information from the second motion trail data;
Determining a starting time stamp difference value according to the first starting time stamp information and the second starting time stamp information;
determining an end timestamp difference value according to the first end timestamp information and the second end timestamp information;
and determining a target timestamp difference value according to the starting timestamp difference value and the ending timestamp difference value.
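The synchronization steps above can be sketched in Python. The patent does not fix how the start and end differences are combined into the target timestamp difference; taking their mean is an assumption made here, and all names are illustrative:

```python
# Illustrative sketch of the time-synchronization step. Averaging the start
# and end timestamp differences into one target difference is an assumption;
# the patent only states that the target difference is derived from both.
def synchronize(traj_a, traj_b):
    """traj_a, traj_b: lists of (timestamp, pose) tuples sorted by timestamp.

    Returns traj_a unchanged and traj_b shifted onto traj_a's time base.
    """
    start_diff = traj_a[0][0] - traj_b[0][0]     # start timestamp difference
    end_diff = traj_a[-1][0] - traj_b[-1][0]     # end timestamp difference
    target_diff = (start_diff + end_diff) / 2.0  # target timestamp difference
    shifted_b = [(t + target_diff, pose) for t, pose in traj_b]
    return traj_a, shifted_b
```

For example, if the device trajectory starts at 10.0 s and the rigid-body trajectory at 0.0 s, every rigid-body timestamp is shifted by the computed offset so both trajectories share one time base.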
In an optional implementation of the embodiment of the present application, the interpolation unit is specifically configured to:
sequentially extract timestamp information from the synchronized second motion trail data;
determine, from the timestamp information of the synchronized first motion trail data, a target timestamp interval containing the extracted timestamp information;
and perform interpolation processing on the synchronized second motion trail data according to the timestamp information and the target timestamp interval to obtain first motion trail data and second motion trail data with the same frequency.
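Locating the target timestamp interval that brackets an extracted timestamp can be done with a binary search over the first trajectory's sorted timestamps. The helper below is a hypothetical sketch, not an implementation from the patent, and assumes the query timestamp lies strictly inside the overall timestamp range:

```python
import bisect

def find_target_interval(timestamps, t):
    """Return the (lower, upper) pair from the sorted list `timestamps`
    that brackets the query timestamp t.

    Assumes timestamps is sorted and t lies inside its overall range;
    the function name is illustrative, not taken from the patent.
    """
    i = bisect.bisect_right(timestamps, t)  # first index with value > t
    return timestamps[i - 1], timestamps[i]
```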
In an optional implementation of the embodiment of the present application, the interpolation unit is further configured to:
determine the maximum value in the target timestamp interval as a target start value and the minimum value as a target end value;
and perform interpolation processing on the synchronized second motion trail data according to the timestamp information, the target start value, the target end value, the pose data corresponding to the target start value, and the pose data corresponding to the target end value.
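A minimal sketch of interpolating a pose within the target timestamp interval, under the assumption of simple linear blending of pose components. The patent does not specify the interpolation formula, and a real implementation would normally interpolate quaternion orientations with slerp rather than component-wise:

```python
# Assumed linear interpolation of pose components between the two interval
# endpoints; position components interpolate correctly this way, while
# quaternion orientations would normally need slerp instead.
def lerp_pose(t, t_start, t_end, pose_start, pose_end):
    alpha = (t - t_start) / (t_end - t_start)  # normalized position in interval
    return tuple(p0 + alpha * (p1 - p0) for p0, p1 in zip(pose_start, pose_end))
```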
In an optional implementation of the embodiment of the present application, the first determining unit is specifically configured to:
acquire pose data sets corresponding to the same timestamp information from the first motion trail data with the same frequency and the second motion trail data with the same frequency, wherein there are a plurality of pose data sets;
and determine the coordinate system conversion relation according to the pose data sets.
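Determining a rigid coordinate-system conversion from paired pose samples is commonly done with a closed-form least-squares alignment such as the Kabsch/Umeyama algorithm. The patent does not name the method, so the sketch below is one assumed realization that uses only the position parts of the pose pairs:

```python
import numpy as np

def estimate_transform(src_pts, dst_pts):
    """Least-squares rotation R and translation t such that dst ≈ R @ src + t,
    via the Kabsch/Umeyama algorithm. src_pts/dst_pts: Nx3 paired positions
    sampled at the same timestamps (an assumed realization, not the patent's
    stated method)."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_src).T @ (dst - mu_dst)       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_dst - R @ mu_src
    return R, t
```

With at least three non-collinear point pairs (and noiseless data) the rotation and translation are recovered exactly; with noisy tracking data the result is the least-squares best fit.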
In an optional implementation of the embodiment of the present application, the apparatus 400 further includes:
an adjusting module configured to adjust the position data in the second motion trail data with the same frequency.
In an optional implementation of the embodiment of the present application, the precision determining module 440 is specifically configured to:
determine a position error and an attitude error of the converted first motion trail data according to the converted first motion trail data and the second motion trail data;
and determine the positioning accuracy of the positioning device according to the position error and the attitude error of the converted first motion trail data.
In an optional implementation of the embodiment of the present application, the accuracy determining module 440 is further configured to:
determine an average position error according to the converted first motion trail data and the position error;
and determine an average attitude error according to the converted first motion trail data and the attitude error.
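The error computation described above can be sketched as follows, under the simplifying assumption that each pose is a 3-D position plus a single scalar orientation angle (the patent works with full pose data; the reduced representation and all names here are illustrative):

```python
import numpy as np

def mean_errors(converted_poses, reference_poses):
    """converted_poses/reference_poses: equal-length lists of (position, angle)
    samples, where position is a 3-vector and angle a scalar orientation in
    degrees -- an assumed simplified pose representation.

    Returns (average position error, average attitude error).
    """
    pos_errs = [np.linalg.norm(np.subtract(p1, p2))
                for (p1, _), (p2, _) in zip(converted_poses, reference_poses)]
    att_errs = [abs(a1 - a2)
                for (_, a1), (_, a2) in zip(converted_poses, reference_poses)]
    return float(np.mean(pos_errs)), float(np.mean(att_errs))
```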
In an optional implementation of the embodiment of the present application, the positioning rigid body includes at least four marker points, and the positioning rigid body is disposed on the positioning device;
accordingly, the track acquisition module 410 is specifically configured to:
acquire an optical signal reflected by each marker point;
determine pose data of the positioning rigid body according to the optical signal reflected by each marker point;
and acquire the second motion trail data of the positioning rigid body according to the pose data of the positioning rigid body.
According to the positioning accuracy determining device of the positioning device provided in this application, first motion trail data of the positioning device and second motion trail data of the positioning rigid body are acquired; a coordinate system conversion relation is determined from the two sets of trail data; the first motion trail data are converted into the coordinate system of the second motion trail data according to this relation; and the positioning accuracy of the positioning device is then determined from the converted first motion trail data and the second motion trail data. By converting the motion trail data of the positioning device into the coordinate system of the motion trail data of the positioning rigid body, the positioning accuracy is determined from two trails expressed in the same coordinate system, which improves the accuracy of the check and reduces the cost of verifying the positioning accuracy of the positioning device.
It should be understood that the apparatus embodiments correspond to the method embodiments, so similar descriptions can be found in the method embodiments; to avoid repetition, they are not repeated here. Specifically, the apparatus 400 shown in fig. 8 may perform the method embodiment corresponding to fig. 1, and the foregoing and other operations and/or functions of the modules in the apparatus 400 implement the corresponding flows of the method in fig. 1; for brevity, they are not described again here.
The apparatus 400 of the embodiments of the present application is described above in terms of functional modules in connection with the accompanying drawings. It should be understood that these functional modules may be implemented in hardware, by instructions in software, or by a combination of hardware and software modules. Specifically, the steps of the method embodiments in the embodiments of the present application may be completed by integrated logic circuits of hardware in a processor and/or by instructions in software form; the steps of the methods disclosed in connection with the embodiments of the present application may be directly performed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. Optionally, the software modules may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or registers. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method embodiments in combination with its hardware.
Fig. 9 is a schematic block diagram of an electronic device provided in an embodiment of the present application.
As shown in fig. 9, the electronic device 500 may include:
a memory 510 and a processor 520, the memory 510 being configured to store a computer program and to transmit the program code to the processor 520. In other words, the processor 520 may call and run the computer program from the memory 510 to implement the positioning accuracy determining method of the positioning device in the embodiments of the present application.
For example, the processor 520 may be configured to perform the above-described method embodiments according to instructions in the computer program.
In some embodiments of the present application, the processor 520 may include, but is not limited to:
a general purpose processor, digital signal processor (Digital Signal Processor, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like.
In some embodiments of the present application, the memory 510 includes, but is not limited to:
volatile memory and/or nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DR RAM).
In some embodiments of the present application, the computer program may be partitioned into one or more modules that are stored in the memory 510 and executed by the processor 520 to perform the methods provided herein. The one or more modules may be a series of computer program instruction segments capable of performing the specified functions, which are used to describe the execution of the computer program in the electronic device.
As shown in fig. 9, the electronic device may further include:
a transceiver 530, the transceiver 530 being connectable to the processor 520 or the memory 510.
The processor 520 may control the transceiver 530 to communicate with other devices; specifically, it may send information or data to other devices, or receive information or data sent by other devices. The transceiver 530 may include a transmitter and a receiver, and may further include one or more antennas.
It will be appreciated that the various components in the electronic device are connected by a bus system that includes, in addition to a data bus, a power bus, a control bus, and a status signal bus.
As an alternative implementation, in an embodiment of the present application, when the electronic device is an HMD, the embodiment of the present application provides a schematic block diagram of the HMD, as shown in fig. 10.
As shown in fig. 10, the main functional modules of the HMD600 may include, but are not limited to, the following: the detection module 610, the feedback module 620, the sensor 630, the control module 640, the modeling module 650.
The detection module 610 is configured to detect the user's operation commands by using various sensors and apply them to the virtual environment, for example, continuously updating the image displayed on the display screen to follow the user's line of sight, thereby realizing interaction between the user and the virtual scene.
The feedback module 620 is configured to receive data from the sensors and provide real-time feedback to the user. For example, the feedback module 620 may generate a feedback instruction based on the user operation data and output the feedback instruction.
The sensor 630 is configured, on the one hand, to accept operation commands from the user and apply them to the virtual environment, and, on the other hand, to provide the results generated by those operations to the user in the form of various kinds of feedback.
The control module 640 is configured to control sensors and various input/output devices, including obtaining user data such as motion, voice, etc., and outputting sensory data such as images, vibrations, temperature, sounds, etc., to affect the user, virtual environment, and the real world. For example, the control module 640 may obtain user gestures, voice, and the like.
The modeling module 650 is configured to construct a three-dimensional model of the virtual environment, and may also include various feedback mechanisms of sound, touch, etc. in the three-dimensional model.
It should be appreciated that the various functional modules in the HMD600 are connected by a bus system that includes, in addition to a data bus, a power bus, a control bus, a status signal bus, and the like.
The present application also provides a computer storage medium having stored thereon a computer program which, when executed by a computer, enables the computer to perform the method of the above-described method embodiments. Alternatively, embodiments of the present application also provide a computer program product comprising instructions which, when executed by a computer, cause the computer to perform the method of the method embodiments described above.
When implemented in software, the above embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)).
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules illustrated as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. For example, functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A positioning accuracy determining method of a positioning device, comprising:
acquiring first motion trail data of the positioning device and second motion trail data of a positioning rigid body;
Determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data;
converting the first motion trail data into a coordinate system of the second motion trail data according to the coordinate system conversion relation;
and determining the positioning accuracy of the positioning device according to the converted first motion trail data and the second motion trail data.
2. The method of claim 1, wherein determining a coordinate system conversion relationship from the first motion profile data and the second motion profile data comprises:
performing time synchronization processing on the first motion trail data and the second motion trail data to obtain synchronized first motion trail data and synchronized second motion trail data;
performing interpolation processing on the synchronized first motion trail data and the synchronized second motion trail data to obtain first motion trail data and second motion trail data with the same frequency;
and determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data with the same frequency.
3. The method of claim 2, wherein performing time synchronization processing on the first motion profile data and the second motion profile data to obtain synchronized first motion profile data and synchronized second motion profile data, comprises:
Determining a target timestamp difference value according to the first motion trail data and the second motion trail data;
and carrying out time synchronization processing on the first motion trail data and the second motion trail data according to the target time stamp difference value to obtain synchronized first motion trail data and synchronized second motion trail data.
4. A method according to claim 3, wherein determining a target timestamp difference from the first motion profile data and the second motion profile data comprises:
acquiring first starting time stamp information and first ending time stamp information from the first motion trail data, and acquiring second starting time stamp information and second ending time stamp information from the second motion trail data;
determining a starting time stamp difference value according to the first starting time stamp information and the second starting time stamp information;
determining an end timestamp difference value according to the first end timestamp information and the second end timestamp information;
and determining a target timestamp difference value according to the starting timestamp difference value and the ending timestamp difference value.
5. The method of claim 2, wherein interpolating the synchronized first motion profile data and the synchronized second motion profile data to obtain first motion profile data and second motion profile data of the same frequency comprises:
Sequentially extracting time stamp information from the synchronized second motion trail data;
determining a target time stamp section comprising the time stamp information from the time stamp information of the synchronized first motion trail data according to the time stamp information;
and carrying out interpolation processing on the synchronized second motion trail data according to the timestamp information and the target timestamp interval to obtain first motion trail data and second motion trail data with the same frequency.
6. The method of claim 5, wherein interpolating the synchronized second motion profile data based on the timestamp information and the target timestamp interval comprises:
determining the maximum value in the target timestamp interval as a target start value and the minimum value as a target end value;
and carrying out interpolation processing on the synchronized second motion trail data according to the timestamp information, the target start value, the target end value, pose data corresponding to the target start value and pose data corresponding to the target end value.
7. The method of claim 2, wherein determining the coordinate system conversion relationship from the first motion profile data and the second motion profile data of the same frequency comprises:
acquiring pose data sets corresponding to the same time stamp information from the first motion trail data with the same frequency and the second motion trail data with the same frequency, wherein there are a plurality of pose data sets;
and determining a coordinate system conversion relation according to the pose data set.
8. The method according to claim 7, further comprising, before acquiring the pose data set corresponding to the same time stamp information from the first motion trajectory data of the same frequency and the second motion trajectory data of the same frequency:
and adjusting the position data in the second motion trail data with the same frequency.
9. The method according to any one of claims 1-8, wherein determining the positioning accuracy of the positioning device from the converted first motion profile data and the second motion profile data comprises:
determining a position error and an attitude error of the converted first motion trail data according to the converted first motion trail data and the second motion trail data;
and determining the positioning accuracy of the positioning device according to the position error and the attitude error of the converted first motion trail data.
10. The method as recited in claim 9, further comprising:
determining an average position error according to the converted first motion trail data and the position error;
and determining an average attitude error according to the converted first motion trail data and the attitude error.
11. The method of claim 1, wherein the positioning rigid body comprises at least four marker points and the positioning rigid body is disposed on the positioning device;
correspondingly, acquiring second motion trail data of the positioning rigid body comprises:
acquiring an optical signal reflected by each marker point;
determining pose data of the positioning rigid body according to the optical signal reflected by each marker point;
and acquiring second motion trail data of the positioning rigid body according to the pose data of the positioning rigid body.
12. A positioning accuracy determining apparatus of a positioning apparatus, comprising:
the track acquisition module is used for acquiring first motion track data of the positioning device and second motion track data of the positioning rigid body;
the relation determining module is used for determining a coordinate system conversion relation according to the first motion trail data and the second motion trail data;
The track conversion module is used for converting the first motion track data into a coordinate system of the second motion track data according to the coordinate system conversion relation;
and the precision determining module is used for determining the positioning precision of the positioning device according to the converted first motion trail data and the second motion trail data.
13. An electronic device, comprising:
a processor and a memory for storing a computer program, the processor being for invoking and running the computer program stored in the memory to perform the positioning accuracy determination method of the positioning device of any of claims 1 to 11.
14. A computer-readable storage medium storing a computer program for causing a computer to execute the positioning accuracy determining method of the positioning device according to any one of claims 1 to 11.
15. A computer program product comprising program instructions which, when run on an electronic device, cause the electronic device to perform the positioning accuracy determination method of the positioning apparatus according to any of claims 1 to 11.
CN202211117672.1A 2022-09-14 2022-09-14 Positioning accuracy determining method, device, equipment and medium for positioning device Pending CN117740025A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211117672.1A CN117740025A (en) 2022-09-14 2022-09-14 Positioning accuracy determining method, device, equipment and medium for positioning device

Publications (1)

Publication Number Publication Date
CN117740025A true CN117740025A (en) 2024-03-22

Family

ID=90254997

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211117672.1A Pending CN117740025A (en) 2022-09-14 2022-09-14 Positioning accuracy determining method, device, equipment and medium for positioning device

Country Status (1)

Country Link
CN (1) CN117740025A (en)

Similar Documents

Publication Publication Date Title
CN110457414A (en) Offline map processing, virtual objects display methods, device, medium and equipment
CN104536579A (en) Interactive three-dimensional scenery and digital image high-speed fusing processing system and method
CN110780742B (en) Eyeball tracking processing method and related device
US20200097068A1 (en) Method and apparatus for providing immersive reality content
WO2020044949A1 (en) Information processing device, information processing method, and program
CN115131528A (en) Virtual reality scene determination method, device and system
CN106802716B (en) Data processing method of virtual reality terminal and virtual reality terminal
KR102176805B1 (en) System and method for providing virtual reality contents indicated view direction
CN117689826A (en) Three-dimensional model construction and rendering method, device, equipment and medium
CN117740025A (en) Positioning accuracy determining method, device, equipment and medium for positioning device
CN112927718B (en) Method, device, terminal and storage medium for sensing surrounding environment
WO2021262376A1 (en) Motion matching for vr full body reconstruction
CN117664173A (en) Calibration method, device, equipment and medium of motion capture equipment
CN117351090A (en) Calibration method, device, equipment and system for light-emitting unit and camera
US20240013404A1 (en) Image processing method and apparatus, electronic device, and medium
US20240169568A1 (en) Method, device, and computer program product for room layout
CN118115653A (en) Three-dimensional scene reconstruction method, device, equipment and medium
CN117742555A (en) Control interaction method, device, equipment and medium
CN117130465A (en) Parameter setting method, device, equipment and storage medium based on XR equipment
CN118115592A (en) Target object calibration method, device, equipment and medium
CN118533144A (en) Pose determination method, pose determination device, pose determination medium, pose determination program, and pose determination program
CN118363450A (en) Motion capture method, motion capture device, motion capture apparatus, motion capture medium, and motion capture program product
CN118170248A (en) Data processing method, device, equipment, medium and product
CN117666852A (en) Method, device, equipment and medium for determining target object in virtual reality space
CN118819286A (en) Ray generation method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination