CN115311353B - Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system - Google Patents

Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system

Info

Publication number
CN115311353B
CN115311353B (application CN202211036999.6A)
Authority
CN
China
Prior art keywords
handle
pose
camera
helmet
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211036999.6A
Other languages
Chinese (zh)
Other versions
CN115311353A (en)
Inventor
朱张豪
费越
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Play Out Dreams Shanghai Technology Co ltd
Original Assignee
Play Out Dreams Shanghai Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Play Out Dreams Shanghai Technology Co ltd filed Critical Play Out Dreams Shanghai Technology Co ltd
Priority to CN202211036999.6A priority Critical patent/CN115311353B/en
Publication of CN115311353A publication Critical patent/CN115311353A/en
Application granted granted Critical
Publication of CN115311353B publication Critical patent/CN115311353B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/251 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a multi-sensor multi-handle controller graph optimization tight coupling tracking method and system. The method comprises the following steps: acquiring helmet tracking data and handle tracking data; determining the camera pose of the multi-view camera coordinate system in the world coordinate system from the helmet inertial navigation data through a visual+IMU tightly coupled SLAM mode; determining the handle pose in the world coordinate system with reference to the pose of camera No. 0; constructing a system state quantity based on the handle pose, and performing IMU pre-integration from the last system state quantity and the handle inertial navigation data to determine the initial predicted pose of the handle; according to the initial predicted pose of the handle, extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates, and determining the initial value of the system state quantity; determining the current system state from this initial value by a tightly coupled BA graph optimization mode; and, according to the current system state, having the handle continuously output the 6DoF pose. The invention achieves handle tracking that remains stable and is not lost under high-speed motion.

Description

Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system
Technical Field
The invention relates to the technical field of SLAM (simultaneous localization and mapping), and in particular to a multi-sensor multi-handle controller graph optimization tight coupling tracking method and system.
Background
Visual simultaneous localization and mapping (SLAM) refers to creating a map from images acquired by cameras in a completely unknown environment, while the robot's own position is uncertain, and at the same time using that map for autonomous positioning and navigation.
Meanwhile, in the VR/AR/MR field, a multi-view camera is mostly used to observe marker points of special optical patterns built into a handle controller, such as infrared LED light points. Combined with the inertial measurement unit (IMU) built into the handle, computer vision techniques, chiefly SLAM, recover the motion state of the handle controller in space, such as its position, attitude and velocity, together with other system state quantity information. The motion state comprising position and attitude is commonly called the 6 degrees of freedom (6DoF) pose.
First, some problems of the prior art: some schemes fuse electromagnetic data with the handle's built-in IMU. Although this makes it easier to obtain an accurate attitude and gravity direction, objects that interfere with electromagnetic data, such as iron products, can be present within the actual use scene, so the tracking result after fusing electromagnetic data is unstable, while hardware cost and software compute increase instead.
Secondly, when existing handle tracking methods track the system state information of the handle, the following factors easily degrade tracking performance. Since the camera is sensitive to ambient light, the complexity of the ambient light directly affects the imaging quality of the camera, and thereby the tracking performance of the handle. In actual use, the hand-held handle is held at varying angles, so several marker points may overlap or stick together in the image, further affecting tracking. When the handle is swung at high speed, with accelerations that can exceed a hundred meters per second squared during an abrupt stop, the deviation between the imaged 2D coordinate position uv of the marker points and the predicted position becomes large, so handle tracking fails and stuttering occurs. Existing handle controller pose tracking methods therefore have many limitations, producing drift, jitter, stutter and similar phenomena in the virtual scene, degrading the user experience, and struggling in applications with high demands on handle tracking performance, such as the Expert mode of Beat Saber in VR.
There are many reasons for the instability of such tracking methods: inaccurate pose estimates caused by the loose coupling techniques common in the existing handle field, IMU initialization that is neither accurate nor fast, or tightly coupled filter implementations that fail to provide a pose of sufficient accuracy.
Disclosure of Invention
The invention aims to provide a multi-sensor multi-handle controller graph optimization tight coupling tracking method and system, which are used for solving the problem of instability of the tracking method.
In order to achieve the above object, the present invention provides the following solutions:
a multi-sensor multi-handle controller graph optimization tightly coupled tracking method, comprising:
acquiring helmet tracking data and handle tracking data; the helmet tracking data comprise images shot by a plurality of cameras and helmet inertial navigation data; the handle tracking data comprise images shot by the multi-camera and handle inertial navigation data;
determining the camera pose of the multi-view camera coordinate system under the world coordinate system according to the helmet inertial navigation data by a visual+IMU tightly coupled SLAM mode;
determining the pose of the handle under the world coordinate system by referring to the pose of the camera No. 0;
constructing a system state quantity based on the handle pose, and carrying out IMU pre-integration according to the last system state quantity and the handle inertial navigation data to determine an initial predicted pose of the handle; the system state quantity comprises a 3D vector position of a handle, a 3D vector speed, a gyroscope deviation and an accelerometer deviation;
extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates according to the initial predicted pose of the handle, and determining a system state quantity initial value;
determining the current system state by adopting a tightly coupled BA graph optimization mode according to the initial value of the system state quantity; the current state of the system is a low-frequency handle state;
and according to the current system state, enabling the handle to continuously output the 6DoF pose.
Optionally, the determining, by the SLAM manner of tight coupling of vision and IMU, the camera pose of the multi-camera coordinate system in the world coordinate system according to the helmet inertial navigation data specifically includes:
calibrating internal parameters of the multi-camera and external parameters among the multi-camera;
acquiring external parameters between a No. 0 camera and a helmet IMU sensor, delay of the helmet IMU sensor relative to the multi-camera and internal parameters of the helmet IMU sensor;
and determining the camera pose of the multi-camera coordinate system under the world coordinate system through a SLAM mode of visual + IMU tight coupling according to the internal parameters of the multi-camera, the external parameters between the No. 0 camera and the helmet IMU sensor, the delay of the helmet IMU sensor relative to the multi-camera and the internal parameters of the helmet IMU sensor.
Optionally, the constructing a system state quantity based on the handle pose, and performing IMU pre-integration according to the last system state quantity and the inertial navigation data of the handle, to determine an initial predicted pose of the handle, specifically includes:
pre-integrating the gyroscope 3d data and the accelerometer 3d data of the handle IMU sensor, determining the rotation amount, the translation amount and the speed variation of the current moment relative to the previous moment, and updating the gyroscope deviation and the accelerometer deviation;
an initial predicted pose of the handle is determined based on the updated gyroscope bias and the accelerometer bias.
Optionally, the step of extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates according to the initial predicted pose of the handle to determine the initial value of the system state quantity specifically includes:
and extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates by utilizing an N-point perspective algorithm according to the initial predicted pose of the handle, and determining the initial value of the system state quantity.
A multi-sensor multi-handle controller graph optimization tightly coupled tracking system, comprising:
the tracking data acquisition module is used for acquiring helmet tracking data and handle tracking data; the helmet tracking data comprise images shot by a plurality of cameras and helmet inertial navigation data; the handle tracking data comprise images shot by the multi-camera and handle inertial navigation data;
the camera pose determining module is used for determining the camera pose of the multi-camera coordinate system under the world coordinate system according to the helmet inertial navigation data in a SLAM mode of tight coupling of vision and IMU;
the handle pose determining module is used for determining the handle pose of the handle under the world coordinate system by referring to the camera pose of the No. 0 camera;
the handle initial predicting pose determining module is used for constructing a system state quantity based on the handle pose, and carrying out IMU pre-integration according to the last system state quantity and the handle inertial navigation data to determine the handle initial predicting pose; the system state quantity comprises a 3D vector position of a handle, a 3D vector speed, a gyroscope deviation and an accelerometer deviation;
the system state quantity initial value determining module is used for extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates according to the initial predicted pose of the handle to determine a system state quantity initial value;
the system current state determining module is used for determining the current system state by adopting a tightly coupled BA graph optimization mode according to the system state quantity initial value; the current state of the system is a low-frequency handle state;
and the 6DoF pose output module is used for enabling the handle to continuously output the 6DoF pose according to the current system state.
Optionally, the camera pose determining module specifically includes:
the calibration unit is used for calibrating the internal parameters of the multi-camera and the external parameters among the multi-camera;
the parameter acquisition unit is used for acquiring external parameters between the No. 0 camera and the helmet IMU sensor, delay of the helmet IMU sensor relative to the multi-camera and internal parameters of the helmet IMU sensor;
the camera pose determining unit is used for determining the camera pose of the multi-camera coordinate system under the world coordinate system through a SLAM mode of visual + IMU tight coupling according to the internal parameters of the multi-camera, the external parameters between the No. 0 camera and the helmet IMU sensor, the delay of the helmet IMU sensor relative to the multi-camera and the internal parameters of the helmet IMU sensor.
Optionally, the handle initial predicted pose determining module specifically includes:
the updating unit is used for pre-integrating the gyroscope 3d data and the accelerometer 3d data of the handle IMU sensor, determining the rotation amount, the translation amount and the speed variation of the current moment relative to the previous moment, and updating the gyroscope deviation and the accelerometer deviation;
and the handle initial predicted pose determining unit is used for determining the handle initial predicted pose according to the updated gyroscope deviation and the accelerometer deviation.
Optionally, the system state quantity initial value determining module specifically includes:
and the system state quantity initial value determining unit is used for extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates by utilizing an N-point perspective algorithm according to the initial predicted pose of the handle, and determining the system state quantity initial value.
According to the specific embodiments provided by the invention, the invention discloses the following technical effects: the invention provides a multi-sensor multi-handle controller graph optimization tight coupling tracking method and system, which determine the current system state using a tightly coupled BA graph optimization mode for multi-sensor multi-handle tracking and optimize the system states at multiple times, making the current system state quantity more accurate. The light-spot coordinates can therefore be stably and accurately extracted and matched each time, which reduces the computation of front-end point extraction and increases its robustness; consequently, handle tracking remains stable and is not lost at high speed, and the root mean square error (RMSE) accuracy can reach millimeter level.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for optimizing tight coupling tracking of a multi-sensor multi-handle controller according to the present invention;
FIG. 2 is a schematic diagram of the connection relationship among the helmet, the multi-view camera, and the helmet IMU sensor according to the present invention;
FIG. 3 is a schematic diagram of the relationship of the handle, optical sensor and handle IMU sensor provided by the present invention;
FIG. 4 is a block diagram of a multi-sensor multi-handle controller diagram optimized close-coupled tracking system provided by the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention aims to provide a multi-sensor multi-handle controller graph optimization tight coupling tracking method and system with which handle tracking remains stable and is not lost at high speed.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
In the SLAM technical field, loose coupling means that only one sensor's measurement error is optimized independently when the pose is optimized, while tight coupling means that the optimization objective function contains all sensor measurement errors; the filter mode and the graph optimization mode are the typical realizations. The filter mode is a two-stage optimization: one part of the data sources of the objective function, such as the IMU sensor data, is used for propagation (Propagate) and augmentation (Augment) to obtain a predicted pose for visual matching, and the remaining data sources, such as the visual data, are used for the update (Update); common algorithms include MSCKF, ESCKF and the like. The graph optimization mode puts the optimization objective function and the states to be optimized together and optimizes them simultaneously, commonly called bundle adjustment (BA) optimization; the optimized states are commonly called nodes, the optimization objective function (objective function for short) is composed of several error function blocks, commonly called edges; common optimization libraries include ceres, g2o and the like, and common algorithms include the Levenberg-Marquardt and Gauss-Newton methods. A further difference is that the state optimized by a filter is always the state at the current time, whereas graph optimization can involve the states at more times and can process a larger amount of data.
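In the notation used later in this description, a tightly coupled graph-optimization objective can be written schematically as follows (a sketch for orientation only; the residual names r_proj and r_IMU and the window size n are generic placeholders, not the patent's exact definitions):

$$\min_{x_0,\dots,x_n}\ \sum_{k}\rho\big(\lVert r_{\mathrm{proj},k}\rVert^2_{\Sigma_{\mathrm{proj},k}}\big)\;+\;\sum_{i}\rho\big(\lVert r_{\mathrm{IMU},i}\rVert^2_{\Sigma_{\mathrm{IMU},i}}\big)$$

where each squared norm is a Mahalanobis distance weighted by the corresponding measurement covariance Σ and ρ is a robust kernel; the states x_0..x_n at multiple times are optimized jointly, which is exactly what distinguishes graph optimization from a filter that keeps only the current state.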
Fig. 1 is a flowchart of a multi-sensor multi-handle controller graph optimizing close-coupling tracking method provided by the invention, and as shown in fig. 1, the multi-sensor multi-handle controller graph optimizing close-coupling tracking method comprises the following steps:
step 101: acquiring helmet tracking data and handle tracking data; the helmet tracking data comprise images shot by a plurality of cameras and helmet inertial navigation data; the handle tracking data comprise images shot by the multi-camera and handle inertial navigation data, fig. 2 is a schematic diagram of connection relation among the helmet, the multi-camera and the helmet IMU sensor provided by the invention, and fig. 3 is a schematic diagram of relation among the handle, the optical sensor and the handle IMU sensor provided by the invention.
In practical application, the steps of acquiring the pose Twc are as follows:
Using open-source or in-house calibration software such as kalibr, calibrate: the intrinsics of the multi-view camera and the extrinsics T_cic0 between the views, where c_i denotes camera No. i; the extrinsics Tbc0 between camera No. 0 and the helmet IMU sensor; the time delay t_d of the IMU relative to the cameras; and the internal parameters of the IMU.
Based on these intrinsics and extrinsics, the pose Twb_slam of the helmet IMU in a world coordinate system W is obtained through open-source or in-house visual+IMU tightly coupled SLAM, where W is a static inertial coordinate system whose gravity direction is exactly aligned with one of the xyz axes. From the calibrated transform Tb_slam_c0 from the helmet IMU to camera No. 0, together with T_cic0, the pose of each camera of the multi-view rig follows as Twc_i = Twb_slam * Tb_slam_c0 * T_cic0^-1.
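As a minimal illustration of this transform chain, the following numpy sketch composes the poses as 4x4 homogeneous matrices (an assumption of this sketch; the function names are illustrative, not from the patent):

```python
import numpy as np

def inv_se3(T):
    """Invert a 4x4 rigid-body transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def camera_pose_in_world(Twb_slam, Tb_slam_c0, T_cic0):
    """Twc_i = Twb_slam * Tb_slam_c0 * T_cic0^-1, as derived above."""
    return Twb_slam @ Tb_slam_c0 @ inv_se3(T_cic0)
```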
The main flow of the open-source or in-house visual+IMU tightly coupled SLAM comprises the following steps:
Extract key points and descriptors from the multi-view images.
Predict the pose of the current frame using the helmet IMU data and the last pose Twb_slam, so as to match the extracted key points across the multiple views and against the historical map points.
According to these matches, generate new map points through a triangulation algorithm; projecting the new map points together with the historical map points against the extracted key points yields pixel-level 2d reprojection errors, while a 15d error exists between the IMU integration and the actual relative pose. Twb_slam, vwb_slam, bg_slam and ba_slam are optimized with a nonlinear optimization method to reduce these errors.
This iterative loop of step 101 runs in a dedicated SLAM thread, and the Twc data are passed to the front-end thread used by the subsequent steps.
Step 102: and determining the camera pose of the multi-camera coordinate system under the world coordinate system according to the helmet inertial navigation data by a visual+IMU tightly coupled SLAM mode.
The step 102 specifically includes: calibrating internal parameters of the multi-camera and external parameters among the multi-camera; acquiring external parameters between a No. 0 camera and a helmet IMU sensor, delay of the helmet IMU sensor relative to the multi-camera and internal parameters of the helmet IMU sensor; and determining the camera pose of the multi-camera coordinate system under the world coordinate system through a SLAM mode of visual + IMU tight coupling according to the internal parameters of the multi-camera, the external parameters between the No. 0 camera and the helmet IMU sensor, the delay of the helmet IMU sensor relative to the multi-camera and the internal parameters of the helmet IMU sensor.
Step 103: and determining the pose of the handle under the world coordinate system by referring to the pose of the camera No. 0.
Step 104: constructing a system state quantity based on the handle pose, and carrying out IMU pre-integration according to the last system state quantity and the handle inertial navigation data to determine an initial predicted pose of the handle; the system state quantities include 3D vector position of the handle, 3D vector speed, gyroscope bias, and accelerometer bias.
The step 104 specifically includes: pre-integrating the gyroscope 3d data and the accelerometer 3d data of the handle IMU sensor, determining the rotation amount, the translation amount and the speed variation of the current moment relative to the previous moment, and updating the gyroscope deviation and the accelerometer deviation; an initial predicted pose of the handle is then determined based on the updated gyroscope bias and the accelerometer bias.
In practical application, IMU pre-integration is performed from the known last system state quantity and the handle IMU data to obtain the predicted current handle pose Twb0. The system state quantities include the 3d vector position twb of the handle and its orientation or rotation matrix Rwb (twb and Rwb are collectively denoted Twb), the 3d vector velocity vwb, the gyroscope bias bg, the accelerometer bias ba, the gravitational acceleration g, the transform Tbc, and the calibrated transform Tbh from the handle IMU coordinate system to the LED light-spot model coordinate system.
The last system state quantity is x_i = [Twb_i, vwb_i, bg_i, ba_i].
IMU pre-integration integrates the gyroscope 3d data gyr and the accelerometer 3d data acc of the IMU (without involving the last system state quantity, which is equivalent to integrating relative quantities in a non-inertial frame with gravity g and constant velocity vwb_i), yielding the rotation ΔR_ij(bg_i), translation Δp_ij(bg_i, ba_i) and velocity change Δv_ij(bg_i, ba_i) of the current time t_j relative to the last system time t_i. Here ΔR_ij(bg_i) is the rotation part of the pre-integration under the 3d gyroscope bias bg_i, Δp_ij(bg_i, ba_i) is the translation part under the 6d vector formed by the gyroscope bias bg_i and the 3d accelerometer bias ba_i, Δv_ij(bg_i, ba_i) is the velocity part under the same biases, g is the gravity vector, j is the index of the current time frame and i the index of the last system time frame. Because bg and ba at time t_i, i.e. [bg_i, ba_i], may be updated at the back end, with update amounts [δbg_i, δba_i], more accurate predicted relative quantities are obtained by first-order Taylor expansion about the biases: ΔR_ij ≈ ΔR_ij * Exp(J_ΔR_bg * δbg_i), Δp_ij ≈ Δp_ij + J_Δp_bg * δbg_i + J_Δp_ba * δba_i, and Δv_ij ≈ Δv_ij + J_Δv_bg * δbg_i + J_Δv_ba * δba_i, where each J is the coefficient or Jacobian matrix of the corresponding first-order term.
This yields the initial predicted pose of the current handle, Twb_0 = [Rwb_j, twb_j] = [Rwb_i * ΔR_ij, twb_i + Rwb_i * Δp_ij + vwb_i * Δt_ij + (g * Δt_ij^2)/2], where Rwb_j is the rotation of the handle IMU in the world coordinate system W at the current time t_j, twb_j the corresponding translation, Rwb_i the rotation of the handle IMU in W at the last system time t_i, twb_i the corresponding translation, and vwb_i the corresponding velocity. The current initial handle velocity is updated as vwb_0 = vwb_i + Rwb_i * Δv_ij + g * Δt_ij for the search and matching of the next step; meanwhile, these provide the initial values for the front-end tightly coupled BA graph optimization.
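The following is a minimal numpy sketch of this pre-integration and prediction. It assumes synchronized gyroscope/accelerometer samples at a fixed period dt and omits the bias-Jacobian correction terms and covariance propagation described above; the function names are illustrative, not from the patent:

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(w):
    """Rodrigues formula: rotation matrix from a rotation vector."""
    th = np.linalg.norm(w)
    if th < 1e-9:
        return np.eye(3) + skew(w)
    K = skew(w / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * (K @ K)

def preintegrate(gyr, acc, dt, bg, ba):
    """Accumulate dR_ij, dv_ij, dp_ij from raw IMU samples between t_i and t_j."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(gyr, acc):
        a_c = a - ba                               # bias-corrected accelerometer sample
        dp = dp + dv * dt + 0.5 * (dR @ a_c) * dt * dt
        dv = dv + (dR @ a_c) * dt
        dR = dR @ so3_exp((w - bg) * dt)           # bias-corrected gyro increment
    return dR, dv, dp

def predict_pose(Rwb_i, twb_i, vwb_i, dR, dv, dp, dt_ij, g):
    """Twb_0 and vwb_0 exactly as in the prediction formulas above."""
    Rwb_j = Rwb_i @ dR
    twb_j = twb_i + Rwb_i @ dp + vwb_i * dt_ij + 0.5 * g * dt_ij ** 2
    vwb_0 = vwb_i + Rwb_i @ dv + g * dt_ij
    return Rwb_j, twb_j, vwb_0
```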
Step 105: and extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates according to the initial predicted pose of the handle, and determining the initial value of the system state quantity.
The step 105 specifically includes: and extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates by utilizing an N-point perspective algorithm according to the initial predicted pose of the handle, and determining the initial value of the system state quantity.
Step 106: determining the current system state by adopting a tightly coupled BA graph optimization mode according to the initial value of the system state quantity; the current system state is a low-frequency handle state.
Step 107: and according to the current system state, enabling the handle to continuously output the 6DoF pose.
In practical application, twb according to the system state quantity 0 And Tb h Obtaining Twh 0 =Twb 0 *Tb h Extracting and matching the infrared LED light spot 2d coordinate uv to the 3d model coordinate P, mainly obtaining more accurate matching through an N-point perspective algorithm PNP, entering a front end tight coupling BA diagram for optimization, and obtaining a system state initial value x of ti at the moment i =[Twb i ,vwb i ,bg i ,ba i ]The front end is a lightweight thread. At this point, the low frequency handle state xi is output to the system, which performs smoothing filtering and prediction using the handle IMU data, and then renders the handle state to the handle state seen by the user in advance.
The front end loops repeatedly over the LED light-spot image of each handle frame, running exactly steps 103 to 105.
Next, judge whether the handle LED light-spot image is a key frame. The simplest criterion is to regard it as a key frame if more than 0.2 s has elapsed since the last key frame (a sketch of this test follows below). If it is a key frame, it is added to the back-end sliding-window BA graph optimization, the system state x_i at this time is updated, and the last system state used by the next frame is updated to this state.
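A sketch of this simplest key-frame criterion (the 0.2 s threshold is the one named in the text; the names are illustrative):

```python
KEYFRAME_INTERVAL_S = 0.2  # from the text: > 0.2 s since the last key frame

def is_keyframe(t_frame, t_last_keyframe):
    """Simplest key-frame test: enough time has elapsed since the last key frame."""
    return (t_frame - t_last_keyframe) > KEYFRAME_INTERVAL_S
```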
The back end runs the slower threads other than the front end, including but not limited to the BA optimization thread and the IMU initialization thread. The back-end IMU initialization thread initializes the IMU-related parameters; here initialization refers to the process of computing, by mathematical derivation under certain priors, initial values for the IMU biases bg and ba, the gravity g, the scale s of the light-spot 3d model coordinates, and the velocities vwb of the key frames, and finally providing these initial values, through BA graph optimization, to the front end for pose prediction.
The back-end threads repeatedly loop over this part, i.e. the sliding-window BA optimization and the IMU initialization, each time a new key frame is added, and loop only over the sliding-window BA optimization once all handles have completed IMU initialization.
The BA optimization makes the system state x_i at this time more accurate and smooth, especially the velocity v and the IMU biases bg and ba; the IMU initialization makes the IMU pre-integration and prediction of step 104 more accurate, so that the matches obtained after the PnP algorithm of step 105 are more accurate, mismatches are reduced, the low-frequency handle state output to the system is more accurate, and jitter is reduced.
The two optimizations work together to minimize handle tracking loss. Even when loss occasionally happens, if it lasts less than a certain time threshold, such as 1 s, it can essentially be masked by the predicted state mentioned above: the state output to the system at this time is directly the predicted result x_i0, and the last system state used for the next frame i+1 is then also this x_i0. If the time threshold is exceeded, the handle is judged lost, and a rotation-only 3DoF mode that uses just the gyroscope data of the IMU is entered.
The PnP algorithm is a method for solving pose from 3D-2D point correspondences; here it is used to eliminate unreasonable 2D-to-3D matches. The main procedure is as follows:
the 3d model coordinate Pk of the LED spot is hPk under the handle h model coordinate system, then there is pose Twh of the handle h model coordinate system 0 =Twb 0 *Tb h After that, the 3d coordinates wPk in the world coordinate system can be obtained.
Once there are several groups of matched 2d coordinates uv, a pose Twh_1 of the handle coordinate system h can be obtained by the P3P or EPnP method of the open-source algorithm library OpenCV and compared with the predicted pose Twh_0; if the error exceeds a certain threshold, that group of matches is discarded.
If the IMU-predicted pose is relatively accurate, it is initially assumed that match relationships that have been tracked continuously can be used directly by PnP; otherwise, wPk needs to be projected onto the image plane of camera c_i at the predicted 2d position uv_0, and if a light spot z can be found within a certain circle around uv_0, it is added to the groups of matches mentioned above.
After a timeout, or once all groups of matches have been checked, among the PnP results whose error does not exceed the threshold, the group of matches with the smallest fused error of PnP pose error and reprojection error under that matching is output, and is provided to the reprojection error term of the subsequent BA.
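A sketch of this validation step using OpenCV's solvePnP. It assumes the light-spot pixel coordinates have already been undistorted to an ideal pinhole model with intrinsics K (so distCoeffs is None) and that the comparison is done in the camera frame; the thresholds and names are illustrative, not the patent's values:

```python
import cv2
import numpy as np

def check_match_set(hP, uv, K, T_ch_pred, rot_thresh_rad=0.1, trans_thresh_m=0.05):
    """Solve PnP on one candidate match set and compare against the predicted pose.

    hP: Nx3 LED coordinates in the handle model frame; uv: Nx2 undistorted pixels;
    T_ch_pred: predicted model-to-camera transform, e.g. inv(Twc) @ Twh_0.
    Returns (accept, T_ch). EPnP needs at least 4 points; cv2.SOLVEPNP_P3P
    could be used instead for exactly 3 plus a disambiguating point.
    """
    ok, rvec, tvec = cv2.solvePnP(
        hP.astype(np.float64), uv.astype(np.float64), K, None,
        flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        return False, None
    R, _ = cv2.Rodrigues(rvec)
    T_ch = np.eye(4)
    T_ch[:3, :3], T_ch[:3, 3] = R, tvec.ravel()
    # rotation error: geodesic angle between the solved and predicted rotations
    dR = T_ch_pred[:3, :3].T @ R
    rot_err = np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0))
    trans_err = np.linalg.norm(T_ch[:3, 3] - T_ch_pred[:3, 3])
    return (rot_err < rot_thresh_rad and trans_err < trans_thresh_m), T_ch
```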
The BA graph optimization specifically refers to a nonlinear optimization method, implemented with, but not limited to, optimization libraries well known in the industry such as ceres and g2o; it is characterized by having to determine the system state quantities, a given error function e and its covariance matrix Cov, and by deriving the analytic Jacobian matrix J to accelerate the computation.
At the heart of one of the error functions e is the observation uv obtained by projecting the 3d model coordinates P onto the distorted image plane (which, for a fisheye camera, can be regarded as an image surface); with the projection chain described below, this reads e = uv - pi(Twc^-1 * Twb * Tbh * P), where pi() is the projection model of the camera and contains the corresponding distortion model.
N 2d vectors uv form the 2N-dimensional reprojection error vector uv_all, where N is the number of observed multi-view LED light-spot matches; uv_all is used mainly in the objective function of the BA optimization, whose essence is the Mahalanobis distance of the various error vectors. The main function of the uv errors is to optimize the handle pose Twb_i as well as possible, while the IMU error constrains the relative motion between x_{i-1} and x_i and reduces erroneous matches, so that the uv errors better reflect the error of Twb_i.
Specifically, the input of each projection function pi() in uv is formed by taking the 3D position P of a handle LED light spot in its model coordinate system H, transforming it into the handle IMU coordinate system B through the calibrated handle IMU extrinsic Tbh, then into the world coordinate system W through the variable Twb to be optimized, and finally into the corresponding camera coordinate system C through the Twc obtained earlier. The projection function pi() converts the 3D coordinate in C into a 2D coordinate on the camera image plane with fisheye distortion, which is then differenced with the LED 2D coordinate extracted by the front end to obtain the 2D error.
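A numpy sketch of this projection chain as a residual function; the camera projection pi (with its fisheye distortion) is passed in as a callable, since the patent does not spell out the distortion model, and the names are illustrative:

```python
import numpy as np

def to_h(p):
    """3d point -> homogeneous 4-vector."""
    return np.append(p, 1.0)

def reprojection_residual(uv_obs, hP, Twb, Tbh, Twc, pi):
    """2d error e = uv_obs - pi(cP), chaining H -> B -> W -> C as described."""
    wP = (Twb @ Tbh @ to_h(hP))[:3]           # model frame -> world frame
    cP = (np.linalg.inv(Twc) @ to_h(wP))[:3]  # world frame -> camera frame
    return uv_obs - pi(cP)                    # difference with the extracted LED 2d coords
```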
Tight coupling here specifically means that, in the process of optimizing the system state quantities, the error function contains both the visual error, produced jointly by the visually measured 2d coordinate positions uv and the system state quantities, and the IMU error, produced jointly by the IMU pre-integration and the system state quantities.
In the tightly coupled BA graph optimization, the core equation is the Levenberg-Marquardt (LM) equation for the 15d increment Δx of the state quantity x = [Twb, vwb, bg, ba]:
(H+λI)Δx * =-b
where λ is the damping factor of the LM method (a smaller λ brings LM closer to the Gauss-Newton (GN) method); H is the information matrix of the increment Δx, i.e. the inverse of the covariance matrix, an MxM square matrix; I is the identity matrix; and b is the total error term stacked from the observation errors of the different sensors, an Mx1 vector. H_ij denotes the NixNj block of row i, column j, where Ni is the dimension of the increment Δx_i of the i-th state quantity x_i, and b_i denotes the Nix1 block of row i. Here r(·) denotes the observation error of sensor (·), J_{r(·),xi} the Jacobian matrix of this error with respect to Δx_i, ρ(s) a robust kernel function, and ρ'(s) its first-order derivative with respect to the real number s. The optimization concretely amounts to solving a linear system of large dimension.
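A minimal numpy sketch of assembling the normal equations and solving the damped step above; the robust-kernel weight ρ'(s) is folded in as a scalar per error block, and the block bookkeeping of H_ij and b_i is flattened into dense matrices for brevity (names are illustrative):

```python
import numpy as np

def accumulate_block(H, b, J, r, Sigma_inv, rho_prime=1.0):
    """Add one error term's contribution to the normal equations in place.

    J: Jacobian of residual r w.r.t. the stacked increment Delta-x;
    Sigma_inv: information matrix (inverse covariance) of this observation;
    rho_prime: first derivative of the robust kernel at s = r^T Sigma_inv r.
    """
    H += rho_prime * (J.T @ Sigma_inv @ J)
    b += rho_prime * (J.T @ Sigma_inv @ r)

def lm_step(H, b, lam):
    """Solve (H + lam*I) dx = -b for the LM increment dx."""
    return np.linalg.solve(H + lam * np.eye(H.shape[0]), -b)
```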
Fig. 4 is a block diagram of a multi-sensor multi-handle controller diagram optimization tight coupling tracking system according to the present invention, as shown in fig. 4, a multi-sensor multi-handle controller diagram optimization tight coupling tracking system includes:
a tracking data acquisition module 401 for acquiring helmet tracking data and handle tracking data; the helmet tracking data comprise images shot by a plurality of cameras and helmet inertial navigation data; the handgrip tracking data includes images captured by a multi-view camera and handgrip inertial navigation data.
The camera pose determining module 402 is configured to determine, according to the helmet inertial navigation data, a camera pose of the multi-view camera coordinate system under a world coordinate system in a SLAM manner of tight coupling of vision and IMU.
The camera pose determining module 402 specifically includes: the calibration unit is used for calibrating the internal parameters of the multi-camera and the external parameters among the multi-camera; the parameter acquisition unit is used for acquiring external parameters between the No. 0 camera and the helmet IMU sensor, delay of the helmet IMU sensor relative to the multi-camera and internal parameters of the helmet IMU sensor; the camera pose determining unit is used for determining the camera pose of the multi-camera coordinate system under the world coordinate system through a SLAM mode of visual + IMU tight coupling according to the internal parameters of the multi-camera, the external parameters between the No. 0 camera and the helmet IMU sensor, the delay of the helmet IMU sensor relative to the multi-camera and the internal parameters of the helmet IMU sensor.
The handle pose determining module 403 is configured to determine a handle pose of the handle in the world coordinate system by referring to a camera pose of the camera No. 0.
The handle initial predicting pose determining module 404 is configured to construct a system state quantity based on the handle pose, and perform IMU pre-integration according to the last system state quantity and the handle inertial navigation data to determine a handle initial predicting pose; the system state quantities include 3D vector position of the handle, 3D vector speed, gyroscope bias, and accelerometer bias.
The handle initial predicted pose determining module 404 specifically includes: the updating unit, used for pre-integrating the gyroscope 3d data and the accelerometer 3d data of the handle IMU sensor, determining the rotation amount, the translation amount and the speed variation of the current moment relative to the previous moment, and updating the gyroscope deviation and the accelerometer deviation; and the handle initial predicted pose determining unit, used for determining the handle initial predicted pose according to the updated gyroscope deviation and the accelerometer deviation.
And the system state quantity initial value determining module 405 is configured to extract and match the 2D coordinates of the infrared light point to the 3D model coordinates according to the initial predicted pose of the handle, and determine a system state quantity initial value.
The system state quantity initial value determining module 405 specifically includes: and the system state quantity initial value determining unit is used for extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates by utilizing an N-point perspective algorithm according to the initial predicted pose of the handle, and determining the system state quantity initial value.
The system current state determining module 406 is configured to determine the current system state by using a tightly coupled BA graph optimization mode according to the initial value of the system state quantity; the current state of the system is a low-frequency handle state.
And the 6DoF pose output module 407 is configured to enable the handle to continuously output the 6DoF pose according to the current system state.
The scheme of optimizing the system states at multiple times for the multi-sensor multi-handle setup is not limited to the graph optimization mode; the state variables of a filter mode can also be extended, for example extending the 15+6n-dimensional state variable of MSCKF to 15n dimensions, which to a certain extent simultaneously achieves the multi-time system state optimization effect of this scheme.
The multi-threading scheme applied to the multi-sensor multi-handle setup is likewise not limited to the graph optimization mode; a similar effect can be achieved by adopting an ordinary filter scheme and the extended filter scheme above, respectively, as the optimization modes.
The projection-model nodes applied to the multi-sensor multi-handle setup can be extended to include camera intrinsics, namely the focal-length parameters fx and fy, the optical-center parameters cx and cy, the distortion parameters, and so on; similar effects can be achieved whether these parameters are held fixed or not.
The infrared LED light spots in the present invention, that is, the optical sensors, include but are not limited to infrared LED light spots with known 3d model coordinates P; visible-light markers with known 3d model coordinates P may also be used.
For multi-sensor multi-handle tracking, the invention uses multithreading, including but not limited to a front-end thread, a back-end BA optimization thread, and an IMU initialization thread belonging to the back end, so that even while the system states at multiple times are being optimized, or before the IMU initialization has produced output, the system state at the current time can still be optimized, and handle tracking can continuously and stably output the 6DoF pose.
For multi-sensor multi-handle tracking, the invention also uses an observation projection model on the distorted image plane with more nodes than the 3 of the conventional projection model, the nodes including but not limited to the handle pose Twb, the transformation Tbh from the model coordinate system to the handle coordinate system, the camera pose Twc, and the light-spot model P; the tracking of the handle at different image positions can therefore be very stable and accurate.
In the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, and identical and similar parts between the embodiments are all enough to refer to each other. For the system disclosed in the embodiment, since it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant points refer to the description of the method section.
The principles and embodiments of the present invention have been described herein with reference to specific examples, the description of which is intended only to assist in understanding the methods of the present invention and the core ideas thereof; also, it is within the scope of the present invention to be modified by those of ordinary skill in the art in light of the present teachings. In view of the foregoing, this description should not be construed as limiting the invention.

Claims (6)

1. A multi-sensor multi-handle controller graph optimization tightly coupled tracking method, comprising:
acquiring helmet tracking data and handle tracking data; the helmet tracking data comprise images shot by a plurality of cameras and helmet inertial navigation data; the handle tracking data comprise images shot by the multi-camera and handle inertial navigation data;
the camera pose of the multi-camera coordinate system under the world coordinate system is determined according to the helmet inertial navigation data by a visual+IMU tightly coupled SLAM mode, and the method specifically comprises the following steps:
calibrating internal parameters of the multi-camera and external parameters among the multi-camera;
acquiring external parameters between a No. 0 camera and a helmet IMU sensor, delay of the helmet IMU sensor relative to the multi-camera and internal parameters of the helmet IMU sensor;
determining the camera pose of the multi-camera coordinate system under a world coordinate system through a SLAM mode of visual + IMU tight coupling according to the internal parameters of the multi-camera, the external parameters between the No. 0 camera and the helmet IMU sensor, the delay of the helmet IMU sensor relative to the multi-camera and the internal parameters of the helmet IMU sensor;
determining the pose of the handle under the world coordinate system by referring to the pose of the camera No. 0;
constructing a system state quantity based on the handle pose, and carrying out IMU pre-integration according to the last system state quantity and the handle inertial navigation data to determine an initial predicted pose of the handle; the system state quantity comprises a 3D vector position of a handle, a 3D vector speed, a gyroscope deviation and an accelerometer deviation;
extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates according to the initial predicted pose of the handle, and determining a system state quantity initial value;
determining the current system state by adopting a tightly coupled BA graph optimization mode according to the initial value of the system state quantity; the current state of the system is a low-frequency handle state;
and according to the current system state, enabling the handle to continuously output the 6DoF pose.
2. The multi-sensor multi-handle controller graph optimization tightly coupled tracking method according to claim 1, wherein the constructing a system state quantity based on the handle pose and performing IMU pre-integration according to a previous system state quantity and the handle inertial navigation data to determine an initial predicted handle pose specifically comprises:
pre-integrating the gyroscope 3d data and the accelerometer 3d data of the handle IMU sensor, determining the rotation amount, the translation amount and the speed variation of the current moment relative to the previous moment, and updating the gyroscope deviation and the accelerometer deviation;
and determining an initial predicted pose of the handle according to the updated gyroscope deviation and the accelerometer deviation.
3. The multi-sensor multi-handle controller graph optimization tightly coupled tracking method according to claim 1, wherein the extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates according to the initial predicted pose of the handle and determining the initial value of the system state quantity specifically comprise:
and extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates by utilizing an N-point perspective algorithm according to the initial predicted pose of the handle, and determining the initial value of the system state quantity.
4. A multi-sensor multi-handle controller graph optimization tightly coupled tracking system, comprising:
the tracking data acquisition module is used for acquiring helmet tracking data and handle tracking data; the helmet tracking data comprise images shot by a plurality of cameras and helmet inertial navigation data; the handle tracking data comprise images shot by the multi-camera and handle inertial navigation data;
the camera pose determining module is used for determining the camera pose of the multi-camera coordinate system under the world coordinate system according to the helmet inertial navigation data in a SLAM mode of tight coupling of vision and IMU, and specifically comprises the following steps:
the calibration unit is used for calibrating the internal parameters of the multi-camera and the external parameters among the multi-camera;
the parameter acquisition unit is used for acquiring external parameters between the No. 0 camera and the helmet IMU sensor, delay of the helmet IMU sensor relative to the multi-camera and internal parameters of the helmet IMU sensor;
the camera pose determining unit is used for determining the camera pose of the multi-camera coordinate system under the world coordinate system through a SLAM mode of visual + IMU tight coupling according to the internal parameters of the multi-camera, the external parameters between the No. 0 camera and the helmet IMU sensor, the delay of the helmet IMU sensor relative to the multi-camera and the internal parameters of the helmet IMU sensor;
the handle pose determining module is used for determining the handle pose of the handle under the world coordinate system by referring to the camera pose of the No. 0 camera;
the handle initial predicting pose determining module is used for constructing a system state quantity based on the handle pose, and carrying out IMU pre-integration according to the last system state quantity and the handle inertial navigation data to determine the handle initial predicting pose; the system state quantity comprises a 3D vector position of a handle, a 3D vector speed, a gyroscope deviation and an accelerometer deviation;
the system state quantity initial value determining module is used for extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates according to the initial predicted pose of the handle to determine a system state quantity initial value;
the system current state determining module is used for determining the current system state by adopting a tightly coupled BA graph optimization mode according to the system state quantity initial value; the current state of the system is a low-frequency handle state;
and the 6DoF pose output module is used for enabling the handle to continuously output the 6DoF pose according to the current system state.
5. The multi-sensor multi-handle controller graph optimization tightly coupled tracking system of claim 4, wherein the handle initial predicted pose determining module specifically comprises:
the updating unit is used for pre-integrating the gyroscope 3d data and the accelerometer 3d data of the handle IMU sensor, determining the rotation amount, the translation amount and the speed variation of the current moment relative to the previous moment, and updating the gyroscope deviation and the accelerometer deviation;
and the handle initial predicted pose determining unit is used for determining the handle initial predicted pose according to the updated gyroscope deviation and the accelerometer deviation.
6. The multi-sensor multi-handle controller graph optimization tightly coupled tracking system of claim 4, wherein the system state quantity initial value determining module specifically comprises:
and the system state quantity initial value determining unit is used for extracting and matching the 2D coordinates of the infrared light points to the 3D model coordinates by utilizing an N-point perspective algorithm according to the initial predicted pose of the handle, and determining the system state quantity initial value.
CN202211036999.6A 2022-08-29 2022-08-29 Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system Active CN115311353B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211036999.6A CN115311353B (en) 2022-08-29 2022-08-29 Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211036999.6A CN115311353B (en) 2022-08-29 2022-08-29 Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system

Publications (2)

Publication Number Publication Date
CN115311353A CN115311353A (en) 2022-11-08
CN115311353B (en) 2023-10-10

Family

ID=83864073

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211036999.6A Active CN115311353B (en) 2022-08-29 2022-08-29 Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system

Country Status (1)

Country Link
CN (1) CN115311353B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118212258A (en) * 2022-12-16 2024-06-18 高通科技公司 Pose tracking method, pose tracking system, pose tracking equipment and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110880189A (en) * 2018-09-06 2020-03-13 舜宇光学(浙江)研究院有限公司 Combined calibration method and combined calibration device thereof and electronic equipment
CN111949123A (en) * 2020-07-01 2020-11-17 青岛小鸟看看科技有限公司 Hybrid tracking method and device for multi-sensor handle controller
CN111983639A (en) * 2020-08-25 2020-11-24 浙江光珀智能科技有限公司 Multi-sensor SLAM method based on Multi-Camera/Lidar/IMU
CN112085790A (en) * 2020-08-14 2020-12-15 香港理工大学深圳研究院 Point-line combined multi-camera visual SLAM method, equipment and storage medium
CN112179338A (en) * 2020-09-07 2021-01-05 西北工业大学 Low-altitude unmanned aerial vehicle self-positioning method based on vision and inertial navigation fusion
CN113838141A (en) * 2021-09-02 2021-12-24 中南大学 External parameter calibration method and system for single line laser radar and visible light camera
CN114170308A (en) * 2021-11-18 2022-03-11 上海鱼微阿科技有限公司 All-in-one machine pose true value calculating method and device, electronic equipment and storage medium
CN114295127A (en) * 2021-12-21 2022-04-08 上海鱼微阿科技有限公司 RONIN and 6DOF positioning fusion method and hardware system framework
CN114332423A (en) * 2021-12-30 2022-04-12 深圳创维新世界科技有限公司 Virtual reality handle tracking method, terminal and computer-readable storage medium
CN114935975A (en) * 2022-05-13 2022-08-23 歌尔股份有限公司 Multi-user interaction method for virtual reality, electronic equipment and readable storage medium
CN114943773A (en) * 2022-04-06 2022-08-26 阿里巴巴(中国)有限公司 Camera calibration method, device, equipment and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on pose estimation method based on fusion of IMU and monocular vision; Wang Chenxi; China Master's Theses Full-text Database, Information Science and Technology (No. 2); pp. 20-48 *

Also Published As

Publication number Publication date
CN115311353A (en) 2022-11-08

Similar Documents

Publication Publication Date Title
US11668571B2 (en) Simultaneous localization and mapping (SLAM) using dual event cameras
Qin et al. Vins-mono: A robust and versatile monocular visual-inertial state estimator
CN111156998B (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN110880189B (en) Combined calibration method and combined calibration device thereof and electronic equipment
US9071829B2 (en) Method and system for fusing data arising from image sensors and from motion or position sensors
Dong-Si et al. Estimator initialization in vision-aided inertial navigation with unknown camera-IMU calibration
CN110411476B (en) Calibration adaptation and evaluation method and system for visual inertial odometer
KR100855657B1 (en) System for estimating self-position of the mobile robot using monocular zoom-camara and method therefor
US12073630B2 (en) Moving object tracking method and apparatus
CN102914293A (en) Information processing apparatus and information processing method
Seiskari et al. HybVIO: Pushing the limits of real-time visual-inertial odometry
CN113516692B (en) SLAM method and device for multi-sensor fusion
KR102559203B1 (en) Method and apparatus of outputting pose information
CN111623773B (en) Target positioning method and device based on fisheye vision and inertial measurement
CN115311353B (en) Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system
Wang et al. LF-VIO: A visual-inertial-odometry framework for large field-of-view cameras with negative plane
Wang et al. A robust 6-D pose tracking approach by fusing a multi-camera tracking device and an AHRS module
Ling et al. RGB-D inertial odometry for indoor robot via keyframe-based nonlinear optimization
Yin et al. Stereo visual-inertial odometry with online initialization and extrinsic self-calibration
Yang et al. Recursive depth parametrization of monocular visual navigation: Observability analysis and performance evaluation
Lee et al. Scale-aware visual-inertial depth estimation and odometry using monocular self-supervised learning
Presnov et al. Robust range camera pose estimation for mobile online scene reconstruction
Chen et al. RELEAD: Resilient Localization with Enhanced LiDAR Odometry in Adverse Environments
Huai et al. Markov parallel tracking and mapping for probabilistic slam
Gui et al. Robust direct visual inertial odometry via entropy-based relative pose estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 501, Building 3, No. 1 Jiusong Road, Xinqiao Town, Songjiang District, Shanghai, 2016

Applicant after: Play Out Dreams (Shanghai) Technology Co.,Ltd.

Address before: 201600 Room 501, Building 3, No. 1 Caosung Road, Xinqiao Town, Songjiang District, Shanghai

Applicant before: Shanghai yuweia Technology Co.,Ltd.

GR01 Patent grant