WO2023142353A1 - Pose prediction method and apparatus - Google Patents


Info

Publication number
WO2023142353A1
Authority
WO
WIPO (PCT)
Prior art keywords
target device
predicted
pose
target
visual information
Prior art date
Application number
PCT/CN2022/100638
Other languages
English (en)
Chinese (zh)
Inventor
陈星鑫
庞敏健
万培佩
刘贤焯
Original Assignee
奥比中光科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202210093752.1A (CN114593735B)
Application filed by 奥比中光科技集团股份有限公司
Publication of WO2023142353A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C21/343 Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"

Definitions

  • the present application relates to the technical field of positioning, and in particular to a pose prediction method and device.
  • SLAM (Simultaneous Localization and Mapping) is a technology in which a device uses its own sensors to perceive the environment, computes its own pose in real time, and builds an incremental map. It requires no modification of the external environment, and its positioning accuracy can reach the centimeter level. Its application fields include AR/VR, robots, unmanned driving, drones, and so on.
  • the positioning of existing SLAM systems depends heavily on visual information. If the environment texture is weak, or the device moves quickly within a certain period of time, the images collected by the SLAM system will exhibit weak texture, occlusion, or blur, which disrupts the normal operation of the SLAM system. In particular, when no visual information is collected, the SLAM system cannot estimate the pose of the device, so it stops working and cannot be restarted until visual information is collected again. There is therefore an urgent need for a technical solution to the problem that the SLAM system cannot work and cannot be restarted when no visual information is collected.
  • the embodiments of the present application provide a pose prediction method, device, computer equipment, and computer-readable storage medium to solve the prior-art problem that, when no visual information is collected, the SLAM system cannot estimate the pose of the device and therefore cannot work and cannot be restarted.
  • the first aspect of the embodiment of the present application provides a pose prediction method, the method comprising:
  • the movement speed parameter includes the angular velocity and linear acceleration of the target device, and the previous moment is a historical moment before the target device loses visual information;
  • the second aspect of the embodiment of the present application provides a device for pose prediction, the device comprising:
  • a parameter acquisition module configured to acquire a movement speed parameter of the target device at a previous moment; wherein the movement speed parameter includes the angular velocity and linear acceleration of the target device, and the previous moment is a historical moment before the target device loses visual information;
  • an increment determination module configured to estimate the predicted displacement increment of the target device at each current moment by using the motion speed parameter and a preset displacement prediction model; wherein the current moment is the latest moment at which the target device loses visual information and every moment after the visual information is lost;
  • a pose and trajectory prediction module configured to calculate a predicted pose of the target device corresponding to each current moment according to the predicted displacement increment of the target device at each current moment, and construct a predicted motion trajectory of the target device;
  • An optimization module configured to optimize the predicted pose and the predicted motion trajectory of the target device, and acquire target poses and corresponding target motion trajectories of the target device at each current moment.
  • the third aspect of the embodiments of the present application provides a pose prediction system, including an inertial navigation sensor and a terminal device, wherein the inertial navigation sensor is used to measure the movement speed parameter of the terminal device at the previous moment, and the terminal device is used to implement the steps of the above method.
  • a fourth aspect of the embodiments of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and operable on the processor, and the processor implements the steps of the above method when executing the computer program.
  • a fifth aspect of the embodiments of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the above method are implemented.
  • the present application has the beneficial effect that the pose of the target device can be estimated using only the motion velocity parameters of the target device, without its visual information; even when no visual information of the target device is available, the pose can still be estimated, so that the SLAM system can continue working with the estimated pose, thereby ensuring the robustness of the SLAM system.
  • Fig. 1 is a schematic diagram of a pose prediction system provided by an embodiment of the present application;
  • Fig. 2 is a flow chart of a pose prediction method provided by an embodiment of the present application;
  • Fig. 3 is a schematic diagram of the network architecture of a displacement prediction model provided by an embodiment of the present application;
  • Fig. 4 is a schematic diagram of an optimized pose graph provided by an embodiment of the present application;
  • Fig. 5 is a block diagram of a pose prediction device provided by an embodiment of the present application;
  • Fig. 6 is a schematic diagram of a computer device provided by an embodiment of the present application.
  • the present invention provides a pose prediction method, which obtains the motion speed parameters of the target device at the previous moment; uses the motion speed parameters and a preset displacement prediction model to estimate the predicted displacement increment of the target device at each current moment; calculates from the predicted displacement increment the predicted pose of the target device at each current moment and constructs the predicted motion trajectory; and optimizes the predicted pose and predicted motion trajectory at each current moment to obtain the target pose and corresponding target motion trajectory of the target device at each current moment.
  • Fig. 1 is a schematic diagram of the pose prediction system provided in this embodiment, which includes an inertial measurement unit (IMU) 1 and a terminal device 2. In one implementation, the terminal device 2 can be a server, or any of various electronic devices that support data processing, including but not limited to smartphones, robot vacuums, tablets, laptops, desktop computers, and so on.
  • the IMU 1 can be mounted on a target device, where the target device can be any device that needs to be positioned, for example an AR/VR device, a robot, a vehicle, and the like.
  • the IMU 1 can measure the movement speed parameter of the target device at the previous moment, and send the movement speed parameter to the terminal device 2.
  • after the terminal device 2 obtains the movement speed parameter of the target device at the previous moment, it can obtain the predicted displacement increment of the target device according to the movement speed parameter and the preset displacement prediction model; from the predicted displacement increment, the terminal device 2 can determine the predicted pose and predicted motion trajectory of the target device; the terminal device 2 can then optimize the predicted pose and predicted motion trajectory to obtain the target pose and target motion trajectory of the target device.
  • Fig. 2 is a flow chart of a pose prediction method provided by an embodiment of the present application.
  • the pose prediction method in Fig. 2 may be executed by the terminal device 2 in Fig. 1.
  • the pose prediction method includes:
  • S201 Obtain a movement speed parameter of the target device at a previous moment; the movement speed parameter includes the angular velocity and the linear acceleration of the target device.
  • the target device can be understood as a device that needs to predict a pose, for example, the target device can be an AR/VR device, a mobile phone, an autonomously moving robot, or an unmanned vehicle.
  • the target device may be equipped with a SLAM (Simultaneous Localization and Mapping) system, wherein the SLAM system includes an image acquisition device (such as a camera) and an IMU.
  • the image acquisition device may collect the visual information of the target device every preset time period, for example, the image acquisition device may collect the visual information of the target device every 30 ms.
  • the visual information of the target device may be understood as image frames around the target device, for example, image frames of the front side, rear side, left side, and right side of the target device.
  • the IMU can measure the motion speed parameters of the target device at preset intervals, for example, the IMU can measure the motion speed parameters of the target device at a frequency of 100 Hz.
  • the moving speed parameter of the target device can be understood as the speed data of the target device during the moving process.
  • the movement speed parameter may include the angular velocity and linear acceleration in the device coordinate system of the target device, that is, the angular velocity and linear acceleration in the body coordinate system (ie, body frame) of the target device.
  • an accelerometer can be used to detect the respective linear accelerations of the target device on the x-axis, y-axis, and z-axis of the device coordinate system, and a gyroscope detects the respective angular velocities of the target device on the x-axis, y-axis, and z-axis of the device coordinate system; it can be understood that the movement speed parameters can include the respective angular velocity and linear acceleration of the target device on the x-axis, y-axis, and z-axis of the device coordinate system.
  • the IMU may also be a 9-axis IMU, that is, include a magnetometer, and the IMU is not specifically limited in this embodiment.
  • this embodiment uses the IMU to obtain the motion velocity parameters of the target device at the previous moment to predict the pose of the target device at the current moment. It should be noted that the previous moment is a historical moment before the target device loses the visual information.
  • the method may further include the following steps: detecting whether visual information of the target device is collected; and, if the visual information of the target device is not detected, performing the acquisition of the movement speed parameter of the target device at the previous moment.
  • if the visual information of the target device is not detected, the visual information of the target device is lost, so the visual information cannot be used to estimate the pose of the target device; at this time, the movement speed parameter of the target device at the previous moment needs to be obtained, so that the movement speed parameter can be used to estimate the pose of the target device.
  • the target device is equipped with a SLAM system
  • in the main function of the SLAM tracking thread, when the system state of the SLAM system is normal, the SLAM system is in the normal tracking state. When the visual information is lost, the SLAM system first enters a temporarily-lost state, in which the motion velocity parameters collected by the IMU can be integrated to obtain a recursive pose; however, since the reliability of the recursive pose obtained through integration is not high, if the visual information has still not recovered after the loss exceeds a preset duration (for example, 1 s), the SLAM system enters the tracking-lost state. At this point, the step of obtaining the motion speed parameter of the target device at the previous moment needs to be executed.
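The state logic described above can be sketched as a small state machine. The state names and the function below are illustrative assumptions, not the patent's exact identifiers; only the 1 s threshold and the transitions follow the description:

```python
# Illustrative sketch of the tracking-state logic: normal tracking ->
# temporarily lost (IMU integration) -> tracking lost (displacement
# prediction model takes over).

NORMAL, RECENTLY_LOST, LOST = "normal", "recently_lost", "lost"

def next_state(visual_ok, lost_duration, threshold=1.0):
    """Return the tracking state for one step of the SLAM tracking thread."""
    if visual_ok:
        return NORMAL          # visual information available: track normally
    if lost_duration <= threshold:
        return RECENTLY_LOST   # integrate IMU for a low-reliability recursive pose
    return LOST                # acquire motion speed parameters and predict the pose

s1 = next_state(visual_ok=False, lost_duration=0.3)   # IMU recursion phase
s2 = next_state(visual_ok=False, lost_duration=1.5)   # model-based prediction phase
```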
  • S202 Estimate the predicted displacement increment of the target device at each current moment by using the motion speed parameter and the preset displacement prediction model.
  • the current moment is the latest moment at which the target device loses the visual information and every moment after the visual information is lost.
  • the preset displacement prediction model can be used to determine, according to the movement speed parameters, the predicted displacement increment of the target device at each subsequent current moment, where the predicted displacement increment at a current moment can be understood as the predicted variation of displacement at each moment during the time when the target device has lost visual information.
  • the target rotation matrix may be determined according to the angular velocity of the target device in the device coordinate system at the previous moment: the angular velocity in the device coordinate system of the target device is integrated to obtain the target rotation matrix, which can be understood as the rotation matrix (that is, the rotation Rwb) that converts the motion velocity parameter from the device coordinate system (body frame) into the preset world coordinate system (world frame) of the SLAM system.
  • the target rotation matrix is used to rotate the angular velocity and linear acceleration in the device coordinate system to obtain the angular velocity and linear acceleration in the world coordinate system, and the angular velocity and linear acceleration in the world coordinate system are input into the preset displacement prediction model to obtain the predicted displacement increments of the target device at each current moment.
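As a minimal sketch of this rotation step: a body-frame measurement is mapped into the world frame by multiplying with Rwb. The matrix below is a hypothetical 90-degree yaw chosen for illustration, not a value from the patent:

```python
# Rotate a body-frame IMU measurement (angular velocity or linear
# acceleration) into the world frame: v_world = R_wb @ v_body.

def rotate(R_wb, v_body):
    """Apply a 3x3 rotation matrix to a 3-vector (plain-Python matmul)."""
    return [sum(R_wb[r][c] * v_body[c] for c in range(3)) for r in range(3)]

# Hypothetical Rwb: a 90-degree rotation about the z-axis.
R_wb = [[0.0, -1.0, 0.0],
        [1.0,  0.0, 0.0],
        [0.0,  0.0, 1.0]]

acc_body = [1.0, 0.0, 9.81]          # linear acceleration in the body frame
acc_world = rotate(R_wb, acc_body)   # -> [0.0, 1.0, 9.81]
```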
  • the input of the preset displacement prediction model can be the angular velocity and linear acceleration along the x-axis, y-axis, and z-axis of the world coordinate system over a fixed time window (such as 1 s). It should be noted that, assuming a 1 s window, since the angular velocity and linear acceleration at one moment form 6-dimensional data, an acquisition frequency of 250 Hz yields 250 samples of angular velocity and linear acceleration per second; it is understandable that the input tensor size of the preset displacement prediction model is batch_size × 6 × 250.
  • the output of the preset displacement prediction model is the predicted displacement increment of the x-axis, y-axis, and z-axis of the target device in the world coordinate system, that is, within a period of time, the x-axis and y-axis of the target device in the world coordinate system , The displacement increment on the z axis.
  • the preset displacement prediction model includes a plurality of cascaded convolutional layers and an output layer, as shown in Fig. 3: the convolutional layers are connected in sequence, and the output layer consists of a global average pooling layer (AvgPool1d) and a convolutional layer (Conv1d) in series, which preserves the spatial structure of the network, greatly reduces the number of model parameters, shortens inference time, and prevents overfitting. The displacement prediction model also includes BN and ReLU layers, which are omitted in Fig. 3; in Fig. 3, c represents the number of channels and k the size of the convolution kernel.
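A minimal PyTorch sketch consistent with this description is shown below. The number of convolutional layers, the channel width, and the kernel size are illustrative assumptions; only the overall shape (cascaded 1-D convolutions with BN/ReLU, then global average pooling followed by a 1×1 convolution producing the 3-axis displacement increment) follows the architecture described above:

```python
import torch
import torch.nn as nn

class DisplacementNet(nn.Module):
    """Sketch of the displacement prediction model: input (batch, 6, 250)
    world-frame angular velocity + linear acceleration, output (batch, 3)
    predicted displacement increment on the x-, y-, z-axes."""

    def __init__(self, in_channels=6, hidden=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=7, padding=3),
            nn.BatchNorm1d(hidden),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=7, padding=3),
            nn.BatchNorm1d(hidden),
            nn.ReLU(),
        )
        # Output head: global average pooling + 1x1 convolution, which keeps
        # the spatial structure while keeping the parameter count small.
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.head = nn.Conv1d(hidden, 3, kernel_size=1)

    def forward(self, x):                 # x: (batch, 6, 250)
        x = self.features(x)              # (batch, hidden, 250)
        x = self.pool(x)                  # (batch, hidden, 1)
        return self.head(x).squeeze(-1)   # (batch, 3)

net = DisplacementNet()
out = net(torch.randn(4, 6, 250))
```

As the description notes, the backbone could equally be a 1-D ResNet-18, TCN, or LSTM; the head would stay the same.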
  • the model can be supervised by using a loss function, where the loss function is preferably the mean-square error (Mean-Square Error, MSE) loss function (MSE LOSS), L_mse denotes the MSE loss, and i ∈ [0, n] represents the i-th moment.
  • the network structure of the displacement prediction model can also be a one-dimensional form of resnet18, TCN, LSTM and other neural network structures, which are not limited here.
  • S203 Calculate the predicted pose of the target device corresponding to each current moment according to the predicted displacement increment of the target device at each current moment, and construct a predicted motion trajectory of the target device.
  • the predicted displacement increment at each current moment and the pose information of the target device at the previous moment can be used to obtain the predicted pose information of the target device at each current moment, and then all of the predicted pose information at the current moments can be used to construct the predicted motion trajectory of the target device.
  • the predicted displacement increment of the target device includes a predicted displacement increment on the X axis, a predicted displacement increment on the Y axis, and a predicted displacement increment on the Z axis.
  • the predicted pose of the target device at the current moment on the X-axis, the Y-axis, and the Z-axis is determined according to the predicted displacement increment on the X-axis, the predicted displacement increment on the Y-axis, the predicted displacement increment on the Z-axis, and the pose information of the target device at the previous moment. For example, the coordinates of the target device at the previous moment on the x-axis, y-axis, and z-axis are added to the predicted displacement increments on the X-axis, Y-axis, and Z-axis to obtain the predicted pose of the target device at the current moment on the X-axis, Y-axis, and Z-axis (that is, the predicted coordinate values). In this way, the predicted pose information of the target device at each current moment after losing visual information can be determined: the predicted pose information at the current moment is then regarded as the pose information of the previous moment, and the above calculation is repeated to predict the pose at the next current moment, so as to obtain the predicted pose at each current moment during the period when the target device has lost visual information.
  • the predicted motion trajectory of the target device is constructed from the predicted pose information at each current moment, as shown by the solid line in Fig. 4.
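The recurrence above can be sketched in a few lines: each predicted pose becomes the "previous moment" pose for the next step, and the accumulated poses form the predicted trajectory (the numeric values below are illustrative):

```python
def predict_trajectory(start_pose, increments):
    """Accumulate predicted displacement increments (dx, dy, dz) onto the
    last known pose; each predicted pose is reused as the previous-moment
    pose for the next step, yielding the predicted motion trajectory."""
    trajectory = [start_pose]
    x, y, z = start_pose
    for dx, dy, dz in increments:
        x, y, z = x + dx, y + dy, z + dz
        trajectory.append((x, y, z))
    return trajectory

# Last pose before visual information was lost, plus three predicted increments.
traj = predict_trajectory((1.0, 2.0, 0.0),
                          [(0.1, 0.0, 0.0), (0.1, 0.1, 0.0), (0.0, 0.1, 0.0)])
```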
  • this application can also perform pose estimation through a zero-velocity update combined with an extended Kalman filter (EKF), step counting, and other methods, which are not limited here.
  • since the motion velocity parameters of the target device can only be used to estimate a predicted pose, in order to ensure the accuracy of the determined running trajectory of the target device it is necessary to continuously check whether visual information of the target device has been collected, so that the predicted motion trajectory of the target device can be corrected according to the collected visual information.
  • step S204 also includes:
  • the predicted pose of the target device is optimized according to the image frame in the visual information to obtain the target pose.
  • the relocation condition may be that the target device can be positioned according to the visual information to determine its target pose. That is, after detecting the visual information, it is judged whether an image frame in the visual information matches successfully with an image frame in the global map or local map pre-established by the SLAM system. If the image frame in the visual information is successfully matched with an image frame in the pre-established global or local map, the visual information satisfies the relocation condition, so the target device can be located according to the visual information and its target pose determined; if the image frame in the visual information cannot match any image frame in the pre-established global or local map, the visual information does not satisfy the relocation condition, so the target device cannot be located from the visual information to determine the target pose.
  • a pose optimization may be performed on the predicted pose of the target device according to the image frames in the visual information to obtain a target pose.
  • the target pose corresponding to the target device may be determined first according to the image frame in the visual information.
  • the image frame that matches the image frame in the visual information can be determined in the global map or local map pre-established by the SLAM system, where the similarity between the image frame in the visual information and the matched image frame is greater than a preset threshold, or the matched frame is the one among all image frames in the map with the greatest similarity to the image frame in the visual information; then, according to the image frame in the visual information and the matched image frame in the map, the ideal pose of the target device is determined.
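A sketch of the matching rule just described is given below; the similarity scores and the 0.75 threshold are illustrative assumptions (in practice the similarity would come from a place-recognition method such as bag-of-words):

```python
def find_matching_frame(similarities, threshold=0.75):
    """Given similarity scores between the current image frame and each map
    keyframe, return the index of the most similar keyframe if it clears the
    preset threshold, else None (relocation condition not satisfied)."""
    best = max(range(len(similarities)), key=lambda i: similarities[i])
    return best if similarities[best] > threshold else None

match = find_matching_frame([0.2, 0.9, 0.4])   # relocalization succeeds
miss  = find_matching_frame([0.2, 0.3, 0.1])   # no keyframe clears the threshold
```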
  • the predicted pose and predicted motion trajectory before the visual information is not detected are corrected and optimized according to the ideal pose, and the target pose and target motion trajectory of the target device are obtained.
  • the predicted pose determined while no visual information was detected can be obtained first, that is, the predicted pose of the target device determined using the preset displacement prediction model; then, according to the ideal pose, the predicted pose and its corresponding predicted motion trajectory are optimized to obtain the target pose and target motion trajectory of the target device.
  • the predicted pose is used as the optimization variable, and the optimization minimizes the error of the relative-motion estimate between the predicted pose and the ideal pose over the predicted motion trajectory corresponding to the predicted pose of the target device. In the optimization model, T_i is the predicted pose of the target device at the i-th moment, min() represents minimization over the poses T_i, ΔT_m is the observed value of the relative pose between the predicted pose and the ideal pose, r() is the residual function representing the residual between the relative pose of the predicted pose to be optimized and the ideal pose and its observed value, and the residuals are weighted by the covariance of the residual. The optimization model is then solved to obtain the target motion trajectory, where the start point and end point of the visual-information-loss stage in the target motion trajectory are, respectively, the pose detected before the loss of visual information and the ideal pose of the target device.
  • after the above optimization, the error of the predicted motion trajectory corresponding to the predicted pose of the target device is reduced, and the head and tail of the trajectory in the visual-information-loss stage of the optimized target motion trajectory are aligned with the trajectory normally estimated by the SLAM system from the collected visual information. For example, as shown in Fig. 4, the solid line segment in the visual-information-loss stage is the predicted motion trajectory corresponding to the predicted pose, and the dashed line segment is the optimized target motion trajectory.
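The effect of this optimization, pinning the head of the lost-stage trajectory to the last tracked pose and its tail to the ideal (relocalized) pose, can be illustrated with a deliberately simplified linear error distribution. This is an illustrative stand-in for the covariance-weighted pose-graph solve, not the patent's actual solver:

```python
def align_segment(predicted, ideal_end):
    """Distribute the endpoint error of the predicted lost-stage trajectory
    linearly over the segment, so that its start stays fixed and its end
    coincides with the ideal (relocalized) pose."""
    n = len(predicted) - 1
    err = [e - p for e, p in zip(ideal_end, predicted[-1])]
    return [tuple(p[k] + err[k] * i / n for k in range(3))
            for i, p in enumerate(predicted)]

# Predicted lost-stage trajectory (illustrative) and the ideal end pose.
predicted = [(0.0, 0.0, 0.0), (1.0, 0.1, 0.0), (2.0, 0.3, 0.0)]
corrected = align_segment(predicted, ideal_end=(2.0, 0.0, 0.0))
```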
  • if the visual information of the target device is detected but the visual information does not satisfy the relocation condition, then the acquisition of the movement speed parameter of the target device at the previous moment continues to be performed, until visual information of the target device is detected that satisfies the relocation condition.
  • a preset time period such as 20s
  • Fig. 5 is a schematic diagram of a pose prediction device provided by an embodiment of the present application. As shown in Figure 5, the pose prediction device includes:
  • the parameter acquisition module 501 is configured to acquire the movement speed parameter of the target device at a previous moment; wherein, the movement speed parameter includes the angular velocity and linear acceleration of the target device, and the previous moment is a historical moment before the target device loses visual information;
  • the incremental calculation module 502 is used to estimate the predicted displacement increment of the target device at each current moment by using the motion speed parameter and the preset displacement prediction model; wherein the current moment is the latest moment at which the target device loses visual information and every moment after the visual information is lost;
  • the pose and trajectory prediction module 503 is used to calculate the predicted pose of the target device corresponding to each current moment according to the predicted displacement increment of each current moment of the target device, and construct the predicted motion trajectory of the target device;
  • the optimization module 504 is configured to optimize the predicted pose and predicted motion trajectory of the target device, and obtain the target pose and corresponding target motion trajectory of the target device at each current moment.
  • the incremental calculation module 502 is specifically used for:
  • the predicted displacement increment of the target device includes a predicted displacement increment on the X axis, a predicted displacement increment on the Y axis, and a predicted displacement increment on the Z axis;
  • the pose and trajectory prediction module 503 is specifically used for: determining the predicted position of the target device on the X-axis, the Y-axis, and the Z-axis according to the predicted displacement increments on the X-axis, the Y-axis, and the Z-axis; and determining the predicted pose of the target device at each current moment according to these predicted positions.
  • the device also includes a detection module; the detection module is used for:
  • optimization module specifically for:
  • the predicted pose and predicted motion trajectory of the target device are optimized according to the image frames in the visual information to obtain the target pose and target motion trajectory.
  • optimization module specifically for:
  • the predicted pose of the target device is optimized to obtain the target pose.
  • optimization module is also used to:
  • optimization module is also used to:
  • FIG. 6 is a schematic diagram of a terminal device 6 provided by an embodiment of the present application.
  • the terminal device 6 includes: a processor 601 , a memory 602 and a computer program 603 stored in the memory 602 and capable of running on the processor 601 .
  • when the processor 601 executes the computer program 603, the steps in the foregoing method embodiments are implemented.
  • when the processor 601 executes the computer program 603, the functions of the modules/units in the foregoing device embodiments are realized.
  • the computer program 603 can be divided into one or more modules/units, and the one or more modules/units are stored in the memory 602 and executed by the processor 601 to complete the present application.
  • the one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution process of the computer program 603 in the terminal device 6.
  • the terminal device 6 may include, but not limited to, a processor 601 and a memory 602 .
  • Fig. 6 is only an example of the terminal device 6 and does not constitute a limitation on the terminal device 6; it may include more or fewer components than shown, or combine certain components, or use different components. For example, the terminal device may also include input and output devices, network access devices, a bus, and so on.
  • the processor 601 can be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • CPU Central Processing Unit
  • DSP Digital Signal Processor
  • ASIC Application Specific Integrated Circuit
  • FPGA Field-Programmable Gate Array
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • The memory 602 may be an internal storage module of the terminal device 6, for example, a hard disk or memory of the terminal device 6.
  • The memory 602 may also be an external storage device of the terminal device 6, for example, a plug-in hard disk equipped on the terminal device 6, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, a flash card (Flash Card), etc. Further, the memory 602 may also include both an internal storage module of the terminal device 6 and an external storage device.
  • The memory 602 is used to store the computer program and other programs and data required by the terminal device.
  • the memory 602 can also be used to temporarily store data that has been output or will be output.
  • modules and algorithm steps of the examples described in conjunction with the embodiments disclosed herein can be implemented by electronic hardware, or a combination of computer software and electronic hardware. Whether these functions are executed by hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the scope of the present application.
  • the disclosed apparatus/computer equipment and methods can be implemented in other ways.
  • The device/computer device embodiments described above are only illustrative. For example, the division of modules is only a logical function division, and there may be other division methods in actual implementation; multiple modules or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • The mutual coupling, direct coupling or communication connection shown or discussed may be implemented through some interfaces; the indirect coupling or communication connection between devices or modules may be in electrical, mechanical or other forms.
  • A module described as a separate component may or may not be physically separated, and a component shown as a module may or may not be a physical module; that is, it may be located in one place or distributed to multiple network modules. Part or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional module in each embodiment of the present application may be integrated into one processing module, each module may exist separately physically, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules.
  • If the integrated modules are implemented in the form of software functional modules and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of the present application may also be completed by instructing relevant hardware through a computer program.
  • The computer program can be stored in a computer-readable storage medium, and when the computer program is executed by the processor, the steps in the above-mentioned method embodiments can be realized.
  • The computer program may include computer program code, which may be in source code form, object code form, an executable file, or some intermediate form, or the like.
  • The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, etc. It should be noted that the content contained in a computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, computer-readable media may not include electrical carrier signals and telecommunication signals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Automation & Control Theory (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Development Economics (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

A pose prediction method and apparatus. When a target device loses visual information, the pose of the target device is estimated using only a motion speed parameter of the target device. Even if the visual information of the target device is not collected, the pose of the target device can still be estimated, so that the estimated pose can be used to keep a SLAM system working and operating, thereby ensuring the robustness of the SLAM system.
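To illustrate the idea summarized in the abstract — extrapolating a device's pose from its last known motion speed parameters when visual information is lost — the following is a minimal constant-velocity sketch, not the patented method itself. The function name, the use of NumPy, and the axis-angle integration of angular velocity are all assumptions made for illustration.

```python
import numpy as np

def predict_pose(last_position, last_orientation, linear_velocity, angular_velocity, dt):
    """Extrapolate a device pose from its last known motion speed parameters.

    Hypothetical constant-velocity sketch: position advances along the linear
    velocity, and orientation integrates the angular velocity over dt.
    """
    # Predicted position: simple linear extrapolation.
    position = last_position + linear_velocity * dt

    # Predicted orientation: compose the last rotation with the incremental
    # rotation implied by the angular velocity (Rodrigues' rotation formula).
    theta = np.linalg.norm(angular_velocity) * dt
    if theta < 1e-12:
        delta_r = np.eye(3)  # negligible rotation over this interval
    else:
        axis = angular_velocity / np.linalg.norm(angular_velocity)
        # Skew-symmetric cross-product matrix of the rotation axis.
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        delta_r = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    orientation = last_orientation @ delta_r
    return position, orientation
```

In a SLAM pipeline, such a predicted pose could stand in for the visual-tracking result during the frames where no visual information is available, so downstream mapping and tracking can keep running until visual localization recovers.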
PCT/CN2022/100638 2022-01-26 2022-06-23 Pose prediction method and apparatus WO2023142353A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210093752.1A CN114593735B (zh) 2022-01-26 Pose prediction method and device
CN202210093752.1 2022-01-26

Publications (1)

Publication Number Publication Date
WO2023142353A1 true WO2023142353A1 (fr) 2023-08-03

Family

ID=81805920

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/100638 WO2023142353A1 (fr) 2022-06-23 Pose prediction method and apparatus

Country Status (1)

Country Link
WO (1) WO2023142353A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110850455A (zh) * 2019-10-18 2020-02-28 浙江天尚元科技有限公司 一种基于差分gps和车辆运动学模型的轨迹录制方法
CN111161337A (zh) * 2019-12-18 2020-05-15 南京理工大学 一种动态环境下的陪护机器人同步定位与构图方法
US20210142488A1 (en) * 2019-11-12 2021-05-13 Naver Labs Corporation Method and system for tracking trajectory based on visual localization and odometry
CN113031436A (zh) * 2021-02-25 2021-06-25 西安建筑科技大学 一种基于事件触发的移动机器人模型预测轨迹跟踪控制系统及方法
CN113483755A (zh) * 2021-07-09 2021-10-08 北京易航远智科技有限公司 一种基于非全局一致地图的多传感器组合定位方法及系统
CN114593735A (zh) * 2022-01-26 2022-06-07 奥比中光科技集团股份有限公司 一种位姿预测方法及装置


Also Published As

Publication number Publication date
CN114593735A (zh) 2022-06-07

Similar Documents

Publication Publication Date Title
CN107687850B (zh) 一种基于视觉和惯性测量单元的无人飞行器位姿估计方法
CN112734852B (zh) 一种机器人建图方法、装置及计算设备
CN110490900B (zh) 动态环境下的双目视觉定位方法及系统
US20190195631A1 (en) Positioning method, positioning device, and robot
CN112179330B (zh) 移动设备的位姿确定方法及装置
CN111209978B (zh) 三维视觉重定位方法、装置及计算设备、存储介质
US20210183100A1 (en) Data processing method and apparatus
CN108564657B (zh) 一种基于云端的地图构建方法、电子设备和可读存储介质
WO2022193508A1 (fr) Procédé et appareil d'optimisation de posture, dispositif électronique, support de stockage lisible par ordinateur, programme d'ordinateur et produit-programme
WO2020221307A1 (fr) Procédé et dispositif pour suivre un objet mobile
CN111932616B (zh) 一种利用并行计算加速的双目视觉惯性里程计方法
CN111915675B (zh) 基于粒子漂移的粒子滤波点云定位方法及其装置和系统
CN110942474B (zh) 机器人目标跟踪方法、设备及存储介质
CN114323033B (zh) 基于车道线和特征点的定位方法、设备及自动驾驶车辆
CN114136315B (zh) 一种基于单目视觉辅助惯性组合导航方法及系统
CN111623773A (zh) 一种基于鱼眼视觉和惯性测量的目标定位方法及装置
WO2020014864A1 (fr) Procédé et dispositif de détermination de pose, et support de stockage lisible par ordinateur
CN112967340A (zh) 同时定位和地图构建方法、装置、电子设备及存储介质
CN115164936A (zh) 高精地图制作中用于点云拼接的全局位姿修正方法及设备
CN112556699B (zh) 导航定位方法、装置、电子设备及可读存储介质
CN113570716A (zh) 云端三维地图构建方法、系统及设备
CN116958452A (zh) 三维重建方法和系统
WO2023142353A1 (fr) Procédé et appareil de prédiction de pose
CN115727871A (zh) 一种轨迹质量检测方法、装置、电子设备和存储介质
CN114593735B (zh) 一种位姿预测方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22923162

Country of ref document: EP

Kind code of ref document: A1