CN118067118A - Positioning method, positioning system, robot, and computer-readable storage medium - Google Patents
- Publication number
- CN118067118A (application number CN202410208000.4A)
- Authority
- CN
- China
- Legal status: Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/183—Compensation of inertial measurements, e.g. for temperature effects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C22/00—Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manufacturing & Machinery (AREA)
- Position Fixing By Use Of Radio Waves (AREA)
Abstract
The embodiments of the application provide a positioning method, a positioning system, a robot, and a computer-readable storage medium. The positioning method comprises the following steps: obtaining a predicted state variable of the positioning system at time k based on motion parameters acquired by a prediction sensor at time k; acquiring the state measurement collected at time k by each of M measurement sensors among N measurement sensors, where M is a positive integer less than or equal to N; configuring the positioning system to enter a first fusion mode according to the working states of the N measurement sensors at time k, the first fusion mode being either a single fusion mode or a multiple fusion mode; and, once the positioning system has entered the first fusion mode, updating the predicted state variable at time k based on the state measurements collected at time k by those of the M measurement sensors that are in a valid working state, so as to obtain the positioning information of the positioning system at time k. The technical scheme provided by the embodiments of the application yields more accurate positioning information.
Description
Technical Field
The embodiments of the application relate to the technical field of positioning, and in particular to a positioning method, a positioning system, a robot, and a computer-readable storage medium.
Background
In general, in the field of automatic driving, prediction sensors such as an inertial odometer or a wheel odometer are installed on a mobile platform; they predict the state variable of the positioning system at the current moment from data acquired at the previous moment, thereby obtaining the positioning information of the mobile platform.
Disclosure of Invention
The embodiment of the application provides a positioning method, a positioning system, a robot and a computer readable storage medium, which can obtain more accurate positioning information.
In a first aspect, a positioning method is provided, applied to a positioning system that includes a prediction sensor and N measurement sensors, the N measurement sensors including a vision sensor and/or a pose sensor, N being a positive integer greater than 1. The positioning method includes: acquiring motion parameters collected by the prediction sensor at time k, and determining a predicted state variable of the positioning system at time k based on those motion parameters; acquiring the state measurement collected at time k by each of M measurement sensors among the N measurement sensors, where M is a positive integer less than or equal to N; determining whether the N measurement sensors are in a valid working state at time k, and configuring the positioning system to enter a first fusion mode according to the number of measurement sensors in a valid working state, the first fusion mode being a single fusion mode (only one measurement sensor is in a valid working state at time k) or a multiple fusion mode (several measurement sensors are in a valid working state at time k); and, once the positioning system has entered the first fusion mode, updating the predicted state variable at time k based on the state measurements collected at time k by those of the M measurement sensors that are in a valid working state, to obtain the positioning information of the positioning system at time k.
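As a rough illustration only, the four claimed steps could be sketched as follows. This is a hypothetical minimal sketch: the classes, the scalar state, and the toy `blend()` update are all invented for illustration and are not taken from the claims.

```python
# Hypothetical minimal sketch of the four claimed steps; all names and the
# toy blend() update are illustrative, not from the patent.

class FakeMeasureSensor:
    """Stand-in for a vision or pose sensor reporting a scalar state measurement."""
    def __init__(self, value, valid=True):
        self._value, self._valid = value, valid

    def is_valid(self):          # working-state check (step 3)
        return self._valid

    def read(self):              # state measurement at time k (step 2)
        return self._value

def blend(x, z, gain=0.5):
    # Toy stand-in for updating the predicted state with one measurement (step 4).
    return x + gain * (z - x)

def locate_at_k(x_pred, measure_sensors):
    # x_pred: predicted state variable at time k from the prediction sensor (step 1).
    active = [s for s in measure_sensors if s.is_valid()]
    if not active:
        return x_pred, "none"    # no valid sensor: keep the prediction
    mode = "single" if len(active) == 1 else "multiple"
    for s in active:             # update only with sensors in a valid state
        x_pred = blend(x_pred, s.read())
    return x_pred, mode
```

With one valid sensor the mode is "single" and only that sensor's measurement is fused; with several valid sensors the mode is "multiple" and each valid measurement is applied in turn.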
In a second aspect, a positioning system is provided, including a prediction sensor, N measurement sensors, and a main control chip, where the N measurement sensors include a vision sensor and/or a pose sensor and N is a positive integer greater than 1. The prediction sensor is used to acquire the motion parameters of its own body at time k; each of the N measurement sensors is used to scan the application scene in which the positioning system is located to collect measurement data at time k and thereby obtain a state measurement at time k; and the main control chip is configured to perform the positioning method of the first aspect, or any possible implementation thereof, to obtain the positioning information of the positioning system.
In a third aspect, a robot is provided, comprising a robot body and a positioning system as in the second aspect and any one of the possible implementation manners of the second aspect, the positioning system being mounted on the robot body for acquiring positioning information of the robot.
In a fourth aspect, a computer readable storage medium is provided for storing a computer program for causing a computer to perform the positioning method as in the first aspect and any one of the possible implementations of the first aspect.
Drawings
Fig. 1 shows a schematic diagram of an application scenario according to an embodiment of the present application.
Fig. 2 shows a first schematic block diagram of a positioning method according to an embodiment of the application.
Fig. 3 shows a second schematic block diagram of a positioning method according to an embodiment of the application.
Fig. 4 shows a third schematic block diagram of a positioning method according to an embodiment of the application.
Fig. 5 shows a fourth schematic block diagram of a positioning method according to an embodiment of the application.
Fig. 6 shows a schematic block diagram of a positioning device according to an embodiment of the application.
Fig. 7 shows a schematic block diagram of a positioning system of an embodiment of the application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "comprising" and "having" and any variations thereof in the description of the application and the claims and the description of the drawings above are intended to cover a non-exclusive inclusion. The terms first, second and the like in the description and in the claims or in the above-described figures, are used for distinguishing between different objects and not necessarily for describing a particular sequential or chronological order.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the described embodiments of the application may be combined with other embodiments.
The term "and/or" in the present application merely describes an association between objects and indicates that three kinds of relation may exist; for example, "A and/or B" may indicate that A exists alone, that A and B exist together, or that B exists alone. In the present application, the character "/" generally indicates an "or" relationship between the associated objects.
In general, in the field of automatic driving, prediction sensors such as an inertial odometer and/or a wheel odometer are installed on a mobile platform; they can predict the state variable of the positioning system at the current moment from data acquired at the previous moment, so that the positioning information of the mobile platform can be obtained.
Similarly, measurement sensors such as a vision sensor and/or a pose sensor may be installed on the mobile platform to measure the current state of the positioning system in real time and obtain a current state measurement of the sensor body.
In view of the above, the embodiment of the application provides a positioning method, which updates the predicted state variable of the positioning system obtained based on the prediction sensor by using the state measurement of the body obtained by the measurement sensor, so as to obtain more accurate positioning information of the positioning system. In addition, the working states of a plurality of measuring sensors of the positioning system are judged to configure the positioning system to enter an adaptive fusion mode, so that as long as the measuring sensors are in an effective working state, the predicted state variables can be updated based on the state measurement of the measuring sensors in the effective working state.
In certain embodiments, the mobile platform comprises at least one of an unmanned aerial vehicle, an automobile, a remote control vehicle, a robot, a camera. When the positioning system is applied to the unmanned aerial vehicle, the platform body is the body of the unmanned aerial vehicle. When the positioning system is applied to an automobile, the platform body is the body of the automobile. The vehicle may be an autonomous vehicle or a semi-autonomous vehicle, without limitation. When the positioning system is applied to a remote control car, the platform body is a car body of the remote control car. When the positioning system is applied to a robot, the platform body is a robot body. When the positioning system is applied to a camera, the platform body is the camera itself.
An example of an application of the positioning system according to the embodiment of the present application to a robot will be described, as shown in fig. 1. The robot 100 includes a robot body 110 and a positioning system 120, wherein the positioning system 120 is mounted on the robot body 110, and the positioning system 120 includes: a measurement sensor 121, configured to scan an application scene to obtain real-time measurement data; a prediction sensor 122 for recording a motion parameter of the prediction sensor 122 body while the measurement sensor 121 scans the application scene; the main control chip 123 is used for processing the measurement data obtained by scanning the measurement sensor 121 and the motion parameters recorded by the prediction sensor 122 to obtain the positioning information of the robot body in the application scene.
Fig. 2 shows a schematic block diagram of a positioning method 200 of an embodiment of the application. The positioning method 200 is applied to a positioning system having a prediction sensor and N measurement sensors, where N is a positive integer greater than 1 and the measurement sensors include vision sensors and/or pose sensors. The positioning method 200 may be executed by a main control chip.
In one embodiment, the main control chip may be a processor dedicated to processing the data collected by the measurement sensors and the prediction sensor; for example, it may be the main control chip 123 shown in Fig. 1.
In another embodiment, the main control chip may be a main processor of a mobile platform on which the positioning system is mounted.
As shown in fig. 2, the positioning method 200 may include some or all of the following.
S210, acquiring the motion parameters collected by the prediction sensor at time k, and determining a predicted state variable of the positioning system at time k based on those motion parameters;
S220, acquiring the state measurement collected at time k by each of M measurement sensors among the N measurement sensors, where M is a positive integer less than or equal to N;
S230, determining whether the N measurement sensors are in a valid working state at time k, and configuring the positioning system to enter a first fusion mode according to the number of measurement sensors in a valid working state, where the first fusion mode is a single fusion mode (only one measurement sensor is in a valid working state at time k) or a multiple fusion mode (several measurement sensors are in a valid working state at time k);
S240, when the positioning system has entered the first fusion mode, updating the predicted state variable at time k based on the state measurements collected at time k by those of the M measurement sensors that are in a valid working state, to obtain the positioning information of the positioning system at time k.
First, the state variable is the variable, estimated online by the fusion algorithm, that describes the state of the whole positioning system; it generally includes translation, rotation, velocity, biases, extrinsic parameters between sensors, and the like.
The prediction sensor collects the motion parameters of its own body at time k, and the predicted state variable at time k is the state variable of the positioning system at time k predicted from the state variable at time (k-1). In the embodiment of the application, the motion parameters collected by the prediction sensor at time k are used when computing the predicted state variable of the positioning system at time k. It should be noted that, after the positioning system is started or restarted, the predicted state variable is obtained by processing the motion parameters collected by the prediction sensor for the first time. In some embodiments, the prediction sensor may be an inertial odometer or a wheel odometer.
The measurement sensor collects the state measurement of its own body at time k. A state measurement may be a full-degree-of-freedom pose (6DOF) or a partial-degree-of-freedom pose (e.g., 3DOF) output by the measurement sensor through its own positioning algorithm; it generally includes translation and rotation, and may also include velocity. In one embodiment, the measurement sensors may include a plurality of vision sensors and/or pose sensors for scanning the application scene at time k to obtain the corresponding state measurement. Vision sensors generally include sensors capable of acquiring images or point-cloud data, such as cameras and lidars; pose sensors generally include sensors that provide positioning information, such as real-time kinematic (RTK), ultra-wideband (UWB), and Global Positioning System (GPS) receivers.
Different measurement sensors obtain their state measurement at time k in different ways. For example, when the measurement sensor is a vision sensor, its state measurement at time k is obtained from the feature-point position changes between two adjacent image frames, combined with the corresponding visual odometry (VO) or lidar odometry (LO) algorithm. When the measurement sensor is a pose sensor, its state measurement at time k is obtained from the collected measurement data and the corresponding algorithm.
Specifically, when the measurement sensor is a vision sensor such as a camera or a lidar, it outputs its 6DOF pose from the acquired image or point-cloud data combined with the corresponding VO or LO algorithm. When the measurement sensor is a pose sensor such as an RTK or GPS receiver, it outputs latitude, longitude, and altitude, which can be converted into an ENU (east-north-up) coordinate system to obtain a 3DOF pose (i.e., translations x, y, z in the ENU frame). In addition, the state measurement may also include velocity; for example, typical VO and LO algorithms can also output velocity.
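The latitude/longitude/altitude-to-ENU conversion mentioned above can be sketched as follows: a standard WGS-84 geodetic-to-ECEF conversion followed by a rotation into the tangent plane at a reference point. Function names are illustrative, not from the patent.

```python
import math

# Sketch of converting a pose sensor's latitude/longitude/altitude output into
# a local ENU (east-north-up) translation. WGS-84 ellipsoid constants.

_A = 6378137.0             # WGS-84 semi-major axis (m)
_F = 1.0 / 298.257223563   # WGS-84 flattening
_E2 = _F * (2.0 - _F)      # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = _A / math.sqrt(1.0 - _E2 * math.sin(lat) ** 2)   # prime vertical radius
    x = (n + alt) * math.cos(lat) * math.cos(lon)
    y = (n + alt) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - _E2) + alt) * math.sin(lat)
    return x, y, z

def geodetic_to_enu(lat_deg, lon_deg, alt, ref_lat_deg, ref_lon_deg, ref_alt):
    # ECEF offset from the reference point, rotated into its local ENU frame.
    x, y, z = geodetic_to_ecef(lat_deg, lon_deg, alt)
    xr, yr, zr = geodetic_to_ecef(ref_lat_deg, ref_lon_deg, ref_alt)
    dx, dy, dz = x - xr, y - yr, z - zr
    lat, lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy
          + math.sin(lat) * dz)
    return east, north, up
```

For example, a point 0.001° north of the reference at the equator maps to roughly 110.6 m of north translation, giving the 3DOF (x, y, z) pose described above.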
In some embodiments, the prediction sensor may send the motion parameters to the main control chip for processing in real time after the motion parameters are acquired. In other embodiments, the prediction sensor may also acquire the predicted state variable of the positioning system based on the acquired motion parameter, and then send the predicted state variable to the main control chip for processing.
Similarly, in some embodiments, after some measurement data is collected, the measurement sensor may send the measurement data to the main control chip in real time for processing, so as to obtain a status measurement of the measurement sensor. In other embodiments, the measurement sensor may also convert the obtained measurement data into a state measurement by an algorithm, and then send the state measurement to the main control chip for processing. In other embodiments, the measurement sensor may further process the state measurement after obtaining the state measurement, and send the processed measurement value (for example, an observed quantity described below) to the main control chip.
In one example, the main control chip may obtain the state measurements at time k of all measurement sensors in the positioning system (i.e., all N measurement sensors). In another example, it may obtain the state measurements at time k of only some of them (i.e., M measurement sensors, M less than N); for instance, if only some of the measurement sensors are in a valid working state at time k, the main control chip can obtain the state measurements at time k of only those sensors.
In some embodiments, the main control chip may determine at intervals whether the N measurement sensors in the positioning system are in a valid working state, and configure the positioning system to enter the adapted fusion mode according to the number of measurement sensors in a valid working state. The working state of a measurement sensor is either valid or invalid (i.e., disabled). The adapted fusion mode may be a single fusion mode or a multiple fusion mode: the single fusion mode corresponds to exactly one measurement sensor being in a valid working state at time k, and the multiple fusion mode corresponds to several measurement sensors being in a valid working state at time k. For example, if only one of the N measurement sensors is in a valid working state at time k, the positioning system is configured to enter the single fusion mode; if several of them are in a valid working state at time k, the positioning system is configured to enter the multiple fusion mode.
When the positioning system has entered the adapted fusion mode, updating the predicted state variable at time k based on the state measurements collected at time k by the measurement sensors in a valid working state among the M measurement sensors may proceed as follows. In the single fusion mode, only a single measurement sensor is in a valid working state, and the predicted state variable of the positioning system at time k is updated with the state measurement collected by that single sensor at time k to obtain the positioning information at time k. In the multiple fusion mode, several measurement sensors are in a valid working state, and the predicted state variable at time k is updated with the state measurements collected by those sensors at time k to obtain the positioning information at time k. It should be noted that, in the multiple fusion mode, if more than two measurement sensors are in a valid working state (for example, four), the state measurements of only two or three of them may be selected to update the predicted state variable at time k, as long as at least two measurement sensors are used for the update.
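One common way to realize both fusion modes with a uniform update rule is a sequential (measurement-by-measurement) Kalman update: one measurement corresponds to the single fusion mode, and several measurements applied in turn correspond to the multiple fusion mode. The scalar form below is an assumption for illustration; the patent's state is multi-dimensional and its exact filter is not specified here.

```python
# Hedged sketch: sequential Kalman update of a scalar predicted state.

def kalman_update(x, p, z, r):
    # x, p: predicted state and its variance; z, r: measurement and its variance.
    k = p / (p + r)                      # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

def fuse(x_pred, p_pred, measurements):
    # measurements: list of (z, r) pairs from sensors in a valid working state.
    # One entry -> single fusion mode; several entries -> multiple fusion mode.
    x, p = x_pred, p_pred
    for z, r in measurements:
        x, p = kalman_update(x, p, z, r)
    return x, p
```

Each additional valid measurement shrinks the posterior variance, which is why fusing several sensors yields more accurate positioning information than any single one.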
In this embodiment, the state measurements at time k obtained by the measurement sensors are used to update the predicted state variable at time k obtained from the prediction sensor, so the positioning information of the positioning system at time k can be obtained more accurately. In addition, by judging the working states of the N measurement sensors and configuring the positioning system to enter the adapted fusion mode, the predicted state variable at time k can be updated whenever at least one measurement sensor is in a valid working state, based on the state measurements at time k of the sensors in a valid working state.
Optionally, S210, i.e., determining the predicted state variable of the positioning system at time k based on the motion parameters collected by the prediction sensor at time k, includes: integrating the motion parameters collected at time k and combining the result with the positioning information of the positioning system at time (k-1) to determine the predicted state variable at time k.
Different types of prediction sensors obtain the predicted state variable of the positioning system in different ways. In one embodiment, when the prediction sensor is an inertial odometer, its motion parameters include acceleration and angular velocity, and the predicted state variable at time k is obtained by integrating the acceleration and angular velocity at time k and combining the result with the true-value estimated state at time (k-1). In another embodiment, when the prediction sensor is a wheel odometer, its motion parameters include linear velocity and angular velocity, and the predicted state variable at time k is obtained by integrating the linear and angular velocities at time k and combining the result with the true-value estimated state at time (k-1).
Further, the predicted state variable at time k can be obtained from the nominal-state kinematic equations, specifically the following equation 1:

p_k = p_{k-1} + v_{k-1}·Δt + ½·(R_{k-1}·(a_k − a_b) + g)·Δt²
v_k = v_{k-1} + (R_{k-1}·(a_k − a_b) + g)·Δt
q_k = q_{k-1} ⊗ q{(ω_k − ω_b)·Δt}          (Equation 1)

where x_k = (p_k, v_k, q_k) denotes the predicted state variable of the positioning system at time k; p_k denotes the predicted translation of the positioning system at time k in the global coordinate system, and p_{k-1} the true-value estimated translation at time (k-1) in the global coordinate system; v_k denotes the predicted velocity of the positioning system at time k in the global coordinate system, and v_{k-1} the true-value estimated velocity at time (k-1); R denotes the rotation matrix between the global coordinate system and the prediction-sensor coordinate system, which can be computed from the corresponding quaternion q; q_k is the predicted quaternion corresponding to the rotation at time k, and q_{k-1} the true-value quaternion at time (k-1); a_k denotes the acceleration observed at time k and ω_k the angular velocity observed at time k; a_b and ω_b denote the zero bias of the prediction sensor, whose values vary over time but follow a random walk and, because the prediction interval is short, can be treated as constant; g denotes gravitational acceleration; Δt denotes the time difference between time (k-1) and time k; and q{·} denotes the quaternion corresponding to a rotation vector.
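A minimal sketch of this nominal-state propagation (inertial-odometer case) might look as follows, using plain tuples and no external libraries. It assumes the quaternion q rotates vectors from the prediction-sensor body frame into the global frame; all helper names are illustrative.

```python
import math

# Minimal sketch of one step of the nominal-state kinematics (equation 1).

def quat_mul(q, r):
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_from_rotvec(v):
    # Quaternion q{v} for a rotation vector v = (omega - omega_b) * dt.
    angle = math.sqrt(sum(c * c for c in v))
    if angle < 1e-12:
        return (1.0, 0.0, 0.0, 0.0)
    s = math.sin(angle / 2.0) / angle
    return (math.cos(angle / 2.0), v[0] * s, v[1] * s, v[2] * s)

def rotate(q, v):
    # Rotate vector v by unit quaternion q (body -> global).
    w, x, y, z = quat_mul(quat_mul(q, (0.0,) + tuple(v)),
                          (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

def propagate(p, v, q, a_k, w_k, a_b, w_b, g, dt):
    # Bias-corrected body acceleration, rotated to the global frame, plus gravity.
    a_body = tuple(a - b for a, b in zip(a_k, a_b))
    a_world = tuple(r + gi for r, gi in zip(rotate(q, a_body), g))
    p_k = tuple(pi + vi * dt + 0.5 * ai * dt * dt
                for pi, vi, ai in zip(p, v, a_world))
    v_k = tuple(vi + ai * dt for vi, ai in zip(v, a_world))
    q_k = quat_mul(q, quat_from_rotvec(tuple((w - b) * dt
                                             for w, b in zip(w_k, w_b))))
    return p_k, v_k, q_k
```

For a stationary platform, the measured specific force (0, 0, 9.81) cancels gravity (0, 0, −9.81), so position, velocity, and orientation remain unchanged, as expected from equation 1.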
In one embodiment, determining in S230 whether the N measurement sensors are in a valid working state at time k includes: judging whether each of the N measurement sensors is in a valid working state at time k according to whether its state measurement at time k is received within a preset duration.
For example, a preset duration may be set, and timing starts when the state measurement of a measurement sensor at time (k-1) is received. If the state measurement at time k has not been received when the preset duration elapses, that state measurement may be considered lost, and the measurement sensor may be judged to be in an invalid working state at time k; otherwise, it may be judged to be in a valid working state at time k. For instance, when a pose sensor is occluded by buildings or trees, it may fail to output a normal state measurement at time k due to tracking failure, and it is then regarded as being in an invalid working state at time k.
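The timeout check described above could be sketched as a small watchdog that records when each sensor last reported; the class name, sensor identifiers, and timeout value are all illustrative assumptions.

```python
# Illustrative watchdog for the timeout check: a sensor is treated as being in
# a valid working state at time k only if its latest state measurement arrived
# within a preset duration of "now".

class SensorWatchdog:
    def __init__(self, timeout):
        self.timeout = timeout      # preset duration (seconds)
        self.last_rx = {}           # sensor id -> timestamp of last measurement

    def on_measurement(self, sensor_id, t):
        # Record the arrival time of a state measurement; timing restarts here.
        self.last_rx[sensor_id] = t

    def is_active(self, sensor_id, now):
        t = self.last_rx.get(sensor_id)
        return t is not None and (now - t) <= self.timeout
```

A sensor that has never reported, or whose last report is older than the preset duration (e.g., an RTK receiver occluded by buildings), is judged invalid.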
In another embodiment, determining whether the N measurement sensors are in an active operational state at time k includes: and judging whether the N measuring sensors are in an effective working state at the k moment according to the data differences acquired by the N measuring sensors at different moments. It should be noted that, the data difference may be a state measurement or an observed quantity described below, which is not limited by the embodiment of the present application.
That is, it is possible to determine whether the N measurement sensors are in an effective working state through divergence (runaway) detection.
In one example, determining whether the N measurement sensors are in a valid operating state at time k based on data differences acquired by the N measurement sensors at different times includes: and judging whether the N measuring sensors are in an effective working state at the k moment by comparing whether the covariance matrix of the latest state variable at the k moment of the positioning system is in a first preset range.
It should be noted that the state variables of the positioning system at time k may include a plurality of state variables; the latest state variable at time k refers to the state variable most recently updated using a state measurement of a measurement sensor at time k. That is, the state variables of the positioning system may be updated multiple times using the state measurements of the measurement sensors obtained at the same time, and the state variable obtained by the last update is the latest state variable at time k. When this method is used to judge the working state of a measurement sensor at time k for the first time in the multiple fusion mode, or to judge the working state of a measurement sensor at time k in the single fusion mode, the latest state variable at time k is the predicted state variable at time k.
Whether the N measurement sensors are in an effective working state at time k is judged by comparing whether the covariance matrix of the latest state variable at time k is within the first preset range. It can be understood that if the covariance matrix of the latest state variable at time k is within the first preset range, the measurement sensor is considered to be in an effective working state; otherwise, it is considered to be in an ineffective working state.
In another example, determining whether the N measurement sensors are in an effective working state at time k based on data differences acquired by the N measurement sensors at different times includes: respectively determining whether each of the N measurement sensors is in an effective working state at time k by comparing the difference between the latest state variable of the positioning system at time (k-1) and the latest state variable at time k against a first preset threshold.
Since the stage of acquiring the positioning information at the time k has been entered, the latest state variable at the time (k-1) can be understood as the state variable after the state variable of the positioning system is updated by using all the measuring sensors in the effective working state at the time (k-1), and can also be understood as the positioning information at the time (k-1) acquired by adopting the positioning method provided by the application.
The working states of the N measurement sensors at time k are determined by comparing the difference between the latest state variable at time k and the latest state variable at time (k-1) with the first preset threshold. It can be understood that if the difference corresponding to a certain measurement sensor is smaller than the first preset threshold, that measurement sensor may be considered to be in an effective working state; otherwise, it may be considered to be in an ineffective working state.
It should be noted that one state variable may include a plurality of variables, and different variables correspond to different first preset thresholds.
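The threshold comparison described above might be sketched as follows; the per-variable thresholds are passed as a dictionary, and all names are illustrative assumptions rather than part of the embodiment.

```python
import numpy as np

def sensor_valid_by_state_jump(prev_state, curr_state, thresholds):
    """Compare the difference between the latest state variables at times
    k-1 and k against per-variable thresholds; a jump at or above a
    threshold marks the sensor as invalid (a sign of divergence)."""
    for name, threshold in thresholds.items():
        diff = np.linalg.norm(np.asarray(curr_state[name]) - np.asarray(prev_state[name]))
        if diff >= threshold:
            return False
    return True
```

Each entry of `thresholds` corresponds to one variable of the state (e.g. translation and velocity having different first preset thresholds, as noted above).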
In other examples, determining whether the N measurement sensors are in a valid operating state at time k based on data differences acquired by the N measurement sensors at different times includes: and respectively judging whether the N measuring sensors are in an effective working state at the moment k by comparing the number of the stable characteristic data acquired by each measuring sensor at the moment k in the N measuring sensors with the number of the stable characteristic data acquired by the corresponding measuring sensor at the moment (k-1).
For example, when the measurement sensor is a vision sensor, if the number of stable feature data acquired by the vision sensor at time k is smaller, by more than a certain threshold, than the number acquired at time (k-1), the state measurement of the vision sensor at time k may be determined to be invalid, that is, the vision sensor is determined to be in an inactive working state at time k. For example, in scenes with weak texture or occlusion by moving objects, the number of stable feature data of the vision sensor at time k is too small, so the state measurement acquired by the vision sensor at time k is considered invalid, that is, the vision sensor is determined to be in an inactive working state at time k.
It should be noted that stable feature data may be understood as features carrying the desired information that can be collected by the measurement sensor over a period of time; for example, a feature having depth information over a period of time is regarded as stable feature data.
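A possible sketch of the stable-feature-count check; the ratio-based form of the test (`min_ratio`) is an assumption, since the embodiment only requires that the drop exceed "a certain threshold".

```python
def vision_sensor_valid(n_stable_k, n_stable_prev, min_ratio=0.5):
    """Treat the vision sensor as invalid at time k when the number of
    stable features drops below a fraction of the count at time k-1
    (e.g. weak texture or occlusion by moving objects)."""
    if n_stable_prev == 0:
        return n_stable_k > 0
    return n_stable_k >= min_ratio * n_stable_prev
```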
In some embodiments, in step S230, configuring the positioning system to enter the first fusion mode according to the number of measurement sensors in an effective working state includes: in the case that the positioning system has entered the multiple fusion mode, if only one of the N measurement sensors is detected to be in an effective working state, configuring the positioning system to enter the single fusion mode; or, in the case that the positioning system has entered the single fusion mode, if a plurality of the N measurement sensors are detected to be simultaneously in an effective working state, configuring the positioning system to enter the multiple fusion mode.
For example, after the positioning system is configured to enter the single fusion mode based on the working states of the N measurement sensors at the time k, the working states of the N measurement sensors at the time (k+1) may be continuously determined, and if the N measurement sensors have the plurality of measurement sensors in the effective working states at the time (k+1), the fusion mode of the positioning system may be switched from the single fusion mode to the multiple fusion mode, and otherwise, the positioning system may continuously maintain the single fusion mode.
For another example, after the positioning system is configured to enter the multiple fusion mode based on the working states of the N measurement sensors at the time k, the working states of the N measurement sensors at the time (k+1) can be continuously determined, if only one measurement sensor of the N measurement sensors at the time (k+1) is in an effective working state, the fusion mode of the positioning system can be switched from the multiple fusion mode to the single fusion mode, and otherwise, the positioning system continuously maintains the multiple fusion mode.
The above determination method may be used to determine the operating states of the N measurement sensors at any time, which is not limited in the embodiment of the present application.
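The mode-switching rule above reduces to counting the sensors currently in an effective working state; the `FusionMode` enumeration and `WAITING` state (for the case where all sensors have failed) are hypothetical names used only for illustration.

```python
from enum import Enum

class FusionMode(Enum):
    WAITING = 0   # no valid sensor data: wait or restart
    SINGLE = 1    # exactly one sensor in an effective working state
    MULTIPLE = 2  # several sensors in an effective working state

def next_fusion_mode(active_flags):
    """Pick the fusion mode from the sensors' current validity flags."""
    n_active = sum(bool(flag) for flag in active_flags)
    if n_active == 0:
        return FusionMode.WAITING
    if n_active == 1:
        return FusionMode.SINGLE
    return FusionMode.MULTIPLE
```

Re-evaluating this at every time step yields exactly the transitions described: single to multiple when a second sensor recovers, multiple to single when only one remains valid.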
Taking a positioning system with at least two measurement sensors as an example, assume the two measurement sensors are a vision sensor and a pose sensor. After the positioning system is started, whether the vision sensor and the pose sensor are in an effective working state can be judged in turn. If the pose sensor is in an effective working state, for example if the pose sensor can be successfully initialized, the positioning system is configured to enter the single fusion mode, in which the state variables of the positioning system are updated using the state measurements acquired by the pose sensor. Further, on the premise that the pose sensor is in an effective working state, whether the vision sensor is in an effective working state continues to be judged. If the vision sensor is also in an effective working state, the relative transformation matrix between the pose sensor and the vision sensor is calculated; if the relative transformation matrix is successfully acquired, the positioning system is configured to enter the multiple fusion mode. In the multiple fusion mode, the state measurements acquired by the vision sensor can be converted into the same coordinate system as the pose sensor through the relative transformation matrix, so that the state measurements of the pose sensor and the vision sensor can be fused, and the state variables of the positioning system can be predicted and updated using the state measurements of the multiple measurement sensors.
In the case that the positioning system has entered the multiple fusion mode, divergence (runaway) detection can be added to the fusion process. If the vision sensor is in an ineffective working state, the predicted state variable of the positioning system can be updated using only the state measurements acquired by the pose sensor to obtain the positioning information of the positioning system; similarly, if the pose sensor is in an ineffective working state, the predicted state variable of the positioning system may be updated using only the state measurements acquired by the vision sensor to obtain the positioning information of the positioning system.
In one embodiment, successfully obtaining the relative transformation matrix may include: the pose sensor and the vision sensor can obtain reasonable relative poses through a track alignment method.
It should be noted that, whether the positioning system is in the single fusion mode or the multiple fusion mode, if all the measurement sensors are detected to be in the failure state, the positioning system may be configured to enter the data waiting stage until valid data is generated or the positioning system is restarted.
In this embodiment, when the positioning system is in the single fusion mode, the working states of the N measurement sensors are continuously monitored, so that as soon as a plurality of measurement sensors are in an effective working state, the positioning system can be configured to enter the multiple fusion mode to update its state variables, which can improve the accuracy of the positioning information. Likewise, when the positioning system is in the multiple fusion mode, the working states of the N measurement sensors are continuously monitored, so that as soon as a measurement sensor enters a failure state, its state measurements are excluded when updating the state variables of the positioning system, ensuring the accuracy of the data and effectively improving the accuracy of the positioning information output by the positioning system.
In one embodiment, as shown in fig. 3, S240, that is, in the case that the positioning system enters the first fusion mode, updates a predicted state variable at the k time based on a state measurement at the k time acquired by a measurement sensor in an active working state from M measurement sensors, to acquire positioning information at the k time of the positioning system, including:
S241, acquiring observed quantity of a corresponding measuring sensor at the moment k based on state measurement acquired by the measuring sensor in an effective working state at the moment k;
S242, under the condition that the positioning system enters a first fusion mode, determining an error state variable of the positioning system at the k moment based on a predicted state variable of the positioning system at the k moment and an observed quantity of the measuring sensor in an effective working state at the k moment;
S243, updating the predicted state variable at the k moment based on the error state variable at the k moment of the positioning system, and acquiring positioning information at the k moment of the positioning system.
In one embodiment, obtaining an observed quantity of a measurement sensor based on a status measurement of the measurement sensor may include: the state measurement of the measuring sensor is directly used as the observed quantity of the measuring sensor. For example, the pose obtained by the measurement result output by the measurement sensor through the algorithm may be used as the observed quantity of the measurement sensor. In other embodiments, the observed quantity of the measurement sensor may also be obtained by performing a certain process on the state measurement of the measurement sensor, which is not limited by the embodiment of the present application.
The initial error state variable may be derived by differentiating the nominal state kinematics equation, such as equation 2 below:
Wherein [·]_× represents the skew-symmetric (antisymmetric) matrix operator, θ represents the rotation angle, and η_v, η_θ, η_a and η_ω are noise terms, each of which obeys a Gaussian distribution.
In some embodiments, the nominal state variables (which may also be referred to as up-to-date state variables, e.g., predicted state variables) of the positioning system are defined according to the nominal state kinematic equations described above:
The error state variable is defined according to equation 2 above as:
The latest state variable and the error state variable generally satisfy generalized addition: the true-value estimated state is the sum of the nominal state variable and the error state variable, specifically the following equation 3:
Further, after the true-value estimated state of the positioning system at time k is obtained, the latest state variable of the positioning system can be predicted using the true-value estimated state at time k and the motion parameters at time (k+1) recorded by the prediction sensor. In this way, independent pose recursion of the prediction sensor is realized to obtain the real-time pose of the positioning system, ensuring the real-time performance of control.
It should be noted that, since the error state motion equation is derived from the nominal state motion equation, the corresponding error state variable can be obtained synchronously when the predicted state variable of the positioning system is obtained. However, because the prediction sensor does not account for noise in the process of obtaining the predicted state variable, the obtained error state variable is biased; in order to obtain an accurate state variable of the positioning system, the error state variable needs to be corrected.
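The generalized addition of equation 3 (true-value estimated state = nominal state composed with error state) can be sketched as follows, assuming a [w, x, y, z] quaternion convention and a small rotation error δθ; all names are illustrative.

```python
import numpy as np

def box_plus(nominal, error):
    """Generalized addition: vector parts add directly, while the rotation
    error delta_theta composes multiplicatively with the nominal quaternion."""
    p = np.asarray(nominal["p"]) + np.asarray(error["dp"])
    v = np.asarray(nominal["v"]) + np.asarray(error["dv"])
    w, x, y, z = nominal["q"]
    # small-angle quaternion built from the rotation error vector
    dw = np.concatenate(([1.0], 0.5 * np.asarray(error["dtheta"])))
    q = np.array([
        w*dw[0] - x*dw[1] - y*dw[2] - z*dw[3],
        w*dw[1] + x*dw[0] + y*dw[3] - z*dw[2],
        w*dw[2] - x*dw[3] + y*dw[0] + z*dw[1],
        w*dw[3] + x*dw[2] - y*dw[1] + z*dw[0],
    ])
    return {"p": p, "v": v, "q": q / np.linalg.norm(q)}
```

With a zero error state the true-value estimate coincides with the nominal state, which is the expected behavior before any correction is applied.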
Optionally, if the first fusion mode is a multiple fusion mode, the positioning method 200 further includes: the observed quantity at k-time of the plurality of measurement sensors is fused before the error state variable at k-time of the positioning system is determined based on the predicted state variable at k-time of the positioning system and the observed quantity at k-time of the plurality of measurement sensors in an active operating state.
In some embodiments, when the positioning system is in the multiple fusion mode, whether the measurement sensors are of different types or the same type, the observed quantities of the plurality of measurement sensors may be used to sequentially update the state variables of the positioning system; for example, the plurality of measurement sensors in an effective working state at time k may sequentially and iteratively update the state variables of the positioning system. For example, the error state variable of the system can be obtained using the state measurement of the first measurement sensor at time k, and the predicted state variable at time k is then updated using the error state variable obtained from the first measurement sensor; next, the corresponding error state variable is acquired using the state measurement of the second measurement sensor at time k, and the already-updated state variable at time k is updated again using the error state variable corresponding to the second measurement sensor; and so on, until the state variable at time k has been updated using the error state variables corresponding to all measurement sensors at time k.
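The sequential update just described can be illustrated with a one-dimensional sketch, in which each sensor's measurement is fused in turn and each correction is applied before the next measurement is processed; the scalar observation model is an assumption made purely for illustration.

```python
import numpy as np

def sequential_update(x, P, measurements):
    """Sequentially fuse scalar measurements of a 1-D state: each sensor's
    residual produces a correction applied before the next is processed."""
    for z, r_var in measurements:            # r_var: measurement noise variance
        H = np.array([[1.0]])                # direct observation of the state
        S = H @ P @ H.T + r_var              # innovation covariance
        K = P @ H.T / S                      # Kalman gain
        x = x + (K * (z - H @ x)).reshape(x.shape)
        P = (np.eye(1) - K @ H) @ P          # covariance shrinks with each update
    return x, P
```

Each additional valid sensor tightens the estimate, which is why fusing several sensors sequentially improves the accuracy of the final state variable at time k.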
In other embodiments, when the positioning system is in the multiple fusion mode and the measuring sensors are of the same type, the fusion observed quantity is obtained by fusing the state measurements of the plurality of measuring sensors in the effective working state at the time k, and the error state variable of the positioning system is obtained based on the fusion observed quantity and the predicted state variable of the positioning system, so that the predicted state variable at the time k of the positioning system is updated once by using the error state variable, the update times are reduced, and the calculated amount is reduced.
In some embodiments, as shown in fig. 4, S242, that is, in the case that the positioning system enters the first fusion mode, determining the error state variable at the k time of the positioning system based on the predicted state variable at the k time of the positioning system and the observed quantity at the k time of the measuring sensor in the active operation state includes:
s2421, under the condition that the positioning system enters a first fusion mode, determining an observation estimated value at the k moment of a measuring sensor in an effective working state according to a nonlinear transformation function in a preset observation equation and a predicted state variable at the k moment of the positioning system;
S2422, calculating the error between the observed estimated quantity at the time k and the observed quantity at the time k of the measuring sensor in the effective working state;
s2423, calculating a Jacobian matrix of a preset observation equation relative to the error state variable;
S2424, acquiring a Kalman gain matrix at the moment k according to the Jacobi matrix;
S2425, determining an error state variable at the time of k based on the Kalman gain matrix at the time of k and the error between the observed estimation amount and the observed amount at the time of k.
Typically, the observed quantity of the measurement sensor satisfies an observation equation in the broad sense, i.e., it is related to the latest state variable of the positioning system through a nonlinear function. For example, the observation equation in the broad sense may be the following equation 4:
Wherein h() represents a nonlinear function, and h() differs between observation equations; the observed quantity z_k is obtained by substituting the latest state variable x′_k of the positioning system into the nonlinear function h() and superimposing Gaussian noise β, where β represents Gaussian noise with a mean of 0 and a standard deviation of V.
In one embodiment, the observation estimate of the measurement sensor at time k may be obtained by directly substituting the latest state variable of the positioning system at time k into h(). For example, in the single fusion mode, or in a multiple fusion mode in which only one update is performed, the observation estimate of the measurement sensor at time k can be obtained by substituting the predicted state variable of the positioning system at time k into h().
Since the observation estimate is calculated without taking system noise into account, there is some error between the observed quantity and the observation estimate. Alternatively, the error may be calculated by the following equation 5:
r = z_k − h(x′_k) (equation 5)
In one embodiment, the Jacobian matrix of the preset observation equation relative to the error state variable can be calculated by the following equation 6:
where x′ is the latest state variable of the positioning system.
In some embodiments, the error state variable may be obtained by the following equation 7:
Wherein K_k is the Kalman gain matrix corresponding to time k, P̄_k is the covariance matrix of the predicted state variable at time k, V is the noise matrix of the observed quantity, and δx_k is the error state variable at time k.
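Equations 5 and 7 together might be sketched as follows; `h`, `H` and `V` are supplied by the caller, and the dense matrix inverse is used only for clarity.

```python
import numpy as np

def error_state_update(x_pred, P_pred, z, h, H, V):
    """Equations 5 and 7: residual, Kalman gain, and error-state estimate."""
    r = z - h(x_pred)                          # equation 5: measurement residual
    S = H @ P_pred @ H.T + V                   # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain of equation 7
    delta_x = K @ r                            # error state variable at time k
    return delta_x, K, r
```

With an identity observation model and equal prediction and measurement covariances, the gain is 0.5 and the correction moves the state halfway toward the measurement, matching the usual Kalman-filter intuition.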
The application of two nonlinear functions h() will be described in detail below. It should be noted that, for brevity, time subscripts are omitted below; all quantities refer to time k.
Specifically, description is made with the measurement sensor including a pose sensor.
In one embodiment, assuming that the coordinates of the pose sensor in the predicted sensor coordinate system at time k are IyG, then, according to the relative transformation matrix between the prediction sensor and the measurement sensor, the coordinates of the pose sensor in the global coordinate system W are obtained by calculating the following equation 8:
Wherein WpI denotes the translation from the predicted sensor coordinate system to the global coordinate system W, and the rotation matrix in equation 8 represents the rotation from the predicted sensor coordinate system to the global coordinate system; both can be understood as components of the latest state variable of the positioning system. Equation 8 plays the role of the nonlinear function h() in the observation equation, and WyG is the observation estimate of the pose sensor.
It should be noted that the global coordinate system in the embodiments of the present application refers to the common coordinate system into which the data obtained by each measurement sensor is converted; it may be a coordinate system other than that of the prediction sensor, as long as the data of all measurement sensors are located in the same coordinate system.
Alternatively, the global coordinate system may be a coordinate system corresponding to the prediction sensor, that is, after obtaining the state measurement of each measurement sensor, the latest state variable (for example, the predicted state variable) of the positioning system under the global coordinate system needs to be converted into the coordinate system of a certain measurement sensor to obtain the observed estimated value of each measurement sensor.
In another embodiment, the coordinates IyG of the pose sensor in the predicted sensor coordinate system at time k may be converted to the corresponding longitude, latitude and altitude in the ENU coordinate system; for example, the coordinates of the pose sensor in the global coordinate system W obtained by equation 8 above may be further converted via equation 9 to obtain the coordinates of the pose sensor in the ENU coordinate system.
Wherein EpW is the translation matrix between the ENU coordinate system and the global coordinate system W, and the rotation matrix in equation 9 is the rotation between the ENU coordinate system and the global coordinate system W; equation 9 may be expressed as the nonlinear function h() in the observation equation, and EyG is the observation estimate of the pose sensor.
After the observed estimator of the pose sensor is obtained by equations 8 and 9, the error between the observed amount of the pose sensor and the observed estimator may be further obtained based on equation 5.
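Assuming equations 8 and 9 are rigid-body transforms of the form y = p + R·x (consistent with the translation symbols WpI and EpW above), the two observation estimates might be sketched as:

```python
import numpy as np

def observe_in_global(W_p_I, W_R_I, I_y_G):
    """Equation 8 (assumed form): map the pose sensor coordinates from the
    predicted sensor frame into the global frame W."""
    return W_p_I + W_R_I @ I_y_G

def observe_in_enu(E_p_W, E_R_W, W_y_G):
    """Equation 9 (assumed form): further map the global-frame coordinates
    into the ENU frame."""
    return E_p_W + E_R_W @ W_y_G
```

Chaining the two functions converts a point from the predicted sensor frame all the way to ENU, which is the composition described in the text.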
In the process of obtaining the error state variable, the initial value of the error state variable is derived from the nominal state equation; that is, when the predicted state variable is obtained, the initial value of the corresponding error state variable can be obtained synchronously. Because this initial value does not account for the noise introduced when obtaining the predicted state variable from the nominal state equation, the error state variable needs to be corrected in order to obtain an accurate state variable of the system.
Further, the Jacobian matrix of the preset observation equation with respect to the error state variable is obtained by substituting the initial value of the error state variable into equation 6, as shown in equation 10.
Based on equation 10, the Jacobian matrix currently corresponding to the pose sensor can be calculated; the Jacobian matrix and the error between the observed quantity and the observation estimate are then substituted into equation 7 to correct the initial value of the error state variable, thereby obtaining the corrected error state variable.
In some embodiments, the positioning method 200 may further include: determining a covariance matrix at the moment k according to the Kalman gain matrix at the moment k and the Jacobian matrix at the moment k; and determining whether the error state variable at the moment k meets the requirement of the positioning system according to whether the covariance matrix at the moment k is in a second preset range.
S242, determining an error state variable at time k of the positioning system based on a predicted state variable at time k of the positioning system and an observed quantity at time k of a measurement sensor in an active operation state, including:
And under the condition that the error state variable at the moment k meets the requirement of the positioning system, updating the predicted state variable at the moment k based on the error state variable at the moment k of the positioning system, and acquiring the positioning information at the moment k of the positioning system.
For example, the covariance matrix at k time can be obtained by the following equation 11:
Wherein I is the identity matrix, K_k is the Kalman gain matrix corresponding to time k, P̄_k is the covariance matrix of the predicted state variable at time k, H is the Jacobian matrix of the preset observation equation with respect to the error state variable, and P_k is the covariance matrix corresponding to time k.
Because the values of h() differ at different times, the Jacobian matrix of the preset observation equation with respect to the error state variable differs, and the resulting covariance matrix also differs. Therefore, whether the currently calculated error state variable meets the requirements of the positioning system can be judged from the covariance matrix: if it does, the predicted state variable of the positioning system is updated using the obtained error state variable to obtain the true-value estimated state; otherwise, if the covariance matrix does not meet the requirements of the positioning system, the positioning system is restarted.
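Equation 11 and the second-preset-range check might be sketched as follows; checking the covariance via its trace is an assumption made for illustration, since the embodiment does not fix the form of the range test.

```python
import numpy as np

def update_covariance(K, H, P_pred):
    """Equation 11: P_k = (I - K_k H) P_pred."""
    I = np.eye(P_pred.shape[0])
    return (I - K @ H) @ P_pred

def covariance_in_range(P, max_trace):
    """Accept the error state only while the covariance stays inside the
    second preset range (here expressed via the trace, an assumption)."""
    return np.trace(P) <= max_trace
```

If `covariance_in_range` fails, the system would fall back to the restart path described above instead of applying the correction.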
Alternatively, the second preset range in this embodiment may be the same as or different from the first preset range described above, which is not limited by the embodiment of the present application.
In addition, before the state measurements of the measurement sensors are converted to the same coordinate system, the positioning system may be initialized to obtain initial values of the state variables and of the covariance matrix of the positioning system; these initial values are then updated through the state measurements of each measurement sensor to obtain the optimal state variables of the positioning system. It should be noted that the initial values of the state variables and covariance matrix exist only at the initial moment of the positioning system (for example, time 0 of the positioning system) and are updated as each subsequent measurement sensor continues to produce measurements, that is, they become the updated state variables of the positioning system and the covariance matrix corresponding to the updated state variables.
In one embodiment, the measuring sensor in the active working state in the M measuring sensors includes a first measuring sensor and a second measuring sensor, as shown in fig. 5, the positioning method 200 further includes:
S250, obtaining the observed quantity of the first measuring sensor at the k moment according to the state measurement of the first measuring sensor at the k moment;
S260, obtaining the observed quantity of the second measuring sensor at the k moment according to the state measurement of the second measuring sensor at the k moment;
S240, namely, under the condition that the positioning system enters a first fusion mode, updating a predicted state variable at the k moment based on state measurement at the k moment acquired by a measuring sensor in an effective working state in M measuring sensors, and acquiring positioning information at the k moment of the positioning system, wherein the method comprises the following steps:
S244, under the condition that the positioning system enters a multiple fusion mode, estimating a second acquisition time of the observed quantity of the (k+1) time of the first measuring sensor according to a first acquisition time of the observed quantity of the k time of the first measuring sensor and a preset frame rate of the first measuring sensor;
S245, if the observed quantity of the second measuring sensor at the k moment is not acquired between the first acquiring moment and the second acquiring moment, waiting to acquire the observed quantity of the second measuring sensor at the k moment;
S246, after the observables of the first measuring sensor and the second measuring sensor are obtained, the observables of the first measuring sensor and the second measuring sensor are utilized to update the predicted state variable at the k moment obtained before the observables at the k moment of the first measuring sensor, and the positioning information at the k moment of the positioning system is obtained.
Specifically, although the prediction sensor and each measurement sensor can perform measurements synchronously, the measurement data of the prediction sensor and the measurement sensors need to be processed before the state variables of the positioning system can be updated, and the time required for data processing is unstable. As a result, the main control chip acquires the observed quantities of the plurality of measurement sensors (for example, the first measurement sensor and the second measurement sensor) asynchronously. When the positioning system is in the multiple fusion mode, the main control chip can update the state variables of the positioning system sequentially according to the state measurements acquired by the plurality of measurement sensors at the same moment, and the optimal state variable at a certain moment is obtained only after multiple updates, which makes the amount of computation excessive. In the embodiments of the present application, the acquisition time of the next observed quantity of one measurement sensor (for example, the first measurement sensor) may be estimated from the acquisition time of its current observed quantity and its preset frame rate, where this measurement sensor is the one that acquires its observed quantity earliest among the measurement sensors, and the preset frame rate of the measurement sensor is the inverse of the time interval between two adjacent state outputs of the measurement sensor.
Then, based on the time period between the acquisition time of the observed quantity at the current moment and the estimated acquisition time of the observed quantity at the next moment, it is judged whether the observed quantities of the other measurement sensors fall within this time period. If the observed quantities of the other measurement sensors are not received within this time period, the system continues to wait until they are acquired; the acquired observed quantities of the plurality of measurement sensors are then fused to obtain a fused observed quantity, and finally the predicted state variable of the positioning system is updated through the fused observed quantity of the plurality of measurement sensors to obtain the true-value estimated state of the positioning system.
In this embodiment, it must be ensured that the moment at which the predicted state variable of the positioning system is obtained from the motion parameter of the prediction sensor is not later than the acquisition moment of the observed quantity of any measurement sensor.
The following describes time synchronization by taking as an example the case where, with the positioning system in the multiple fusion mode, the state variables of the positioning system are updated with the state measurements of the pose sensor and the vision sensor, respectively. Specifically, the time recorded by the vision sensor is the time corresponding to the image, and processing the image to obtain the observed quantity takes a certain amount of time. Because the processing time for two consecutive frames is unstable while the timestamp interval between the two frames is stable, the output time of the observed quantity corresponding to the next frame can be estimated from the output time of the observed quantity corresponding to the current frame and the preset frame rate of the vision sensor; the actual output time of the next observed quantity is not limited by this estimate. Within the time period between the output time of the observed quantity of the current frame and the estimated output time of the observed quantity of the next frame, if the observed quantity of the pose sensor at the current moment has not been obtained, the main control chip does not immediately process the observed quantity of the vision sensor at the current moment. Instead, it delays the update, waits until the observed quantity of the pose sensor at the current moment is obtained, immediately fuses it with the observed quantity of the vision sensor at the current moment, and then updates the predicted state variable of the positioning system based on the fused observed quantity.
In this way, even though the observed quantities of the measurement sensors used in the multiple fusion mode arrive asynchronously, high-precision positioning information can be obtained with a single update, and the robustness of multi-sensor fusion is improved.
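The delayed-update strategy described above can be sketched as follows; the function names and the dictionary layout are illustrative assumptions, not taken from the patent.

```python
def estimate_next_output(t_k, frame_rate_hz):
    """Estimate the output time of the next observation from the output time
    of the current observation and the preset frame rate, where the frame
    rate is the inverse of the interval between two adjacent outputs."""
    return t_k + 1.0 / frame_rate_hz

def collect_for_fusion(obs_vision, obs_pose):
    """Delay the update until the pose-sensor observation for the same moment
    is available, then return both observations for a single fused update."""
    if obs_pose is None:
        return None  # keep waiting for the pose sensor's observation
    return {"vision": obs_vision, "pose": obs_pose}
```

A caller would poll `collect_for_fusion` until the time returned by `estimate_next_output`, then perform one update of the predicted state variable with the fused result.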
Having described the positioning method of the embodiments of the present application in detail, a positioning device according to an embodiment of the present application is described below with reference to fig. 6; the technical features described in the method embodiments are equally applicable to the following device embodiments.
Fig. 6 shows a schematic block diagram of a positioning device 300 according to an embodiment of the application. The positioning device is applied to a positioning system that comprises a prediction sensor and N measurement sensors, the N measurement sensors comprising a vision sensor and/or a pose sensor, where N is a positive integer greater than 1. As shown in fig. 6, the positioning device 300 may comprise some or all of the following.
An obtaining unit 310, configured to obtain a motion parameter acquired by a prediction sensor at a time k, determine a prediction state variable at a time k of the positioning system based on the motion parameter acquired by the prediction sensor at the time k, and obtain a state measurement acquired by each of M measurement sensors among the N measurement sensors at the time k, where M is a positive integer less than or equal to N;
The configuration unit 320 is configured to determine whether the N measurement sensors are in an effective working state at time k, and configure the positioning system to enter a first fusion mode according to the number of measurement sensors in the effective working state, where the first fusion mode is an independent fusion mode or a multiple fusion mode, the independent fusion mode corresponds to that only one measurement sensor is in the effective working state at time k, and the multiple fusion mode corresponds to that a plurality of measurement sensors are in the effective working state at time k;
and the processing unit 330 is configured to update a predicted state variable at the k moment based on the state measurement at the k moment acquired by the measuring sensor in the effective working state in the M measuring sensors when the positioning system enters the first fusion mode, and acquire positioning information at the k moment of the positioning system.
It should be understood that the positioning device 300 according to the embodiment of the present application may correspond to the execution subject in the embodiment of the positioning method 200 of the embodiment of the present application, and the above and other operations and/or functions of each unit in the positioning device 300 are respectively for implementing the corresponding flows in each of the methods of fig. 2 to 5, and are not repeated herein for brevity.
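As a rough illustration of the logic of the configuration unit 320, the mode decision can be sketched from the number of measurement sensors in an effective working state; the enum and function names below are assumptions for illustration only.

```python
from enum import Enum

class FusionMode(Enum):
    INDEPENDENT = 1  # exactly one measurement sensor is in an effective working state
    MULTIPLE = 2     # several measurement sensors are in an effective working state

def configure_fusion_mode(active_flags):
    """Choose the first fusion mode according to how many measurement
    sensors are in an effective working state at time k."""
    n_active = sum(1 for flag in active_flags if flag)
    if n_active == 0:
        return None  # no valid observation at time k: skip the update
    return FusionMode.INDEPENDENT if n_active == 1 else FusionMode.MULTIPLE
```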
Optionally, as shown in fig. 7, the embodiment of the present application further provides a positioning system 400. The positioning system 400 includes a prediction sensor 410, N measurement sensors 420 and a main control chip 430. The prediction sensor 410 is used to obtain the motion parameter at time k; each of the N measurement sensors is used to scan the application scene in which the positioning system is located to obtain measurement data at time k, from which the state measurement at time k is obtained. The main control chip 430 is used to obtain the motion parameter collected by the prediction sensor at time k and determine the predicted state variable of the positioning system at time k based on that motion parameter; to acquire, based on the measurement data at time k of M measurement sensors among the N measurement sensors, the state measurement of each of the M measurement sensors at time k, where M is a positive integer less than or equal to N; to judge whether the N measurement sensors are in an effective working state at time k and configure the positioning system to enter a first fusion mode according to the number of measurement sensors in the effective working state, where the first fusion mode is the independent fusion mode or the multiple fusion mode, the independent fusion mode corresponding to only one measurement sensor being in the effective working state at time k and the multiple fusion mode corresponding to a plurality of measurement sensors being in the effective working state at time k; and, when the positioning system enters the first fusion mode, to update the predicted state variable at time k based on the state measurements at time k obtained by the measurement sensors in the effective working state among the M measurement sensors, and to acquire the positioning information of the positioning system at time k.
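The update of the predicted state variable performed by the main control chip follows the usual error-state Kalman measurement update (compare claims 6 to 8). A minimal numerical sketch, assuming for simplicity a linear observation model z = H x, so that the Jacobian of the observation equation is H itself:

```python
import numpy as np

def eskf_update(x_pred, P_pred, z, H, R):
    """Error-state Kalman update: form the residual between the observed
    quantity and the observation estimate, compute the Kalman gain from the
    Jacobian H, then correct the predicted state and the covariance."""
    residual = z - H @ x_pred               # error at time k
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain matrix at time k
    dx = K @ residual                       # error state variable at time k
    x_new = x_pred + dx                     # updated state (true-value estimate)
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred  # covariance at time k
    return x_new, P_new
```

Checking whether the returned covariance lies within a preset range corresponds to the validity test described in claim 8.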
Optionally, the main control chip 430 may correspond to the positioning device 300 shown in fig. 6.
Optionally, as shown in fig. 7, the positioning system 400 may further include a memory 440. The main control chip 430 may call and run a computer program stored in the memory 440 to implement the positioning method 200 of the embodiments of the present application.
The memory 440 may be a separate device independent of the main control chip 430, or may be integrated into the main control chip 430.
Optionally, the embodiment of the present application further provides a robot, including a robot body and the positioning system 400, where the positioning system 400 may be installed on the robot body, and is used to obtain positioning information of the robot.
Optionally, the embodiment of the present application further provides a chip comprising a processor, where the processor can call and run a computer program from a memory to implement the positioning method of the embodiments of the present application.
Optionally, the embodiment of the present application further provides a computer-readable storage medium for storing a computer program, where the computer program causes a computer to execute the corresponding flow in the positioning method of the embodiments of the present application; for brevity, the description is not repeated here.
The embodiment of the application also provides a computer program product, which comprises computer program instructions for causing a computer to execute the corresponding flow in the positioning method of the embodiment of the application, and for brevity, the description is omitted.
The foregoing is merely a specific implementation of the present application, and the protection scope of the present application is not limited thereto; any variation or substitution that a person skilled in the art could readily conceive within the technical scope disclosed herein shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (12)
1. A positioning method, characterized in that it is applied to a positioning system, the positioning system comprising a prediction sensor and N measurement sensors, the N measurement sensors comprising a vision sensor and/or a pose sensor, N being a positive integer greater than 1, the positioning method comprising:
Acquiring motion parameters acquired by the prediction sensor at the moment k, and determining a prediction state variable at the moment k of the positioning system based on the motion parameters acquired by the prediction sensor at the moment k;
acquiring state measurement acquired at the moment k by each of M measuring sensors in the N measuring sensors, wherein M is a positive integer less than or equal to N;
Judging whether the N measuring sensors are in an effective working state at the moment k, configuring the positioning system to enter a first fusion mode according to the number of the measuring sensors in the effective working state, wherein the first fusion mode is an independent fusion mode or a multiple fusion mode, the independent fusion mode corresponds to that only one measuring sensor is in the effective working state at the moment k, and the multiple fusion mode corresponds to that a plurality of measuring sensors are in the effective working state at the moment k;
And under the condition that the positioning system enters the first fusion mode, updating a predicted state variable at the k moment based on state measurement at the k moment acquired by a measuring sensor in an effective working state in the M measuring sensors, and acquiring positioning information at the k moment of the positioning system.
2. The positioning method according to claim 1, characterized in that the positioning method further comprises:
determining whether the N measuring sensors are in an effective working state at the k moment according to whether the state measurements of the N measuring sensors at the k moment are obtained within a preset duration; or
determining whether the N measuring sensors are in an effective working state at the k moment according to differences between the data acquired by the N measuring sensors at different moments.
3. The positioning method according to claim 2, wherein the determining whether the N measurement sensors are in a valid operation state at time k according to the data differences acquired by the N measurement sensors at different times includes:
determining whether the N measuring sensors are in an effective working state at the k moment by checking whether a covariance matrix of the latest state variable of the positioning system at the k moment is within a first preset range; or
determining whether the N measuring sensors are in an effective working state at the k moment by comparing the difference between the latest state variable at the (k-1) moment and the latest state variable at the k moment of the positioning system with a first preset threshold; or
determining whether the N measuring sensors are in an effective working state at the k moment by comparing the number of stable feature data acquired by each of the N measuring sensors at the k moment with the number of stable feature data acquired by the corresponding measuring sensor at the (k-1) moment.
4. The positioning method according to claim 1, characterized in that the positioning method further comprises:
Under the condition that the positioning system has entered the multiple fusion mode, if it is detected that only one of the N measuring sensors is in an effective working state, configuring the positioning system to enter the independent fusion mode; or
Under the condition that the positioning system has entered the independent fusion mode, if it is detected that a plurality of the N measuring sensors are simultaneously in an effective working state, configuring the positioning system to enter the multiple fusion mode.
5. The positioning method according to claim 1, wherein a measurement sensor of the M measurement sensors that is in an active operation state at a time k includes a first measurement sensor and a second measurement sensor, the positioning method further comprising:
Acquiring the observed quantity of the first measuring sensor at the k moment according to the state measurement of the first measuring sensor at the k moment;
Acquiring the observed quantity of the second measuring sensor at the k moment according to the state measurement of the second measuring sensor at the k moment;
Under the condition that the positioning system enters the first fusion mode, updating a predicted state variable at the k moment based on state measurement at the k moment acquired by a measuring sensor in an effective working state in the M measuring sensors, and acquiring positioning information at the k moment of the positioning system, wherein the method comprises the following steps:
Estimating a second acquisition time of the observed quantity at the (k+1) time of the first measurement sensor according to a first acquisition time of the observed quantity at the k time of the first measurement sensor and a preset frame rate of the first measurement sensor under the condition that the positioning system enters the multiple fusion mode;
If the observed quantity of the k moment of the second measuring sensor is not acquired between the first acquiring moment and the second acquiring moment, waiting to acquire the observed quantity of the k moment of the second measuring sensor;
after the observed quantity of the first measuring sensor and the observed quantity of the second measuring sensor are both obtained, updating the predicted state variable at the k moment, which was obtained before the observed quantity of the first measuring sensor at the k moment, by using the observed quantity of the first measuring sensor and the observed quantity of the second measuring sensor, and acquiring the positioning information of the positioning system at the k moment.
6. The positioning method according to claim 1, wherein, when the positioning system enters the first fusion mode, the predicting state variable at the k moment is updated based on the state measurement at the k moment acquired by the measuring sensor in the active working state among the M measuring sensors, and the acquiring the positioning information at the k moment of the positioning system includes:
Based on the state measurement acquired by the measuring sensor in the effective working state at the moment k, acquiring the observed quantity of the corresponding measuring sensor at the moment k;
Under the condition that the positioning system enters the first fusion mode, determining an error state variable of the positioning system at the k moment based on a predicted state variable of the positioning system at the k moment and an observed quantity of the measuring sensor in an effective working state at the k moment;
And updating the predicted state variable at the k moment based on the error state variable at the k moment of the positioning system, and acquiring positioning information at the k moment of the positioning system.
7. The positioning method according to claim 6, wherein the determining the error state variable at the k-time of the positioning system based on the predicted state variable at the k-time of the positioning system and the observed quantity at the k-time of the measuring sensor in the active operation state in the case where the positioning system enters the first fusion mode includes:
under the condition that the positioning system enters the first fusion mode, according to a nonlinear transformation function in a preset observation equation and a predicted state variable of the positioning system at the k moment, determining an observation estimated value of the measuring sensor at the k moment in an effective working state;
calculating an error between the observation estimated value at the k moment and the observed quantity at the k moment of the measuring sensor in the effective working state;
calculating a Jacobian matrix of the preset observation equation relative to an error state variable;
Acquiring a Kalman gain matrix at the moment k according to the Jacobian matrix;
And determining an error state variable of the k moment based on the Kalman gain matrix of the k moment and the error of the k moment.
8. The positioning method of claim 7, wherein the positioning method further comprises:
Determining a covariance matrix of the k moment according to the Kalman gain matrix of the k moment and the Jacobian matrix of the k moment;
determining whether the error state variable at the k moment accords with the requirement of the positioning system according to whether the covariance matrix at the k moment is in a second preset range or not;
The determining the error state variable of the k moment of the positioning system based on the predicted state variable of the k moment of the positioning system and the observed quantity of the k moment of the measuring sensor in an effective working state comprises the following steps:
and under the condition that the error state variable at the k moment meets the requirement of a positioning system, updating the predicted state variable at the k moment based on the error state variable at the k moment of the positioning system, and acquiring the positioning information at the k moment of the positioning system.
9. The positioning method according to any one of claims 1 to 8, wherein the obtaining a predicted state variable at time k of the positioning system based on the motion parameter obtained by the prediction sensor at time k includes:
And integrating the motion parameters acquired at the moment k and combining the positioning information at the moment (k-1) of the positioning system to acquire the predicted state variable at the moment k.
10. The positioning system is characterized by comprising a prediction sensor, N measurement sensors and a main control chip, wherein the N measurement sensors comprise vision sensors and/or pose sensors, and N is a positive integer greater than 1;
the prediction sensor is used for acquiring the motion parameters of the prediction sensor body at the moment k;
Each measuring sensor in the N measuring sensors is used for scanning an application scene where the positioning system is located to acquire measuring data at the moment k so as to acquire state measurement at the moment k;
the main control chip is used for executing the positioning method according to any one of claims 1 to 9 to obtain positioning information of the positioning system.
11. A robot comprising a robot body and a positioning system according to claim 10, said positioning system being mounted on said robot body for obtaining positioning information of said robot.
12. A computer-readable storage medium storing a computer program for causing a computer to execute the positioning method according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410208000.4A CN118067118A (en) | 2024-02-23 | 2024-02-23 | Positioning method, positioning system, robot, and computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118067118A true CN118067118A (en) | 2024-05-24 |
Family
ID=91106956
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410208000.4A Pending CN118067118A (en) | 2024-02-23 | 2024-02-23 | Positioning method, positioning system, robot, and computer-readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118067118A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||