CN111402333B - Parameter estimation method, device, equipment and medium - Google Patents

Info

Publication number
CN111402333B
CN111402333B (application CN202010175358.3A)
Authority
CN
China
Prior art keywords
obstacle
determining
estimated
parameter
observed value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010175358.3A
Other languages
Chinese (zh)
Other versions
CN111402333A (en
Inventor
赵政
王昊
王亮
马彧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202010175358.3A priority Critical patent/CN111402333B/en
Publication of CN111402333A publication Critical patent/CN111402333A/en
Application granted granted Critical
Publication of CN111402333B publication Critical patent/CN111402333B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T3/06
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Abstract

The embodiment of the application discloses a parameter estimation method, a device, equipment and a medium, relating to the technical field of data processing and in particular to automatic driving technology. The specific implementation scheme is as follows: acquiring point cloud data of an obstacle and an observed value of a parameter to be estimated of the obstacle; determining the deviation degree of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated; and determining the estimated value of the parameter to be estimated according to the deviation degree of the observed value, so as to correct the observed value of the parameter to be estimated. The embodiment of the application thereby realizes accurate estimation of obstacle parameters.

Description

Parameter estimation method, device, equipment and medium
Technical Field
The embodiment of the application relates to the technical field of data processing, in particular to an automatic driving technology. Specifically, the embodiment of the application provides a parameter estimation method, a device, equipment and a medium.
Background
Obstacle attributes such as orientation and center strongly influence the motion planning of an unmanned vehicle and the prediction of obstacle behavior in an unmanned driving system. Accurate orientation and center information can reduce the emergency-brake rate and takeover rate of unmanned vehicles.
When a current unmanned driving system estimates the orientation and center of an obstacle, the attribute information calculated from the point cloud coordinates and the geometric information of the point cloud is inaccurate due to occlusion and other factors. Erroneous attribute information has a large adverse impact on obstacle tracking and trajectory prediction.
Disclosure of Invention
The embodiment of the application provides a parameter estimation method, device, equipment and medium, so as to realize accurate estimation of obstacle parameters.
The embodiment of the application provides a parameter estimation method, which comprises the following steps:
acquiring point cloud data of an obstacle and an observed value of a parameter to be estimated of the obstacle;
determining the deviation degree of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated;
and determining the estimated value of the parameter to be estimated according to the deviation degree of the observed value so as to correct the observed value of the parameter to be estimated.
According to the embodiment of the application, the deviation degree of the observed value is determined from the point cloud data of the obstacle and the observed value of the parameter to be estimated, and the estimated value of the parameter to be estimated is then determined from that deviation degree. Since the deviation degree reflects how far the observed value departs from the true value, introducing this observation noise into the estimation of the parameter to be estimated improves the accuracy of the estimated value. The observed value can then be corrected according to the estimated value of the parameter to be estimated.
Further, the determining the deviation degree of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated includes:
and if the parameter to be estimated is the direction of the obstacle, determining the direction deviation degree according to the point cloud data of the obstacle and the observed value of the direction of the obstacle.
Based on this technical feature, when the parameter to be estimated is the orientation of the obstacle, the embodiment of the application determines the orientation deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle orientation.
Further, the determining the direction deviation degree according to the point cloud data of the obstacle and the observed value of the direction of the obstacle comprises the following steps:
projecting the point cloud data of the obstacle to a target plane, wherein the target plane is parallel to the ground;
determining a minimum circumscribing polygon of a projection point in the target plane;
determining the orientation of the minimum circumscribed polygon according to the two vertex coordinate vectors of the longest side in the minimum circumscribed polygon;
and determining the orientation deviation degree according to the observed value of the orientation of the minimum circumscribing polygon and the orientation of the obstacle.
Based on the technical characteristics, the embodiment of the application projects the point cloud data of the obstacle to a target plane; determining a minimum circumscribing polygon of a projection point in the target plane; and determining the orientation of the minimum circumscribed polygon according to the coordinate vectors of the two vertexes of the longest side in the minimum circumscribed polygon. Since the orientation of the minimum circumscribing polygon is relatively close to the true orientation of the obstacle, subtracting the orientation of the minimum circumscribing polygon from the observed value of the obstacle orientation can result in the degree of deviation of the orientation.
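The steps above can be sketched in code. The sketch below assumes 2D (x, y) ground-plane projections and uses a convex hull in place of the minimum circumscribing polygon (the text does not fix the polygon algorithm); the orientation is recovered from the two vertices of the longest hull edge:

```python
import math

def hull_orientation(points):
    """Estimate obstacle orientation from ground-plane projections.

    points: list of (x, y) tuples -- the obstacle point cloud projected
    onto a plane parallel to the ground.  Returns the angle (radians)
    of the longest edge of the convex hull, used here as a stand-in
    for the minimum circumscribing polygon described in the text.
    """
    pts = sorted(set(points))

    def half(seq):
        # Andrew's monotone-chain half-hull construction.
        h = []
        for p in seq:
            while len(h) >= 2 and (
                (h[-1][0] - h[-2][0]) * (p[1] - h[-2][1])
                - (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0])) <= 0:
                h.pop()
            h.append(p)
        return h[:-1]

    hull = half(pts) + half(pts[::-1])
    # The two vertices of the longest edge give the orientation vector.
    longest = max(
        zip(hull, hull[1:] + hull[:1]),
        key=lambda e: (e[1][0] - e[0][0]) ** 2 + (e[1][1] - e[0][1]) ** 2)
    (x0, y0), (x1, y1) = longest
    return math.atan2(y1 - y0, x1 - x0)
```

Subtracting this angle from the observed orientation then yields the orientation deviation degree described above.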
Further, the determining the estimated value of the parameter to be estimated according to the deviation of the observed value to correct the observed value of the parameter to be estimated includes:
determining an estimated value of the parameter to be estimated according to the orientation deviation degree and the angular velocity deviation degree;
the angular velocity deviation is determined from the point cloud data of the obstacle.
Based on the technical characteristics, the embodiment of the application estimates the parameter to be estimated according to the two dimensions of the orientation deviation degree and the angular velocity deviation degree, so that the estimation accuracy of the parameter to be estimated is further improved.
Further, determining the angular velocity deviation from the point cloud data of the obstacle includes:
acquiring at least two temporally consecutive frames of projection images, wherein the at least two frames of projection images are obtained by projecting at least two temporally consecutive frames of point cloud data onto a target plane, and the target plane is parallel to the ground;
determining a first angle change quantity of the obstacle orientation associated with a first frame of projection image and a last frame of projection image in the at least two frames of projection images and a second angle change quantity of the obstacle orientation associated with two adjacent frames of projection images in the at least two frames of projection images;
And determining the angular velocity deviation degree according to the first angle change amount and the second angle change amount.
Based on this technical feature, the embodiment of the application determines a first angle change amount of the obstacle orientation associated with the first and last frames of the at least two frames of projection images, and a second angle change amount of the obstacle orientation associated with each pair of adjacent frames; the angular velocity deviation degree is then determined from the first angle change amount and the second angle change amounts.
Further, the determining the angular velocity deviation according to the first angle variation and the second angle variation includes:
and calculating the angular velocity deviation degree by taking the second angle change amount as the variable and the first angle change amount as the mean.
Based on this technical feature, the embodiment of the application calculates the angular velocity deviation degree by taking the second angle change amount as the variable and the first angle change amount as the mean, thereby determining the angular velocity deviation degree of the observed value from the first and second angle change amounts.
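This calculation admits a compact sketch: the per-adjacent-frame orientation changes serve as samples, the first-to-last change serves as their mean (averaged per interval, which is an assumption of this sketch), and the deviation degree comes out as a sample variance:

```python
def angular_velocity_deviation(orientations):
    """Deviation degree (a variance) of the angular-velocity observation.

    orientations: per-frame obstacle orientations (radians) from at
    least two temporally consecutive projection images.  The change
    between the first and last frame, divided by the number of
    intervals (an assumption of this sketch), plays the role of the
    mean ("first angle change amount"); the per-adjacent-frame changes
    are the samples ("second angle change amounts").
    """
    n = len(orientations)
    if n < 2:
        raise ValueError("need at least two frames")
    mean_delta = (orientations[-1] - orientations[0]) / (n - 1)
    deltas = [b - a for a, b in zip(orientations, orientations[1:])]
    return sum((d - mean_delta) ** 2 for d in deltas) / len(deltas)
```

A steadily rotating obstacle thus has zero angular velocity deviation, while jittery frame-to-frame orientations yield a large one.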
Further, the determining the deviation degree of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated includes:
and if the parameter to be estimated is the center of the obstacle, determining the center deviation degree according to the point cloud data of the obstacle and the observed value of the center of the obstacle.
Based on this technical feature, when the parameter to be estimated is the center of the obstacle, the embodiment of the application determines the center deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle center.
Further, the determining the degree of center deviation according to the point cloud data of the obstacle and the observed value of the center of the obstacle comprises the following steps:
determining the relative position relationship between the vehicle and the obstacle according to the observed value of the center of the obstacle;
determining a visible face of the obstacle, which is a surface of the obstacle viewed from a perspective of the vehicle, according to the determined relative positional relationship;
extracting the point cloud data of the visible face from the point cloud data of the obstacle;
and determining, according to the point cloud data of the visible face, the probability that the visible face is an actual face of the obstacle, and taking the probability as the center deviation degree.
Based on this technical feature, the embodiment of the application determines the visible face of the obstacle, and then determines, from the point cloud data of the visible face, the probability that the visible face is an actual face of the obstacle, taking that probability as the center deviation degree.
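One plausible reading of this step, sketched below under stated assumptions (the text gives no formula), is: project the visible-face points onto the ground plane, where a true face is roughly a line segment; fit the principal line of the points; and take the inlier fraction as the probability:

```python
import math

def face_probability(points, inlier_tol=0.1):
    """Probability that projected visible-face points form a face.

    points: (x, y) coordinates of the visible face projected onto the
    ground plane.  The probability is taken as the fraction of points
    within inlier_tol metres of the total-least-squares line through
    the centroid -- one plausible reading of "the probability that the
    visible face is an actual face"; the exact formula is an
    assumption of this sketch.
    """
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # Principal direction of the centred points (total least squares).
    sxx = sum((p[0] - mx) ** 2 for p in points)
    syy = sum((p[1] - my) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    nx, ny = -math.sin(theta), math.cos(theta)   # unit normal of the line
    inliers = sum(
        1 for p in points
        if abs((p[0] - mx) * nx + (p[1] - my) * ny) <= inlier_tol)
    return inliers / n
```

A cleanly observed face yields a probability near 1, while an occluded or noisy face yields a lower value and hence a larger center deviation degree.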
Further, after the visible face of the obstacle is determined according to the determined relative positional relationship, the method further includes:
if the obstacle is a vehicle, determining a mapping relation between the visible surface and a vehicle size model;
and calculating a vehicle center according to the mapping relation and the vehicle size model, and updating the observed value of the obstacle center by using the calculated vehicle center.
Based on the technical characteristics, the embodiment of the application calculates the vehicle center according to the mapping relation between the visible surface of the obstacle and the vehicle size model, and updates the observed value of the obstacle center by using the calculated vehicle center, thereby realizing correction of the observed value of the obstacle center.
Further, before the determining the mapping relationship between the visible surface and the vehicle size model, the method further includes:
Establishing a vehicle size model according to the minimum vehicle size;
comparing the observed value of the vehicle size with the size of the vehicle size model;
and according to the comparison result, adjusting the size of the vehicle size model so that the size similarity between the vehicle size model and the vehicle is larger than the size similarity between the observed value of the vehicle size and the vehicle.
The minimum vehicle size refers to the minimum size of vehicles on the market.
Based on the technical characteristics, the embodiment of the application compares the observed value of the vehicle size with the size of the vehicle size model; and according to the comparison result, adjusting the size of the vehicle size model so that the size similarity between the vehicle size model and the vehicle is larger than the size similarity between the observed value of the vehicle size and the vehicle, and enabling the vehicle size of the vehicle size model to gradually approximate to the real size of the vehicle.
Further, according to the vehicle size of the adjusted vehicle size model, an optimal estimation of the vehicle size can be achieved.
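The adjustment loop above can be sketched as follows. The running-maximum update rule and the seed numbers are assumptions of this sketch; the text only requires the adjusted model to track the true size more closely than a raw observation does:

```python
def update_size_model(model_size, observed_size):
    """Adjust the vehicle size model toward the true size.

    Occlusion makes observed dimensions under-estimates, so a simple
    monotone rule -- keep the running maximum of observations, seeded
    with the minimum vehicle size -- brings the model closer to the
    true size than any single observation.  The specific update rule
    is an assumption; the text only requires the adjusted model to be
    more similar to the true size than the raw observation.
    """
    return tuple(max(m, o) for m, o in zip(model_size, observed_size))

# Seed with the minimum size of vehicles on the market (hypothetical
# numbers), then fold in observations frame by frame.
MIN_VEHICLE_SIZE = (3.0, 1.5, 1.4)   # length, width, height in metres
model = MIN_VEHICLE_SIZE
for obs in [(3.8, 1.4, 1.3), (4.2, 1.7, 1.5)]:
    model = update_size_model(model, obs)
```

Each update can only move the model dimensions upward from the minimum size, so the model gradually approximates the real size of the vehicle as more of it becomes visible.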
Further, the determining the estimated value of the parameter to be estimated according to the deviation degree of the observed value includes:
Acquiring the deviation degree of the predicted value of the parameter to be estimated;
determining the predicted value of the parameter to be estimated according to the point cloud data of the obstacle and the deviation degree of the predicted value of the parameter to be estimated;
and determining the estimated value of the parameter to be estimated according to the observed value of the parameter to be estimated, the deviation degree of the observed value, the predicted value of the parameter to be estimated and the deviation degree of the predicted value.
Based on this technical feature, the embodiment of the application obtains a predicted value of the parameter to be estimated, and determines the estimated value of the parameter to be estimated according to the observed value, the deviation degree of the observed value, the predicted value, and the deviation degree of the predicted value.
Further, the determining the estimated value of the parameter to be estimated according to the observed value of the parameter to be estimated, the deviation degree of the observed value, the predicted value of the parameter to be estimated and the deviation degree of the predicted value includes:
determining the observation reliability and the prediction reliability of the parameter to be estimated according to the deviation degree of the observation value and the deviation degree of the prediction value;
According to the observation credibility and the prediction credibility, determining the observation weight and the prediction weight of the parameter to be estimated;
and determining the estimated value of the parameter to be estimated according to the observation weight, the prediction weight, the observed value and the predicted value of the parameter to be estimated.
Based on the technical characteristics, the embodiment of the application determines the observation weight and the prediction weight of the parameter to be estimated according to the deviation degree of the observation value and the deviation degree of the prediction value; and according to the observation weight, the prediction weight, the observation value and the prediction value of the parameter to be estimated, accurate estimation of the parameter to be estimated is achieved.
The embodiment of the application also provides a parameter estimation device, which comprises:
the data acquisition module is used for acquiring point cloud data of the obstacle and an observed value of a parameter to be estimated of the obstacle;
the variance determining module is used for determining the deviation degree of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated;
and the parameter estimation module is used for determining the estimated value of the parameter to be estimated according to the deviation degree of the observed value so as to correct the observed value of the parameter to be estimated.
Further, the variance determining module includes:
and the variance determining unit is used for determining the direction deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle direction if the parameter to be estimated is the direction of the obstacle.
Further, the variance determining unit includes:
a projection subunit, configured to project point cloud data of the obstacle to a target plane, where the target plane is parallel to the ground;
a polygon determination subunit, configured to determine a minimum circumscribing polygon of a projection point in the target plane;
the orientation determining subunit is used for determining the orientation of the minimum circumscribed polygon according to the coordinate vectors of the two vertexes of the longest side in the minimum circumscribed polygon;
and the variance determining subunit is used for determining the orientation deviation degree by subtracting the orientation of the minimum circumscribed polygon from the observed value of the obstacle orientation.
Further, the parameter estimation module includes:
the parameter estimation unit is used for determining an estimated value of the parameter to be estimated according to the orientation deviation degree and the angular velocity deviation degree;
the angular velocity deviation is determined from the point cloud data of the obstacle.
Further, the parameter estimation unit includes:
the image acquisition subunit is used for acquiring at least two temporally consecutive frames of projection images, wherein the at least two frames of projection images are obtained by projecting at least two temporally consecutive frames of point cloud data onto a target plane, and the target plane is parallel to the ground;
the angle determining subunit is used for determining a first angle change quantity of the obstacle orientation associated with the first frame of projection image and the last frame of projection image in the at least two frames of projection images and a second angle change quantity of the obstacle orientation associated with the adjacent two frames of projection images in the at least two frames of projection images;
and the variance determining subunit is used for determining the angular speed deviation degree according to the first angle change amount and the second angle change amount.
Further, the variance determining subunit is specifically configured to:
and calculating the angular velocity deviation degree by taking the second angle change amount as the variable and the first angle change amount as the mean.
Further, the variance determining module includes:
and the variance determining unit is used for determining the center deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle center if the parameter to be estimated is the center of the obstacle.
Further, the variance determining unit includes:
a position determining subunit, configured to determine a relative positional relationship between the own vehicle and the obstacle according to an observed value of the center of the obstacle;
a visible face determination subunit configured to determine a visible face of the obstacle, the visible face being a surface of the obstacle that is seen based on a view angle of the own vehicle, according to the determined relative positional relationship;
a data extraction subunit, configured to extract point cloud data of the visible surface from point cloud data of the obstacle;
and the variance determining subunit is used for determining, according to the point cloud data of the visible face, the probability that the visible face is an actual face of the obstacle, and taking the probability as the center deviation degree.
Further, the apparatus further comprises:
the relation determining module is used for determining the mapping relation between the visible surface and the vehicle size model if the obstacle is a vehicle after determining the visible surface of the obstacle according to the determined relative position relation;
and the observation value updating module is used for calculating a vehicle center according to the mapping relation and the vehicle size model and updating the observation value of the obstacle center by using the calculated vehicle center.
Further, the apparatus further comprises:
the model building module is used for building a vehicle size model according to the minimum vehicle size before determining the mapping relation between the visible surface and the vehicle size model;
a dimension comparison module for comparing an observed value of a vehicle dimension with a dimension of the vehicle dimension model;
and the size adjustment module is used for adjusting the size of the vehicle size model according to the comparison result so that the size similarity between the vehicle size model and the vehicle is larger than the size similarity between the observed value of the vehicle size and the vehicle.
Further, the parameter estimation module includes:
the variance acquisition unit is used for acquiring the deviation degree of the predicted value of the parameter to be estimated;
a predicted value determining unit, configured to determine a predicted value of the parameter to be estimated according to the point cloud data of the obstacle and a degree of deviation of the predicted value of the parameter to be estimated;
and the estimated value determining unit is used for determining the estimated value of the parameter to be estimated according to the observed value of the parameter to be estimated, the deviation degree of the observed value, the predicted value of the parameter to be estimated and the deviation degree of the predicted value.
Further, the estimation value determining unit includes:
the credibility determining subunit is used for determining the observation credibility and the prediction credibility of the parameter to be estimated according to the deviation degree of the observation value and the deviation degree of the prediction value;
the weight determining subunit is used for determining the observation weight and the prediction weight of the parameter to be estimated according to the observation credibility and the prediction credibility;
and the estimation value determining subunit is used for determining the estimation value of the parameter to be estimated according to the observation weight, the prediction weight, the observation value and the prediction value of the parameter to be estimated.
The embodiment of the application also provides electronic equipment, which comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the embodiments of the present application.
Embodiments of the present application also provide a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any of the embodiments of the present application.
Drawings
The drawings are included to provide a better understanding of the present application and are not to be construed as limiting the application. Wherein:
FIG. 1 is a flow chart of a parameter estimation method according to a first embodiment of the present application;
FIG. 2 is a flow chart of a parameter estimation method according to a second embodiment of the present application;
FIG. 3 is a flow chart of a parameter estimation method according to a third embodiment of the present application;
FIG. 4 is a flowchart of an obstacle orientation estimation method according to a fourth embodiment of the present application;
FIG. 5 is a flowchart of an obstacle center estimation method according to a fourth embodiment of the present application;
FIG. 6 is a schematic structural diagram of a parameter estimation device according to a fifth embodiment of the present application;
FIG. 7 is a block diagram of an electronic device for implementing a parameter estimation method according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will now be described with reference to the accompanying drawings, in which various details of the embodiments of the present application are included to facilitate understanding, and are to be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
First embodiment
Fig. 1 is a flowchart of a parameter estimation method according to a first embodiment of the present application. This embodiment is applicable to the case of accurately estimating a parameter to be estimated from the point cloud data of an obstacle and an observed value of that parameter, so as to correct the observed value. The method may be performed by a parameter estimation device, which may be implemented in software and/or hardware. Referring to fig. 1, the parameter estimation method provided by the embodiment of the present application includes:
s110, acquiring point cloud data of an obstacle and an observation value of a parameter to be estimated of the obstacle.
The parameter to be estimated may be any attribute parameter of the obstacle. Typically, the parameter to be estimated is the orientation of the obstacle or the centre of the obstacle.
The orientation of an obstacle refers to the direction in which the front face of the obstacle faces. Specifically, if the obstacle is a vehicle, the direction of the obstacle refers to the direction in which the vehicle head faces.
Alternatively, the observed value of the parameter to be estimated may be obtained by measurement or calculation, which is not limited in this embodiment.
S120, determining the deviation degree of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated.
Wherein the degree of deviation of the observed value reflects the degree to which the observed value deviates from a true value.
Typically, the degree of deviation may be represented by a noise variance.
Specifically, determining, according to the point cloud data of the obstacle and the observed value of the parameter to be estimated, a degree of deviation of the observed value includes:
if the parameter to be estimated is the direction of the obstacle, determining the direction deviation according to the point cloud data of the obstacle and the observed value of the direction of the obstacle;
and if the parameter to be estimated is the center of the obstacle, determining the center deviation degree according to the point cloud data of the obstacle and the observed value of the center of the obstacle.
The observed value of the orientation of the obstacle is the orientation obtained by measurement or calculation, prior to correction.
The degree of deviation of the orientation refers to the degree of deviation of the observed value of the orientation of the obstacle. The degree of deviation of the orientation reflects the degree to which the observed value of the orientation of the obstacle deviates from the true value of the orientation of the obstacle.
The degree of center deviation refers to the degree of deviation of the observed value of the center of the obstacle. The degree of decentration reflects the degree to which the observed value of the center of the obstacle deviates from the true value of the center of the obstacle.
S130, determining the estimated value of the parameter to be estimated according to the deviation degree of the observed value so as to correct the observed value of the parameter to be estimated.
Specifically, the determining the estimated value of the parameter to be estimated according to the deviation degree of the observed value to correct the observed value of the parameter to be estimated includes:
acquiring the deviation degree of the predicted value of the parameter to be estimated;
determining the predicted value of the parameter to be estimated according to the point cloud data of the obstacle and the deviation degree of the predicted value of the parameter to be estimated;
and determining the estimated value of the parameter to be estimated according to the observed value of the parameter to be estimated, the deviation degree of the observed value, the predicted value of the parameter to be estimated and the deviation degree of the predicted value.
Specifically, the determining the estimated value of the parameter to be estimated according to the observed value of the parameter to be estimated, the deviation degree of the observed value, the predicted value of the parameter to be estimated and the deviation degree of the predicted value includes:
and inputting the observed value of the parameter to be estimated, the deviation degree of the observed value, the predicted value of the parameter to be estimated and the deviation degree of the predicted value into a Kalman filter, and outputting the estimated value of the parameter to be estimated.
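A minimal sketch of this fusion step, using the textbook scalar Kalman update rather than the patent's exact filter:

```python
def kalman_update(pred, pred_var, obs, obs_var):
    """One scalar Kalman-filter update step.

    Fuses the predicted value (with its deviation degree, here a
    variance) and the observed value (with its variance) into an
    estimate.  A lower variance means higher credibility and hence a
    larger weight, matching the weighting scheme described in the
    text.  This is the standard scalar form, given as an illustration.
    """
    gain = pred_var / (pred_var + obs_var)   # weight of the observation
    estimate = pred + gain * (obs - pred)
    est_var = (1.0 - gain) * pred_var        # deviation of the estimate
    return estimate, est_var
```

With equal variances the estimate is the midpoint of prediction and observation; as the observation variance grows, the estimate leans toward the prediction, which is exactly the corrective behaviour the method aims for.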
Typically, the determining the estimated value of the parameter to be estimated according to the observed value of the parameter to be estimated, the deviation degree of the observed value, the predicted value of the parameter to be estimated and the deviation degree of the predicted value includes:
Determining the observation reliability and the prediction reliability of the parameter to be estimated according to the deviation degree of the observation value and the deviation degree of the prediction value;
according to the observation credibility and the prediction credibility, determining the observation weight and the prediction weight of the parameter to be estimated;
and determining the estimated value of the parameter to be estimated according to the observation weight, the prediction weight, the observed value and the predicted value of the parameter to be estimated.
The observation reliability refers to the reliability of the observation value of the parameter to be estimated.
The prediction reliability refers to the reliability of the predicted value of the parameter to be estimated.
The observation weight refers to the weight of the observation value of the parameter to be estimated.
The predicted weight refers to the weight of the predicted value of the parameter to be estimated.
Specifically, determining the estimated value of the parameter to be estimated according to the observed weight and the predicted weight, and the observed value and the predicted value of the parameter to be estimated, including:
according to the observation weight and the prediction weight, carrying out weighted summation on the observation value and the prediction value of the parameter to be estimated;
and dividing the weighted summation result by the sum of the two weights to obtain the estimated value of the parameter to be estimated.
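The weighting scheme above can be sketched as follows; modeling reliability as the inverse of the deviation degree is an assumption for illustration, and the weighted sum is normalized by the total weight (equivalent to dividing by two when the two weights sum to two):

```python
def fuse_weighted(obs, obs_dev, pred, pred_dev):
    """Weighted fusion of an observed value and a predicted value.

    Reliability is modeled (one plausible choice) as the inverse of the
    deviation degree, the weights follow the reliabilities, and the
    weighted sum is normalized by the total weight.
    """
    w_obs = 1.0 / obs_dev    # observation weight from observation reliability
    w_pred = 1.0 / pred_dev  # prediction weight from prediction reliability
    return (w_obs * obs + w_pred * pred) / (w_obs + w_pred)

# An observation twice as noisy as the prediction is pulled toward the
# prediction: the result lands a third of the way from 1.0 to 2.0.
estimate = fuse_weighted(obs=2.0, obs_dev=0.2, pred=1.0, pred_dev=0.1)
```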
According to the technical scheme, the deviation degree of the observed value is determined according to the point cloud data of the obstacle and the observed value of the parameter to be estimated, so that the estimated value of the parameter to be estimated is determined according to the deviation degree of the observed value. Since the degree of deviation of the observed value reflects the degree of deviation of the observed value, introducing the noise deviation of the observed value into the estimation of the parameter to be estimated can improve the accuracy of the estimated value. And then according to the estimated value of the parameter to be estimated, the correction of the observed value can be realized.
Second embodiment
Fig. 2 is a flowchart of a parameter estimation method according to a second embodiment of the present application. The present embodiment is an alternative provided on the basis of the above embodiment, taking the orientation of the obstacle as the parameter to be estimated as an example. Referring to fig. 2, the parameter estimation method provided by the embodiment of the present application includes:
s210, acquiring point cloud data of the obstacle and an observation value of the direction of the obstacle.
S220, projecting the point cloud data of the obstacle to a target plane, wherein the target plane is parallel to the ground.
S230, determining the minimum circumscribed polygon of the projection points in the target plane.
S240, determining the orientation of the minimum circumscribed polygon according to the coordinate vectors of the two vertexes of the longest side in the minimum circumscribed polygon.
Wherein the vertex coordinates are vectors having directions. The orientation of the minimum circumscribed polygon is equal to the difference of the coordinate vectors of the two vertexes.
Specifically, determining the orientation of the minimum circumscribing polygon according to the two vertex coordinate vectors of the longest side in the minimum circumscribing polygon includes:
and subtracting the coordinate vectors of the two vertexes of the longest side in the minimum circumscribed polygon to obtain the orientation of the minimum circumscribed polygon.
S250, determining the orientation deviation degree according to the observed value of the orientation of the minimum circumscribed polygon and the orientation of the obstacle.
Specifically, determining the degree of deviation of the orientation according to the observed value of the orientation of the minimum circumscribing polygon and the orientation of the obstacle includes:
and subtracting the observed value of the direction of the obstacle from the direction of the minimum circumscribed polygon to obtain the direction deviation degree.
And S260, determining an estimated value of the obstacle orientation according to the orientation deviation degree so as to correct the observed value of the obstacle orientation.
According to the technical scheme, the true orientation of the obstacle is determined from the projection of the point cloud data of the obstacle onto the target plane; the orientation deviation degree is then obtained by subtracting the determined true orientation of the obstacle from the observed value of the orientation of the obstacle.
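Steps S240 and S250 can be sketched as follows, assuming the minimum circumscribed polygon has already been computed (the helper names and the wrap-to-(-π, π] convention are illustrative):

```python
import math

def polygon_orientation(vertices):
    """Orientation (radians) of a polygon's longest edge.

    `vertices` are (x, y) points of the minimum circumscribed polygon, in
    order; the edge direction is the difference of its two endpoint
    coordinate vectors.
    """
    n = len(vertices)
    edges = [(vertices[i], vertices[(i + 1) % n]) for i in range(n)]
    (x1, y1), (x2, y2) = max(edges, key=lambda e: math.dist(e[0], e[1]))
    return math.atan2(y2 - y1, x2 - x1)

def orientation_deviation(polygon_heading, observed_heading):
    """Signed difference between the polygon heading and the observed
    heading, wrapped to (-pi, pi] to avoid 2*pi jumps."""
    d = polygon_heading - observed_heading
    return math.atan2(math.sin(d), math.cos(d))

# A 4 m x 2 m rectangle aligned with the x axis: the longest edge points
# along x, so the polygon heading is ~0 rad.
rect = [(0, 0), (4, 0), (4, 2), (0, 2)]
heading = polygon_orientation(rect)
dev = orientation_deviation(heading, 0.1)  # observation is 0.1 rad off
```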
Further, the determining the angular velocity deviation degree according to the point cloud data of the obstacle comprises the following steps:
acquiring at least two frames of projection images that are consecutive in time, wherein the at least two frames of projection images are obtained by projecting at least two frames of time-consecutive point cloud data onto a target plane, and the target plane is parallel to the ground;
Determining a first angle change quantity of the obstacle orientation associated with a first frame of projection image and a last frame of projection image in the at least two frames of projection images and a second angle change quantity of the obstacle orientation associated with two adjacent frames of projection images in the at least two frames of projection images;
and determining the angular velocity deviation degree according to the first angle change amount and the second angle change amount.
Specifically, the determining the angular velocity deviation degree according to the first angle variation amount and the second angle variation amount includes:
and calculating the angular velocity deviation degree by taking the second angular velocity, derived from the second angle change amount, as the variable and the first angular velocity, derived from the first angle change amount, as the mean.
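This variance computation can be sketched as follows, under the assumption of uniformly spaced frames (`dt` between consecutive projection images; the names are illustrative):

```python
def angular_velocity_deviation(headings, dt):
    """Angular-velocity deviation degree from a window of per-frame headings.

    The overall (first) angular velocity over the whole window serves as the
    mean, the frame-to-frame (second) angular velocities as the samples; the
    deviation degree is their variance around that mean.
    """
    n = len(headings)
    overall = (headings[-1] - headings[0]) / ((n - 1) * dt)  # first angular velocity
    per_frame = [(headings[i + 1] - headings[i]) / dt for i in range(n - 1)]
    return sum((w - overall) ** 2 for w in per_frame) / len(per_frame)

# Headings sampled at 0.1 s intervals: a mostly steady turn with one
# jittery frame inflates the deviation degree.
dev = angular_velocity_deviation([0.00, 0.10, 0.30, 0.30], dt=0.1)
```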
Based on the technical characteristics, the embodiment of the application estimates the parameter to be estimated according to the two dimensions of the orientation deviation degree and the angular velocity deviation degree, so that the estimation accuracy of the parameter to be estimated is further improved.
Third embodiment
Fig. 3 is a flowchart of a parameter estimation method according to a third embodiment of the present application. The present embodiment is an alternative provided by taking the parameter to be estimated as the center of the obstacle as an example on the basis of the above embodiment. Referring to fig. 3, the parameter estimation method provided by the embodiment of the present application includes:
S310, acquiring point cloud data of the obstacle and an observation value of the center of the obstacle.
S320, determining the relative position relationship between the vehicle and the obstacle according to the observed value of the center of the obstacle.
The own vehicle is the currently driving vehicle, that is, the vehicle equipped with the parameter estimation apparatus.
S330, determining the visible surface of the obstacle according to the determined relative position relation.
The visible face refers to a surface of the obstacle that can be seen from the perspective of the own vehicle, and specifically may be the left side surface, the right side surface, the front surface, the rear surface, or the top surface.
S340, extracting the point cloud data of the visible surface from the point cloud data of the obstacle.
S350, determining the probability that the visible face belongs to a plane according to the point cloud data of the visible face, and taking the probability as the center deviation degree.
Wherein a plane is a basic element of planar space and may be constituted by at least two lines.
S360, according to the center deviation, determining an estimated value of the center of the obstacle so as to correct the observed value of the center of the obstacle.
According to the embodiment of the application, the probability that the visible face belongs to a plane is determined according to the point cloud data of the visible face of the obstacle, and the probability is taken as the center deviation degree; the center of the obstacle is then estimated according to the center deviation degree of the obstacle, so as to correct the observed value of the center of the obstacle.
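Steps S320–S350 can be sketched as follows; the axis-aligned obstacle frame, the always-visible top face, the x-axis face normal, and the `scale` constant are simplifying assumptions for illustration:

```python
from statistics import pvariance

def visible_faces(ego_xy, obstacle_center_xy):
    """Faces of an obstacle visible from the ego vehicle.

    Assumes the obstacle box is axis-aligned with x pointing forward, and
    that the top is always visible to a roof-mounted sensor.
    """
    dx = ego_xy[0] - obstacle_center_xy[0]
    dy = ego_xy[1] - obstacle_center_xy[1]
    faces = ["top"]
    faces.append("front" if dx >= 0 else "rear")
    faces.append("left" if dy >= 0 else "right")
    return faces

def plane_probability(face_points, scale=0.05):
    """Heuristic probability that the extracted face points form a plane.

    Dispersion along the face normal (assumed here to be the x axis) is the
    proxy: small spread -> high probability. `scale` is a tuning constant.
    """
    spread = pvariance([p[0] for p in face_points])
    return scale / (scale + spread)

faces = visible_faces(ego_xy=(10.0, 3.0), obstacle_center_xy=(0.0, 0.0))
# A nearly flat face: tiny spread along the normal, probability near 1.
prob = plane_probability([(0.01, 0.0, 0.0), (-0.01, 1.0, 0.0),
                          (0.0, 0.0, 1.0), (0.0, 1.0, 1.0)])
```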
Further, after the visible face of the obstacle is determined according to the determined relative positional relationship, the method further includes:
if the obstacle is a vehicle, determining a mapping relation between the visible surface and a vehicle size model;
and calculating a vehicle center according to the mapping relation and the vehicle size model, and updating the observed value of the obstacle center by using the calculated vehicle center.
Based on the technical characteristics, the embodiment of the application calculates the vehicle center according to the mapping relation between the visible surface of the obstacle and the vehicle size model, and updates the observed value of the obstacle center by using the calculated vehicle center, thereby realizing correction of the observed value of the obstacle center.
Further, before the determining the mapping relationship between the visible surface and the vehicle size model, the method further includes:
establishing a vehicle size model according to the minimum vehicle size;
comparing the observed value of the vehicle size with the size of the vehicle size model;
and according to the comparison result, adjusting the size of the vehicle size model so that the size similarity between the vehicle size model and the vehicle is larger than the size similarity between the observed value of the vehicle size and the vehicle.
Specifically, adjusting the size of the vehicle size model according to the comparison result includes:
if the size of the vehicle size model is smaller than the observed value of the vehicle size, determining a size difference value between the observed value of the vehicle size and the vehicle size model according to the comparison result;
increasing the size of the vehicle size model according to the size difference multiplied by a set multiple, so that the size of the vehicle size model gradually approximates the size of the actual vehicle, wherein the set multiple is smaller than or equal to 1, so as to avoid the increased size of the vehicle size model exceeding the size of the actual vehicle;
if the size of the vehicle size model is larger than or equal to the observed value of the vehicle size, making no adjustment;
the size of the determined vehicle size model is taken as the current optimal size of the obstacle.
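The grow-only adjustment above can be sketched as follows; the minimum-size values and the step factor are hypothetical:

```python
MIN_VEHICLE_SIZE = (3.0, 1.5, 1.4)  # hypothetical minimum length/width/height (m)

def update_size_model(model, observed, step=0.5):
    """Grow the vehicle size model toward the observed size, never past it.

    Each dimension grows by `step` (<= 1) times the positive difference, so
    the model approaches the real size from below and cannot overshoot.
    """
    assert 0.0 < step <= 1.0
    return tuple(
        m + step * (o - m) if m < o else m  # only grow, never shrink
        for m, o in zip(model, observed)
    )

model = MIN_VEHICLE_SIZE
model = update_size_model(model, observed=(4.0, 1.9, 1.4))
```

Because the step factor is at most 1 and the model only grows while below the observation, repeated updates converge toward the true size from below without overshooting.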
Based on the technical characteristics, the embodiment of the application can also realize the optimal estimation of the vehicle size.
Fourth embodiment
The present embodiment is an alternative provided on the basis of the above embodiments, taking as an example the estimation of the obstacle parameters at the current moment, with the deviation degree being the noise variance. The parameter estimation method provided by the embodiment comprises the following steps:
referring to fig. 4, the estimation of the obstacle orientation at the present time includes:
(1) Acquiring an observation value of the direction of an obstacle at the current moment and point cloud data of the obstacle at the current moment;
(2) Projecting the point cloud data of the obstacle at the current moment onto a target plane parallel to the ground to obtain a current projection image; determining the minimum circumscribed polygon of the projection points in the current projection image; determining the longest edge of the minimum circumscribed polygon; subtracting the coordinate vectors of the two vertexes of the longest edge to obtain the orientation of the minimum circumscribed polygon (this orientation is relatively close to the true orientation of the obstacle); subtracting the orientation of the minimum circumscribed polygon from the observed value of the obstacle orientation at the current moment to obtain the orientation noise variance;
(3) Determining at least 2 consecutive frames of images comprising the current projection image and historical projection images; determining the angle change amount of the obstacle orientation between the first frame and the last frame of the at least 2 consecutive frames, and dividing that angle change amount by the elapsed time to obtain the first angular velocity; determining the angle change amount between each two adjacent frames of the at least 2 consecutive frames, and dividing each by the elapsed time to obtain the second angular velocities; taking the second angular velocities as the variable and the first angular velocity as the mean, calculating the angular velocity noise variance of the obstacle at the current moment;
(4) Based on the motion model, determining a predicted value of the obstacle orientation at the current moment according to the obstacle orientation information at historical moments, together with the set orientation noise variance and angular velocity noise variance of the predicted value of the obstacle orientation at the current moment;
(5) And inputting the observed value of the obstacle orientation at the current moment, the orientation noise variance of the observed value, the angular velocity noise variance of the observed value, the predicted value of the obstacle orientation at the current moment, the orientation noise variance of the predicted value and the angular velocity noise variance of the predicted value into a Kalman filter, and outputting the best estimated value of the obstacle orientation at the current moment.
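A simplified sketch of step (5): for illustration, the orientation and angular-velocity states are filtered independently, each with its own noise variances, whereas a full Kalman filter would track their joint covariance (all numbers are made up):

```python
def scalar_update(pred, pred_var, obs, obs_var):
    """One scalar Kalman update: blend prediction and observation."""
    gain = pred_var / (pred_var + obs_var)
    return pred + gain * (obs - pred), (1.0 - gain) * pred_var

def estimate_orientation_state(pred_state, pred_vars, obs_state, obs_vars):
    """Fuse predicted and observed (heading, angular velocity).

    Each state is filtered independently with the per-state noise variances
    computed above; this is a simplification of a joint Kalman filter.
    """
    return tuple(
        scalar_update(p, pv, o, ov)
        for p, pv, o, ov in zip(pred_state, pred_vars, obs_state, obs_vars)
    )

(heading, h_var), (omega, w_var) = estimate_orientation_state(
    pred_state=(0.50, 0.10), pred_vars=(0.01, 0.02),
    obs_state=(0.70, 0.30), obs_vars=(0.04, 0.02),
)
```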
Referring to fig. 5, taking the example where the obstacle is a vehicle, estimating the obstacle size and center at the current time includes:
(1) Obtaining an observed value of the obstacle size at the current moment and an observed value of the obstacle center at the current moment;
(2) Establishing a vehicle size model based on the minimum size of vehicles on the market;
(3) Determining the visible surface of the obstacle according to the relative positional relationship between the own vehicle and the obstacle, wherein the visible surface is a surface of the obstacle vehicle that can be seen from the perspective of the own vehicle, and specifically includes the left side surface, the right side surface, the front surface, the rear surface, and the top surface;
(4) Comparing the observed value of the obstacle size at the current moment with the size of the vehicle size model; if the size of the vehicle size model is smaller than the observed value, increasing the size of the vehicle size model according to the size difference multiplied by the set multiple, so that the size of the vehicle size model gradually approximates the actual size of the vehicle serving as the obstacle; if the size of the vehicle size model is larger than the observed value, making no adjustment; taking the determined size of the vehicle size model as the best estimated value of the obstacle size at the current moment;
(5) Aligning the visible face with the resized vehicle size model; based on the aligned vehicle size model, adjusting the observed value of the obstacle center at the current moment;
(6) Extracting the planar point cloud data of the visible surface from the point cloud data of the obstacle at the current moment; determining, based on the dispersion characteristics of the data points (point sets that deviate from a plane have large variance), the probability that the extracted planar point cloud data belongs to a plane, and taking the probability as the center noise variance of the observed value;
(7) Based on the motion model, determining a predicted value of the obstacle center at the current moment according to the center information of the obstacle at the historical moment and the set center noise variance of the predicted value of the obstacle center at the current moment;
(8) And inputting the observed value of the obstacle center at the current moment, the central noise variance of the observed value, the predicted value of the obstacle center at the current moment and the central noise variance of the predicted value into a Kalman filter, and outputting the best estimated value of the obstacle center at the current moment.
According to the technical scheme provided by the embodiment of the application, covariance modeling is performed on the orientation, size, and center of the obstacle by using the point cloud distribution information and the visibility information of the obstacle surfaces, and Kalman filtering is applied, so that more accurate orientation, size, and center information of the obstacle is obtained. The method can effectively improve the accuracy and stability of the obstacle orientation, size, and center. The technique has a good correcting effect on sudden orientation, size, and center errors in the time sequence, and effectively reduces takeover events caused by unstable orientation, size, and center in a real unmanned vehicle driving system.
Fifth embodiment
Fig. 6 is a schematic structural diagram of a parameter estimation device according to a fifth embodiment of the present application. The parameter estimation device 600 provided in this embodiment includes: a data acquisition module 601, a variance determination module 602, and a parameter estimation module 603.
The data acquisition module 601 is configured to acquire point cloud data of an obstacle and an observed value of a parameter to be estimated of the obstacle;
the variance determining module 602 is configured to determine a degree of deviation of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated;
the parameter estimation module 603 is configured to determine an estimated value of the parameter to be estimated according to the degree of deviation of the observed value, so as to correct the observed value of the parameter to be estimated.
According to the embodiment of the application, the deviation degree of the observed value is determined according to the point cloud data of the obstacle and the observed value of the parameter to be estimated, so that the optimal estimated value of the parameter to be estimated is determined according to the deviation degree of the observed value. Since the degree of deviation of the observed value reflects the degree of deviation of the observed value, introducing the noise deviation of the observed value into the estimation of the parameter to be estimated can improve the accuracy of the estimated value. And then according to the estimated value of the parameter to be estimated, the correction of the observed value can be realized.
Further, the variance determining module includes:
and the variance determining unit is used for determining the direction deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle direction if the parameter to be estimated is the direction of the obstacle.
Further, the variance determining unit includes:
a projection subunit, configured to project point cloud data of the obstacle to a target plane, where the target plane is parallel to the ground;
a polygon determination subunit, configured to determine a minimum circumscribing polygon of a projection point in the target plane;
the orientation determining subunit is used for determining the orientation of the minimum circumscribed polygon according to the coordinate vectors of the two vertexes of the longest side in the minimum circumscribed polygon;
and the variance determining subunit is used for determining the orientation deviation degree according to the observed value of the orientation of the minimum circumscribed polygon and the orientation of the obstacle.
Further, the parameter estimation module includes:
the parameter estimation unit is used for determining an estimated value of the parameter to be estimated according to the orientation deviation degree and the angular velocity deviation degree;
the angular velocity deviation is determined from the point cloud data of the obstacle.
Further, the parameter estimation unit includes:
the image acquisition subunit is used for acquiring at least two frames of projection images that are consecutive in time, wherein the at least two frames of projection images are obtained by projecting at least two frames of time-consecutive point cloud data onto a target plane, and the target plane is parallel to the ground;
The angle determining subunit is used for determining a first angle change quantity of the obstacle orientation associated with the first frame of projection image and the last frame of projection image in the at least two frames of projection images and a second angle change quantity of the obstacle orientation associated with the adjacent two frames of projection images in the at least two frames of projection images;
and the variance determining subunit is used for determining the angular speed deviation degree according to the first angle change amount and the second angle change amount.
Further, the variance determining subunit is specifically configured to:
and calculating the angular velocity deviation degree by taking the second angular velocity, derived from the second angle change amount, as the variable and the first angular velocity, derived from the first angle change amount, as the mean.
Further, the variance determining module includes:
and the variance determining unit is used for determining the center deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle center if the parameter to be estimated is the center of the obstacle.
Further, the variance determining unit includes:
a position determining subunit, configured to determine a relative positional relationship between the own vehicle and the obstacle according to an observed value of the center of the obstacle;
a visible face determination subunit configured to determine a visible face of the obstacle, the visible face being a surface of the obstacle that is seen based on a view angle of the own vehicle, according to the determined relative positional relationship;
A data extraction subunit, configured to extract point cloud data of the visible surface from point cloud data of the obstacle;
and the variance determining subunit is used for determining the probability that the visible face belongs to a plane according to the point cloud data of the visible face, and taking the probability as the center deviation degree.
Further, the apparatus further comprises:
the relation determining module is used for determining the mapping relation between the visible surface and the vehicle size model if the obstacle is a vehicle after determining the visible surface of the obstacle according to the determined relative position relation;
and the observation value updating module is used for calculating a vehicle center according to the mapping relation and the vehicle size model and updating the observation value of the obstacle center by using the calculated vehicle center.
Further, the apparatus further comprises:
the model building module is used for building a vehicle size model according to the minimum vehicle size before determining the mapping relation between the visible surface and the vehicle size model;
a dimension comparison module for comparing an observed value of a vehicle dimension with a dimension of the vehicle dimension model;
and the size adjustment module is used for adjusting the size of the vehicle size model according to the comparison result so that the size similarity between the vehicle size model and the vehicle is larger than the size similarity between the observed value of the vehicle size and the vehicle.
Further, the parameter estimation module includes:
the variance acquisition unit is used for acquiring the deviation degree of the predicted value of the parameter to be estimated;
a predicted value determining unit, configured to determine a predicted value of the parameter to be estimated according to the point cloud data of the obstacle and a degree of deviation of the predicted value of the parameter to be estimated;
and the estimated value determining unit is used for determining the estimated value of the parameter to be estimated according to the observed value of the parameter to be estimated, the deviation degree of the observed value, the predicted value of the parameter to be estimated and the deviation degree of the predicted value.
Further, the estimation value determining unit includes:
the credibility determining subunit is used for determining the observation credibility and the prediction credibility of the parameter to be estimated according to the deviation degree of the observation value and the deviation degree of the prediction value;
the weight determining subunit is used for determining the observation weight and the prediction weight of the parameter to be estimated according to the observation credibility and the prediction credibility;
and the estimation value determining subunit is used for determining the estimation value of the parameter to be estimated according to the observation weight, the prediction weight, the observation value and the prediction value of the parameter to be estimated.
Sixth embodiment
According to an embodiment of the present application, the present application also provides an electronic device and a readable storage medium.
As shown in fig. 7, there is a block diagram of an electronic device of a parameter estimation method according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the applications described and/or claimed herein.
As shown in fig. 7, the electronic device includes: one or more processors 701, memory 702, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executing within the electronic device, including instructions stored in or on memory to display graphical information of the GUI on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Also, multiple electronic devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 701 is illustrated in fig. 7.
Memory 702 is a non-transitory computer readable storage medium provided by the present application. Wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the parameter estimation method provided by the present application. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to execute the parameter estimation method provided by the present application.
The memory 702 is used as a non-transitory computer readable storage medium for storing a non-transitory software program, a non-transitory computer executable program, and modules, such as program instructions/modules (e.g., the data acquisition module 601, the variance determination module 602, and the parameter estimation module 603 shown in fig. 6) corresponding to the parameter estimation method in the embodiment of the present application. The processor 701 executes various functional applications of the server and data processing, i.e., implements the parameter estimation method in the above-described method embodiments, by running non-transitory software programs, instructions, and modules stored in the memory 702.
Memory 702 may include a storage program area that may store an operating system, at least one application program required for functionality, and a storage data area; the storage data area may store data created from the use of the parameter estimation electronic device, and the like. In addition, the memory 702 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, memory 702 may optionally include memory located remotely from processor 701, which may be connected to the parameter estimation electronics via a network. Examples of such networks include, but are not limited to, the internet, intranets, blockchain networks, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the parameter estimation method may further include: an input device 703 and an output device 704. The processor 701, the memory 702, the input device 703 and the output device 704 may be connected by a bus or otherwise, in fig. 7 by way of example.
The input device 703 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the parameter estimation electronic device, such as a touch screen, keypad, mouse, trackpad, touchpad, pointer stick, one or more mouse buttons, trackball, joystick, and like input devices. The output device 704 may include a display apparatus, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibration motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computing programs (also referred to as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include local area networks (LANs), wide area networks (WANs), the internet, and blockchain networks.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be appreciated that steps may be reordered, added, or deleted using the various forms of flow shown above. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, as long as the desired results of the technical solutions disclosed in the present application can be achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the present application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application should be included in the scope of the present application.

Claims (19)

1. A method of parameter estimation, comprising:
acquiring point cloud data of an obstacle and an observed value of a parameter to be estimated of the obstacle;
determining the deviation degree of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated;
and determining the estimated value of the parameter to be estimated according to the deviation degree of the observed value so as to correct the observed value of the parameter to be estimated.
2. The method according to claim 1, wherein the determining the degree of deviation of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated comprises:
and if the parameter to be estimated is the orientation of the obstacle, determining the orientation deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle orientation.
3. The method of claim 2, wherein determining the degree of deviation of the orientation from the point cloud data of the obstacle and the observed value of the obstacle orientation comprises:
projecting the point cloud data of the obstacle to a target plane, wherein the target plane is parallel to the ground;
determining a minimum circumscribed polygon of the projection points in the target plane;
determining the orientation of the minimum circumscribed polygon according to the coordinate vectors of the two vertices of the longest side of the minimum circumscribed polygon;
and determining the orientation deviation degree according to the orientation of the minimum circumscribed polygon and the observed value of the obstacle orientation.
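The orientation step of claim 3 can be illustrated with a short sketch. This is a hypothetical Python illustration, not the patented implementation: it drops the z-coordinate to project onto the ground plane, uses a convex hull as a stand-in for the minimum circumscribed polygon, and takes the heading of the longest hull edge from the coordinate vectors of its two vertices.

```python
import numpy as np

def _cross(o, a, b):
    # z-component of (a - o) x (b - o); > 0 means a left turn at o
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points_2d):
    """Monotone-chain convex hull of 2-D points, returned counter-clockwise."""
    pts = sorted(map(tuple, points_2d))
    if len(pts) <= 2:
        return np.asarray(pts, dtype=float)
    def build(seq):
        out = []
        for p in seq:
            while len(out) >= 2 and _cross(out[-2], out[-1], p) <= 0:
                out.pop()
            out.append(p)
        return out
    lower = build(pts)
    upper = build(reversed(pts))
    return np.asarray(lower[:-1] + upper[:-1], dtype=float)

def hull_orientation(points_3d):
    """Project to the ground plane, then return the heading (radians)
    of the longest hull edge, taken between its two vertices."""
    hull = convex_hull(np.asarray(points_3d, dtype=float)[:, :2])
    edges = np.roll(hull, -1, axis=0) - hull
    longest = edges[np.argmax(np.hypot(edges[:, 0], edges[:, 1]))]
    return float(np.arctan2(longest[1], longest[0]))
```

Because the longest-edge heading is only defined up to a half-turn, a consumer of this value would typically compare it to the observed orientation modulo pi.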
4. The method according to claim 2, wherein determining the estimated value of the parameter to be estimated based on the degree of deviation of the observed value to correct the observed value of the parameter to be estimated comprises:
and determining the estimated value of the parameter to be estimated according to the orientation deviation degree and the angular velocity deviation degree, wherein the angular velocity deviation degree is determined according to the point cloud data of the obstacle.
5. The method of claim 4, wherein determining the angular velocity deviation from point cloud data for the obstacle comprises:
acquiring at least two frames of projection images that are consecutive in time, wherein the at least two frames of projection images are obtained by projecting at least two frames of time-consecutive point cloud data onto a target plane, and the target plane is parallel to the ground;
determining a first angle change amount of the obstacle orientation between the first frame and the last frame of the at least two frames of projection images, and a second angle change amount of the obstacle orientation between each two adjacent frames of the at least two frames of projection images;
and determining the angular velocity deviation degree according to the first angle change amount and the second angle change amount.
6. The method of claim 5, wherein said determining the angular velocity deviation degree according to the first angle change amount and the second angle change amount comprises:
calculating the angular velocity deviation degree by taking the second angle change amount as a variable and the first angle change amount as the mean value.
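Claims 5 and 6 read as a variance-style computation over heading changes. The sketch below is one plausible interpretation, not the patented formula: the adjacent-frame angle changes are the samples, and the first-to-last angle change divided by the number of steps serves as their assumed mean.

```python
import math

def wrap_angle(a):
    """Map an angle difference into (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def angular_velocity_deviation(headings):
    """headings: obstacle orientations (radians) from time-consecutive
    projection images, ordered first frame to last frame."""
    n = len(headings)
    if n < 2:
        raise ValueError("need at least two frames")
    # second angle change amounts: between adjacent frames (the samples)
    step_changes = [wrap_angle(headings[i + 1] - headings[i]) for i in range(n - 1)]
    # first angle change amount: first-to-last change, per step (the mean)
    mean_change = wrap_angle(headings[-1] - headings[0]) / (n - 1)
    # spread of the samples around the assumed mean
    return sum((c - mean_change) ** 2 for c in step_changes) / (n - 1)
```

A steadily turning obstacle yields a deviation near zero, while jittery or inconsistent headings yield a large deviation, which is what the fusion in later claims needs.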
7. The method according to any one of claims 1-6, wherein said determining a degree of deviation of said observed value from the point cloud data of said obstacle and the observed value of said parameter to be estimated comprises:
and if the parameter to be estimated is the center of the obstacle, determining the center deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle center.
8. The method of claim 7, wherein determining the degree of center deviation from the point cloud data of the obstacle and the observed value of the obstacle center comprises:
determining the relative positional relationship between the own vehicle and the obstacle according to the observed value of the obstacle center;
determining a visible face of the obstacle according to the determined relative positional relationship, the visible face being a surface of the obstacle seen from the perspective of the own vehicle;
extracting the point cloud data of the visible face from the point cloud data of the obstacle;
and determining, according to the point cloud data of the visible face, the probability that the visible face is a true surface of the obstacle, and taking the probability as the center deviation degree.
9. The method of claim 8, wherein after determining the visible face of the obstacle based on the determined relative positional relationship, the method further comprises:
if the obstacle is a vehicle, determining a mapping relationship between the visible face and a vehicle size model;
and calculating a vehicle center according to the mapping relationship and the vehicle size model, and updating the observed value of the obstacle center with the calculated vehicle center.
10. The method of claim 9, wherein prior to said determining the mapping of the visible face to the vehicle size model, the method further comprises:
establishing a vehicle size model according to the minimum vehicle size;
comparing the observed value of the vehicle size with the size of the vehicle size model;
and adjusting the size of the vehicle size model according to the comparison result, so that the size similarity between the vehicle size model and the vehicle is greater than the size similarity between the observed value of the vehicle size and the vehicle.
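One way to realize claim 10, sketched here as an assumption rather than the patented rule, is to let the size model only grow from a minimum plausible vehicle size toward larger observations, since lidar tends to under-observe partially visible vehicles; the concrete minimum dimensions below are illustrative values not taken from the patent.

```python
# Assumed minimum vehicle size (length, width, height) in metres;
# the patent does not specify concrete values.
MIN_VEHICLE_SIZE = (3.5, 1.6, 1.4)

def update_size_model(model, observed_size):
    """Element-wise, keep the larger of the current model and the observation,
    never dropping below the minimum size. Under the assumption that raw
    observations under-estimate size, the result is at least as close to the
    true size as the raw observation."""
    return tuple(max(m, o, floor) for m, o, floor in
                 zip(model, observed_size, MIN_VEHICLE_SIZE))
```

With this monotone update, a single well-observed frame fixes the model, and later occluded frames cannot shrink it back down.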
11. The method according to any one of claims 1-6, wherein determining the estimated value of the parameter to be estimated based on the degree of deviation of the observed value comprises:
acquiring the deviation degree of the predicted value of the parameter to be estimated;
determining the predicted value of the parameter to be estimated according to the point cloud data of the obstacle and the deviation degree of the predicted value of the parameter to be estimated;
and determining the estimated value of the parameter to be estimated according to the observed value of the parameter to be estimated, the deviation degree of the observed value, the predicted value of the parameter to be estimated and the deviation degree of the predicted value.
12. The method of claim 11, wherein the determining the estimated value of the parameter to be estimated based on the observed value of the parameter to be estimated, the degree of deviation of the observed value, the predicted value of the parameter to be estimated, and the degree of deviation of the predicted value comprises:
determining the observation reliability and the prediction reliability of the parameter to be estimated according to the deviation degree of the observed value and the deviation degree of the predicted value;
determining the observation weight and the prediction weight of the parameter to be estimated according to the observation reliability and the prediction reliability;
and determining the estimated value of the parameter to be estimated according to the observation weight, the prediction weight, the observed value, and the predicted value of the parameter to be estimated.
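The fusion in claim 12 resembles inverse-variance weighting, as in a one-dimensional Kalman-style update. The following sketch assumes reliability is the reciprocal of the deviation degree; the patent does not commit to this particular mapping.

```python
def fuse_estimate(observed, obs_deviation, predicted, pred_deviation, eps=1e-9):
    """Blend observation and prediction: the smaller a value's deviation
    degree, the larger its weight in the final estimate."""
    obs_reliability = 1.0 / (obs_deviation + eps)    # observation reliability
    pred_reliability = 1.0 / (pred_deviation + eps)  # prediction reliability
    total = obs_reliability + pred_reliability
    obs_weight = obs_reliability / total    # weights are normalized,
    pred_weight = pred_reliability / total  # so they sum to one
    return obs_weight * observed + pred_weight * predicted
```

Equal deviation degrees give the midpoint of observation and prediction; a noisier observation pulls the estimate toward the prediction, which is exactly the corrective behavior claim 1 asks for.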
13. A parameter estimation apparatus, comprising:
the data acquisition module is used for acquiring point cloud data of the obstacle and an observed value of a parameter to be estimated of the obstacle;
the variance determining module is used for determining the deviation degree of the observed value according to the point cloud data of the obstacle and the observed value of the parameter to be estimated;
and the parameter estimation module is used for determining the estimated value of the parameter to be estimated according to the deviation degree of the observed value so as to correct the observed value of the parameter to be estimated.
14. The apparatus of claim 13, wherein the variance determination module comprises:
and the variance determining unit is used for determining the orientation deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle orientation if the parameter to be estimated is the orientation of the obstacle.
15. The apparatus according to claim 14, wherein the variance determining unit includes:
a projection subunit, configured to project point cloud data of the obstacle to a target plane, where the target plane is parallel to the ground;
a polygon determination subunit, configured to determine a minimum circumscribed polygon of the projection points in the target plane;
an orientation determining subunit, configured to determine the orientation of the minimum circumscribed polygon according to the coordinate vectors of the two vertices of the longest side of the minimum circumscribed polygon;
and a variance determining subunit, configured to determine the orientation deviation degree according to the orientation of the minimum circumscribed polygon and the observed value of the obstacle orientation.
16. The apparatus according to any one of claims 13-15, wherein the variance determining module comprises:
and the variance determining unit is used for determining the center deviation degree according to the point cloud data of the obstacle and the observed value of the obstacle center if the parameter to be estimated is the center of the obstacle.
17. The apparatus according to claim 16, wherein the variance determining unit includes:
a position determining subunit, configured to determine a relative positional relationship between the own vehicle and the obstacle according to an observed value of the center of the obstacle;
a visible face determination subunit, configured to determine a visible face of the obstacle according to the determined relative positional relationship, the visible face being a surface of the obstacle seen from the perspective of the own vehicle;
a data extraction subunit, configured to extract point cloud data of the visible surface from point cloud data of the obstacle;
and a variance determining subunit, configured to determine, according to the point cloud data of the visible face, the probability that the visible face is a true surface of the obstacle, and to take the probability as the center deviation degree.
18. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-12.
19. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1-12.
CN202010175358.3A 2020-03-13 2020-03-13 Parameter estimation method, device, equipment and medium Active CN111402333B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010175358.3A CN111402333B (en) 2020-03-13 2020-03-13 Parameter estimation method, device, equipment and medium


Publications (2)

Publication Number Publication Date
CN111402333A (en) 2020-07-10
CN111402333B (en) 2023-11-14

Family

ID=71434422

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010175358.3A Active CN111402333B (en) 2020-03-13 2020-03-13 Parameter estimation method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN111402333B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101754887A (en) * 2007-07-24 2010-06-23 日产自动车株式会社 Drive assistance apparatus for vehicle and vehicle equipped with the apparatus
US9052721B1 (en) * 2012-08-28 2015-06-09 Google Inc. Method for correcting alignment of vehicle mounted laser scans with an elevation map for obstacle detection
CN110222605A (en) * 2019-05-24 2019-09-10 深兰科技(上海)有限公司 A kind of obstacle detection method and equipment
CN110364049A (en) * 2019-07-17 2019-10-22 石虹 A kind of professional skill real training assisted teaching system and assistant teaching method with the correction control of irrelevance automatic feedback data closed loop
CN110705385A (en) * 2019-09-12 2020-01-17 深兰科技(上海)有限公司 Method, device, equipment and medium for detecting angle of obstacle

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7509241B2 (en) * 2001-07-06 2009-03-24 Sarnoff Corporation Method and apparatus for automatically generating a site model
EP3371679B1 (en) * 2016-08-30 2022-03-09 SZ DJI Technology Co., Ltd. Method and system for detecting obstructive object at projected locations within images
CN106951847B (en) * 2017-03-13 2020-09-29 百度在线网络技术(北京)有限公司 Obstacle detection method, apparatus, device and storage medium


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
严浙平 et al., "Prediction of motion parameters of underwater moving obstacles based on forward-looking sonar," Journal of Huazhong University of Science and Technology (Natural Science Edition), 2019, pp. 105-109. *
孙永丽, "Research on image-based automatic detection algorithms for railway obstacles," China Master's Theses Full-text Database, Engineering Science and Technology II, 2019, C033-112. *
杨琪, "Research on registration algorithms for heterologous images based on composite features," China Master's Theses Full-text Database, Information Science and Technology, 2018, I138-737. *
滕军, "Environment perception and motion planning for intelligent industrial robots," China Master's Theses Full-text Database, Information Science and Technology, 2020, I140-835. *
邹斌, 刘康, 王科未, "Dynamic obstacle detection and tracking method based on three-dimensional lidar," Automobile Technology, no. 8, pp. 1-3. *
阎岩, 唐振民, 刘家银, "Trajectory evaluation method for autonomous navigation based on uncertainty analysis," Robot, no. 2, pp. 194-199. *

Also Published As

Publication number Publication date
CN111402333A (en) 2020-07-10

Similar Documents

Publication Publication Date Title
CN111640140B (en) Target tracking method and device, electronic equipment and computer readable storage medium
CN110827325B (en) Target tracking method and device, electronic equipment and storage medium
CN110738183B (en) Road side camera obstacle detection method and device
US11514607B2 (en) 3-dimensional reconstruction method, 3-dimensional reconstruction device, and storage medium
CN112150558B (en) Obstacle three-dimensional position acquisition method and device for road side computing equipment
CN110929639A (en) Method, apparatus, device and medium for determining position of obstacle in image
CN111612753B (en) Three-dimensional object detection method and device, electronic equipment and readable storage medium
CN111402161B (en) Denoising method, device, equipment and storage medium for point cloud obstacle
CN111079079B (en) Data correction method, device, electronic equipment and computer readable storage medium
CN110595490B (en) Preprocessing method, device, equipment and medium for lane line perception data
US20210374439A1 (en) Obstacle detection method and device, apparatus, and storage medium
CN111612852A (en) Method and apparatus for verifying camera parameters
CN112561978A (en) Training method of depth estimation network, depth estimation method of image and equipment
CN111462179B (en) Three-dimensional object tracking method and device and electronic equipment
CN110675635A (en) Method and device for acquiring external parameters of camera, electronic equipment and storage medium
KR102568948B1 (en) Method and apparatus for determining velocity of obstacle, electronic device, storage medium and program
CN111191619A (en) Method, device and equipment for detecting virtual line segment of lane line and readable storage medium
KR102432561B1 (en) Edge-based three-dimensional tracking and registration method and apparatus for augmented reality, and electronic device
CN111260722B (en) Vehicle positioning method, device and storage medium
CN111369571B (en) Three-dimensional object pose accuracy judging method and device and electronic equipment
CN112528932A (en) Method and device for optimizing position information, road side equipment and cloud control platform
CN111402333B (en) Parameter estimation method, device, equipment and medium
CN116883460A (en) Visual perception positioning method and device, electronic equipment and storage medium
CN116772858A (en) Vehicle positioning method, device, positioning equipment and storage medium
CN111814634B (en) Real-time distance determining method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant