CN116222558A - Positioning method, device and system based on vehicle-mounted information - Google Patents

Positioning method, device and system based on vehicle-mounted information

Info

Publication number
CN116222558A
CN116222558A (Application No. CN202310355844.7A)
Authority
CN
China
Prior art keywords
vehicle
information
displacement
camera
pose
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310355844.7A
Other languages
Chinese (zh)
Inventor
杜绍宣
程志勇
赵勇光
文翊
杨仕会
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongfeng Motor Group Co Ltd
Original Assignee
Dongfeng Motor Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongfeng Motor Group Co Ltd filed Critical Dongfeng Motor Group Co Ltd
Priority to CN202310355844.7A priority Critical patent/CN116222558A/en
Publication of CN116222558A publication Critical patent/CN116222558A/en
Pending legal-status Critical Current

Classifications

    • G01C21/165 Navigation; dead reckoning by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments
    • G01C21/1652 Inertial navigation combined with non-inertial navigation instruments, with ranging devices, e.g. LIDAR or RADAR
    • G01C21/1656 Inertial navigation combined with non-inertial navigation instruments, with passive imaging devices, e.g. cameras
    • Y02T10/40 Engine management systems (climate change mitigation technologies related to road transport)

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a positioning method, device and system based on vehicle-mounted information, belonging to the technical field of vehicle detection and control. The method comprises the following steps: acquiring original image information from the vehicle's surround-view camera, calculating the camera displacement by comparing the rotation of feature points in the original image information, and then accumulating (integrating) the camera displacement over time to obtain the displacement of the camera in the environment and hence the pose information of the vehicle in the environment, which is taken as the vehicle displacement judged by vehicle vision; estimating the position and attitude of the vehicle from parameter signals on the vehicle to obtain the displacement information judged from the vehicle physical information; and fusing the vehicle displacement judged by vehicle vision with the displacement information judged from the vehicle physical information, and comparing the result with the initially set origin, thereby achieving accurate positioning of the vehicle. Aimed at the inherent defects of existing positioning schemes, namely low precision, high cost, or the need for an external base station, the invention provides a feasible, low-cost positioning solution whose precision meets the requirement.

Description

Positioning method, device and system based on vehicle-mounted information
Technical Field
The invention belongs to the technical field of vehicle detection and control, and particularly relates to a positioning method, device and system based on vehicle-mounted information.
Background
At present, vehicle positioning systems generally rely on civil GPS or the BeiDou navigation system, which provide position signals for vehicle navigation, but their positioning precision is typically at the meter level and cannot meet the high-precision positioning requirements of intelligent driving. Alternatively, a high-level intelligent driving system can rely on a high-precision map, but such maps carry high licensing costs and require high-performance chips, which increases the corresponding hardware, development and licensing costs. Another option is a differential GPS system, which requires setting up a stationary, power-hungry base station in the field. As a result, current vehicle systems, especially production vehicles, cannot readily adopt higher-precision positioning, and the high-precision positioning demands of intelligent driving remain unmet.
Disclosure of Invention
Aiming at the defects or improvement demands of the prior art, the invention provides a positioning method, device and system based on vehicle-mounted information, offering a feasible, low-cost positioning solution whose precision meets the requirement, in view of the inherent defects of existing positioning schemes: low precision, high cost, or the need for an external base station.
In order to achieve the above object, according to one aspect of the present invention, there is provided a positioning method based on vehicle-mounted information, comprising:
acquiring original image information from the vehicle's surround-view camera, calculating the camera displacement by comparing the rotation of feature points in the original image information, and then accumulating the camera displacement per unit time to obtain the displacement of the camera in the environment and hence the pose information of the vehicle in the environment, which is taken as the vehicle displacement judged by vehicle vision;
estimating the position and attitude of the vehicle from parameter signals on the vehicle to obtain the displacement information judged from the vehicle physical information;
fusing the vehicle displacement judged by vehicle vision with the displacement information judged from the vehicle physical information, and comparing the result with the initially set origin, thereby achieving accurate positioning of the vehicle.
In some optional embodiments, calculating the camera displacement by comparing the rotation of the feature points in the original image information, and then accumulating the camera displacement per unit time to obtain the displacement of the camera in the environment and hence the pose information of the vehicle in the environment, includes:
calculating, from the pixel coordinates of the feature points in the original image information and using the law of cosines, the three-dimensional positions in the camera coordinate system of the environment points matched with the feature points;
obtaining the pose transformation of the same matched point from its three-dimensional positions in the camera coordinate systems of two consecutive frames, and obtaining the pose transformation of the camera from the conversion relation between the camera coordinate system and the world coordinate system;
obtaining, from the installation position of the camera on the vehicle and the pose transformation of the camera, the pose information of the vehicle in the environment, and comparing it with the initially set origin to obtain the vehicle displacement judged by vehicle vision.
In some optional embodiments, calculating, from the pixel coordinates of the feature points in the original image information and using the law of cosines, the three-dimensional positions in the camera coordinate system of the environment points matched with the feature points includes:
letting a, b, c be the positions of the feature points in the original image information in the image pixel coordinate system, A, B, C be the points in the world coordinate system matched with a, b, c, and O be the center point of the camera coordinate system;
denoting x = OA/OC and y = OB/OC, with u the proportional coefficient of BC² to AB² and ω the proportional coefficient of AC² to AB², and solving
(1-u)y² - ux² - cos(b,c)·y + 2uxy·cos(a,b) + 1 = 0
(1-ω)x² - ωy² - cos(a,c)·x + 2ωxy·cos(a,b) + 1 = 0
by Wu's elimination method to obtain the 3D coordinates of A, B and C in the camera coordinate system.
In some optional embodiments, estimating the position and attitude of the vehicle from parameter signals on the vehicle to obtain the displacement information judged from the vehicle physical information includes:
obtaining the driving distance increment ΔS and the heading angle increment Δθ from the number of pulses per wheel revolution, the wheel radius, the rear wheel track, the pulse increment of the left rear wheel relative to the previous sampling period and the pulse increment of the right rear wheel relative to the previous sampling period;
obtaining the pose of the vehicle relative to the starting position from the driving distance increment ΔS and the heading angle increment Δθ.
In some optional embodiments, the driving distance increment is obtained as ΔS = (2πR·K_n/n)·(δRL + δRR)/2 and the heading angle increment as Δθ = (2πR·K_n/n)·(δRR - δRL)/(w_b·K_wb), where n is the number of pulses per wheel revolution, R is the wheel radius, w_b is the rear wheel track, K_n and K_wb are the wheel radius adjustment coefficient and the rear wheel track adjustment coefficient respectively, δRL represents the pulse increment of the left rear wheel relative to the previous sampling period, and δRR represents the pulse increment of the right rear wheel relative to the previous sampling period.
In some optional embodiments, the pose of the vehicle relative to the starting position is obtained from x_i = x_{i-1} + ΔS_i·sin(ΣΔθ_i), y_i = y_{i-1} + ΔS_i·cos(ΣΔθ_i) and heading_i = ΣΔθ_i, or heading_i = ∫yawrate·dt, where x_{i-1} and y_{i-1} are the abscissa and ordinate of the vehicle at the previous moment, x_i and y_i are the abscissa and ordinate of the vehicle at the current moment, ΔS_i is the driving distance increment at the current moment, Δθ_i is the heading angle increment at the current moment, heading_i is the heading angle of the vehicle at the current moment, and yawrate is the real-time yaw rate.
In some optional embodiments, before calculating the vehicle displacement judged by vehicle vision and the displacement information judged from the vehicle physical information, the method further comprises:
setting the vehicle initial position, correcting the initial position with GPS information, and time-synchronizing the vehicle-mounted information.
According to another aspect of the present invention, there is provided a vehicle-mounted information-based positioning apparatus including:
the vehicle vision pose determining module, used for acquiring original image information from the vehicle's surround-view camera, calculating the camera displacement by comparing the rotation of feature points in the original image information, and then accumulating the camera displacement per unit time to obtain the displacement of the camera in the environment and hence the pose information of the vehicle in the environment, which is taken as the vehicle displacement judged by vehicle vision;
the vehicle physical pose determining module, used for estimating the position and attitude of the vehicle from parameter signals on the vehicle to obtain the displacement information judged from the vehicle physical information;
and the fusion module, used for fusing the vehicle displacement judged by vehicle vision with the displacement information judged from the vehicle physical information, and comparing the result with the initially set origin to achieve accurate positioning of the vehicle.
According to another aspect of the present invention, there is provided a positioning system based on vehicle-mounted information, comprising: a surround-view camera, four wheel-speed sensors, a wheel pulse counter, an inertial measurement unit and a central controller;
the surround-view camera is used for acquiring original image information of the vehicle;
the wheel-speed sensors are used for acquiring the wheel speed information of the wheels;
the wheel pulse counter is used for counting the gear-tooth pulses of the wheel-speed sensors and converting the pulse signals into distance information;
the inertial measurement unit is used for providing the lateral acceleration, longitudinal acceleration and yaw rate information of the vehicle;
the central controller is used for implementing the steps of the above method.
According to another aspect of the present invention there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of any of the methods described above.
In general, the above technical solutions conceived by the present invention, compared with the prior art, enable the following beneficial effects to be obtained:
the invention adopts a special double-frame positioning design, adopts a mode of manually defining an initial origin, adopts the original image information of a vehicle looking around camera respectively, calculates the displacement of the camera by comparing the rotation of characteristic points in a visual image, and then integrates the displacement of the camera in unit time to obtain the displacement of the camera in the environment. And (3) merging the vehicle displacement judged by the vehicle vision and the displacement information judged by the vehicle physical information, and comparing the vehicle displacement judged by the vehicle vision with the original point set manually at the beginning to accurately position the vehicle in a certain scene range. The inherent limitations are overcome to some extent. Based on the current product configuration, the vehicle model with the function can be adopted, the function popularization can be realized, and the vehicle model has the capability of judging the pose and the angle of the vehicle.
Drawings
FIG. 1 is a schematic diagram of the hardware architecture used by the positioning method based on vehicle-mounted information according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a positioning method based on vehicle-mounted information according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a positioning method based on vehicle-mounted information according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of feature point matching according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a matching set according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a method for determining displacement information of vehicle physical information according to an embodiment of the present invention;
FIG. 7 is a flowchart of a positioning method based on vehicle-mounted information according to an embodiment of the present invention;
FIG. 8 is a functional diagram of a positioning method based on vehicle-mounted information provided by an embodiment of the invention;
fig. 9 is a schematic diagram of a positioning device based on vehicle-mounted information according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. In addition, the technical features of the embodiments of the present invention described below may be combined with each other as long as they do not collide with each other.
The invention meets the measurement precision requirement using a subset of the existing sensors, whose signals are routed into the same controller, which makes the vehicle-mounted-information-based positioning method convenient to implement.
The invention also benefits from the ultrasonic probes of the reversing radar system, which are largely unaffected by night-time environments and can compensate for the inherent limitations of the camera to a certain extent, so that the positioning precision of the advanced intelligent driving assistance system can be improved in some scenarios and overall performance is enhanced.
To this end, the embodiment of the invention provides a high-precision positioning scheme for certain scenarios at zero or extremely low cost. As shown in fig. 1, the scheme is built from the four wheel-speed sensors, the ESC controller, a wheel pulse counter, the EPS controller, an inertial measurement unit (IMU), and a central controller that hosts part of the algorithm:
the wheel-speed sensor is used to infer the wheel speed of the corresponding wheel; it typically carries 48 pairs of magnetic poles (96 or 192 pairs on higher-end versions), so each wheel revolution produces 96 gear-tooth pulses (192 or 384 on higher-end versions), and the travel distance of one wheel revolution can therefore be divided into 96 (or 192 or 384) equal parts, allowing the displacement of the vehicle to be known accurately (a short numeric example follows the component list);
the wheel pulse counter is used for counting the gear-tooth pulses of the wheel-speed sensors and converting the pulse signals into distance information;
the ESC controller provides basic functions such as vehicle stabilization and anti-lock braking and executes the corresponding braking requests of the parking system to control the longitudinal movement of the vehicle; in the present invention it additionally aggregates the wheel-speed sensor information and outputs optimized, stable vehicle speed information;
the central controller stores and runs the algorithm modules, fuses and receives the information from the other sensors on the vehicle, and stores the algorithm programs of the other normal functions;
the EPS controller executes the corresponding steering control requests in the vehicle system;
the inertial measurement unit provides the lateral acceleration, longitudinal acceleration and yaw rate of the vehicle;
TCU gear: in the present invention, gear information is provided to the central controller to determine the forward or reverse direction of the vehicle for use in trajectory estimation.
The invention realizes a positioning method based on vehicle-mounted information through the above functional components. As shown in fig. 2, it provides high-precision positioning for automatic driving from semantic information and the vehicle's own information, and mainly comprises the following aspects: original image information around the vehicle is collected in real time by the surround-view camera, feature points or key semantic information in the environment are extracted from the camera images, the position of the camera relative to the feature points is obtained by comparing the rotation and displacement of the feature points or semantic information, and after accumulation and integration the vehicle positioning information is output; vehicle sensors (wheel-speed sensors, wheel encoders, gyroscope, steering wheel angle information, etc.) provide coarse positioning for the vehicle; the vehicle displacement judged by vehicle vision and the vehicle displacement judged from the vehicle physical information are fused and filtered, and the positioning information is finally output.
In an embodiment of the present invention, a fusion filtering method is provided as follows:
the vehicle displacement judged by vehicle vision forms position 1 of the vehicle in the space map, and the vehicle displacement judged from the vehicle physical information forms position 2 of the vehicle in the space map; the two are fused and filtered, mainly by rejecting a position whose jump from the previous estimate is too large and averaging the positions when the jump stays within a normal, reasonable threshold.
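A minimal sketch of this reject-then-average rule, with hypothetical function names and an illustrative jump threshold (the patent does not fix concrete values):

```python
import math

def fuse_positions(p_visual, p_odometry, p_prev, jump_threshold_m=0.5):
    """Fuse the vision-derived position with the wheel-odometry position.

    p_visual, p_odometry, p_prev are (x, y) tuples in the map frame.
    A source that jumps too far from the previous fused estimate is rejected;
    the remaining sources are averaged.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    candidates = [p for p in (p_visual, p_odometry)
                  if dist(p, p_prev) <= jump_threshold_m]

    if not candidates:                 # both sources jumped: keep the previous estimate
        return p_prev
    x = sum(p[0] for p in candidates) / len(candidates)
    y = sum(p[1] for p in candidates) / len(candidates)
    return (x, y)

# example call with assumed coordinates in metres
fused = fuse_positions((1.02, 0.48), (0.98, 0.52), (1.00, 0.50))
```

A Kalman or complementary filter could replace the plain average, but the outlier-rejection-plus-averaging rule above is all the text itself commits to.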
Fig. 3 is a schematic flow chart of a positioning method based on vehicle-mounted information according to an embodiment of the present invention, where the method shown in fig. 3 includes the following steps:
s1: acquiring original image information from the vehicle's surround-view camera, calculating the camera displacement by comparing the rotation of feature points in the original image information, and then accumulating the camera displacement per unit time to obtain the displacement of the camera in the environment and hence the pose information of the vehicle in the environment, which is taken as the vehicle displacement judged by vehicle vision;
s2: estimating the position and attitude of the vehicle from parameter signals on the vehicle to obtain the displacement information judged from the vehicle physical information;
s3: fusing the vehicle displacement judged by vehicle vision with the displacement information judged from the vehicle physical information, and comparing the result with the initially set origin, thereby achieving accurate positioning of the vehicle.
In the embodiment of the invention, step S1 relies on a visual positioning algorithm based on the surround-view camera: feature points are matched across frames, their motion yields the rotation and translation of the camera, and integration and accumulation then yield the continuous trajectory of the vehicle. Specifically, this can be realized by the following steps:
s1.1: calculating, from the pixel coordinates of the feature points in the original image information and using the law of cosines, the three-dimensional positions in the camera coordinate system of the environment points matched with the feature points;
s1.2: obtaining the pose transformation of the same matched point from its three-dimensional positions in the camera coordinate systems of two consecutive frames, and obtaining the pose transformation of the camera from the conversion relation between the camera coordinate system and the world coordinate system;
s1.3: obtaining, from the installation position of the camera on the vehicle and the pose transformation of the camera, the pose information of the vehicle in the environment, and comparing it with the initially set origin to obtain the vehicle displacement judged by vehicle vision.
As shown in fig. 4, the three points A, B and C are feature point positions in the world coordinate system (the absolute coordinate system of the real world), a, b and c are the positions of the corresponding points in the image pixel coordinate system, and the three-dimensional positions of A, B and C in the camera coordinate system are calculated from the known pixel coordinates using the law of cosines; the center point of the camera coordinate system is denoted O.
Denote x = OA/OC and y = OB/OC; x represents the ratio of edge OA to edge OC of the projection model, and the other quantities are defined similarly: u denotes the proportional coefficient of BC² to AB², and ω denotes the proportional coefficient of AC² to AB². From the law of cosines it can be deduced that:
(1-u)y² - ux² - cos(b,c)·y + 2uxy·cos(a,b) + 1 = 0
(1-ω)x² - ωy² - cos(a,c)·x + 2ωxy·cos(a,b) + 1 = 0
where x and y are the unknowns. Solving this system by Wu's elimination method yields the 3D coordinates of A, B and C in the camera coordinate system. As shown in fig. 5, P_i denotes the set of matched feature points (A, B and C belong to P_i) and u_i denotes the set of their projection points (a, b and c belong to u_i). Because the three-dimensional positions of the same P_i in the camera coordinate systems of two consecutive frames are known, the pose transformation of P_i, and hence the pose transformation of the camera, can be computed. Since the installation position of the camera on the vehicle is known, the visually calculated displacement and attitude of the vehicle are then obtained by coordinate conversion; the real-time vehicle position derived from P_i is compared with the manually set origin to obtain the visual position information of the vehicle, which is then fused with the positioning information from the vehicle physical displacement.
In the embodiment of the present invention, as shown in fig. 6, step S2 may be specifically implemented by:
s2.1: obtaining the driving distance increment ΔS and the heading angle increment Δθ from the number of pulses per wheel revolution, the wheel radius, the rear wheel track, the pulse increment of the left rear wheel relative to the previous sampling period and the pulse increment of the right rear wheel relative to the previous sampling period;
s2.2: obtaining the pose of the vehicle relative to the starting position from the driving distance increment ΔS and the heading angle increment Δθ.
In the embodiment of the present invention, as shown in fig. 7, the initial position of the vehicle first needs to be set, and it can be marked and corrected with reference to the GPS signal; the driver completes the setting of the initial vehicle position by pressing a specific interaction button. During driving, the vehicle pose parameters are calculated from the pulses of the left and right rear wheels. According to the Ackermann steering principle, the instantaneous velocity centers of the two rear-axle wheels coincide, so the displacement and angular velocity of the rear-axle center can be calculated from the travel distances or speeds of the two rear wheels. Using calculus, the whole process is divided into many small intervals whose length equals the measurement time interval; accumulating these intervals gives the heading angle change at each moment and the (x, y) coordinate change in the vehicle body coordinate system, and then, starting from the assumed pose coordinates (0, 0) at the initial moment, the pose coordinates (x, y, θ) of the vehicle at each moment are obtained. The values RL and RR can be read from the vehicle CAN bus and represent the pulse counts of the pulse counters of the left and right rear wheels, respectively.
At each sampling point, the pulse increments δRL and δRR of the pulse counters relative to the previous sampling period can be calculated. The time synchronization controller has no dedicated physical hardware and is generally integrated as software in one of the controllers; it needs to time-synchronize the pulse signals.
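The synchronization scheme itself is not detailed in the text; one minimal approach, shown purely as a sketch under that assumption, is to pick for each camera frame (or other reference timestamp) the wheel-pulse sample closest in time:

```python
import bisect

def nearest_sample(ref_ts, signal_ts, signal_vals):
    """Return the sensor value whose timestamp is closest to ref_ts.

    ref_ts      : reference timestamp, e.g. of a camera frame [s]
    signal_ts   : sorted list of sensor timestamps [s]
    signal_vals : sensor values aligned with signal_ts
    """
    i = bisect.bisect_left(signal_ts, ref_ts)
    if i == 0:
        return signal_vals[0]
    if i == len(signal_ts):
        return signal_vals[-1]
    before, after = signal_ts[i - 1], signal_ts[i]
    return signal_vals[i] if after - ref_ts < ref_ts - before else signal_vals[i - 1]
```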
In the embodiment of the invention, the driving distance increment ΔS is obtained from the pulse increments as
ΔS = (2πR·K_n/n)·(δRL + δRR)/2,
and the heading angle increment is then obtained as
Δθ = (2πR·K_n/n)·(δRR - δRL)/(w_b·K_wb),
where n is the number of pulses per wheel revolution, R is the wheel radius, w_b is the rear wheel track, and K_n and K_wb are the wheel radius adjustment coefficient and the rear wheel track adjustment coefficient, respectively.
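A short sketch of these two increments, using the standard differential-odometry form written above (the default parameter values are illustrative assumptions, not values from the patent):

```python
import math

def wheel_increments(d_rl, d_rr, n=96, R=0.35, w_b=1.6, K_n=1.0, K_wb=1.0):
    """Driving-distance increment dS [m] and heading increment dTheta [rad]
    from the rear-wheel pulse increments of one sampling period.

    d_rl, d_rr : pulse increments of the left / right rear wheel
    n          : pulses per wheel revolution
    R, w_b     : wheel radius and rear wheel track [m]
    K_n, K_wb  : wheel radius / rear track adjustment coefficients
    """
    dist_per_pulse = 2.0 * math.pi * R * K_n / n
    d_s = dist_per_pulse * (d_rl + d_rr) / 2.0              # travel of the rear-axle centre
    d_theta = dist_per_pulse * (d_rr - d_rl) / (w_b * K_wb) # heading change
    return d_s, d_theta
```

With δRL = δRR the heading increment vanishes and ΔS reduces to straight-line travel of the rear-axle centre, consistent with the Ackermann argument above.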
From this, the pose of the vehicle relative to the starting position can be deduced:
x_i = x_{i-1} + ΔS_i·sin(ΣΔθ_i)
y_i = y_{i-1} + ΔS_i·cos(ΣΔθ_i)
heading_i = ΣΔθ_i
the course angle of the vehicle body can also be obtained by integrating the yaw rate data of the IMU of the vehicle body:
heading_i = ∫ yawrate·dt
wherein yawrate is the real-time yaw rate.
Thus, when resolving the position of the vehicle, the (x, y) solution can also be computed with the heading obtained by integrating the yaw rate, namely:
x_i = x_{i-1} + ΔS_i·sin(heading_i)
y_i = y_{i-1} + ΔS_i·cos(heading_i)
finally, the vehicle position and course angle are obtained as the representation of the actual condition of the vehicle.
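Combining the increments into the recursion above gives a simple dead-reckoning loop; the sketch below assumes the sin/cos convention used in the formulas and purely illustrative names:

```python
import math

def dead_reckon(increments, x0=0.0, y0=0.0, heading0=0.0):
    """Accumulate (dS, dTheta) samples into poses (x, y, heading).

    increments: iterable of (d_s, d_theta) per sampling period, following the
    convention above: x grows with sin(heading), y with cos(heading).
    """
    x, y, heading = x0, y0, heading0
    poses = []
    for d_s, d_theta in increments:
        heading += d_theta              # or: heading += yaw_rate * dt from the IMU
        x += d_s * math.sin(heading)
        y += d_s * math.cos(heading)
        poses.append((x, y, heading))
    return poses

# example: three sampling periods of assumed increments
trajectory = dead_reckon([(0.10, 0.00), (0.10, 0.01), (0.10, 0.01)])
```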
As shown in fig. 8, the specific steps are as follows:
1. the vehicle state information is judged from the vehicle speed or GPS signal to determine whether the function is in the standby or on state; the vehicle starting point is then set: the driver manually presses a specific interaction button or operates the remote key to complete the setting of the initial vehicle position, the current vehicle position (or the current key position) is set as the positioning origin, the system takes the current point as point 0 after receiving the request signal, and the vehicle mileage and angle information at that moment are set to 0;
2. the positioning is corrected by marking the initial position with the navigation GPS information, i.e. the GPS information of the vehicle starting position is recorded, and if mileage drift occurs during subsequent driving (e.g. under a wheel-slip condition), a re-correction is performed with reference to the navigation information;
3. the time synchronization module time-synchronizes the pulse signals and the signals sent by the various sensors on the vehicle, ensuring that subsequent operations are performed in a unified time dimension;
4. the semantic identifications acquired in real time by the surround-view camera are accurately matched against the semantic map: the camera acquires feature points or key semantic information in the environment, the position of the camera relative to the feature points is obtained by comparing the rotation and displacement of the feature points or semantic information, and finally integral accumulation is performed (the camera displacement per unit time is accumulated into a total displacement), so that the vehicle positioning information is output;
5. the distance increment S (i.e. its lateral and longitudinal components) and the heading angle increment θ are calculated;
6. combining the initial vehicle position with the distance increment S and the heading angle increment θ, the position of the vehicle relative to the initial position and its relative heading angle are calculated;
7. the positioning information is fused to output the final positioning position of the vehicle.
Through the technical scheme, the following beneficial effects can be achieved:
safety benefit: the invention can be used for positioning based on the information in the vehicle under various environmental conditions, especially under the condition of no high-precision map system, and ensures that part of driving functions are safe and available.
Economic benefit: under the condition that other equipment is not newly added, the design of the accurate positioning function of the vehicle in part of scenes is completed, and the cost is saved.
Potential customer benefit: the scheme of the invention can be popularized to each vehicle type, and a new function selling point (function release) is provided for the enterprise commodity. The safety of the customers is guaranteed.
Enterprise benefit: the invention is simple and practical, is suitable for all vehicle types, can interact with signals of various vehicle types and operates in a modularized manner.
Fig. 9 is a schematic diagram of a positioning device based on vehicle information according to an embodiment of the present invention, including:
the vehicle vision pose determining module 901 is configured to obtain original image information of a vehicle looking around camera, calculate a camera displacement by comparing rotation of feature points in the original image information, and then integrate the displacement of the camera in unit time to obtain the displacement of the camera in the environment, obtain pose information of the vehicle in the environment, and use the pose information as vehicle displacement for vehicle vision judgment;
the vehicle physical pose determining module 902 is configured to perform positioning of a vehicle position and a pose by using a parameter signal on a vehicle, so as to obtain displacement information determined by vehicle physical information;
the fusion module 903 is configured to fuse the vehicle displacement determined by the vehicle vision and the displacement information determined by the vehicle physical information, and compare the vehicle displacement with an origin point set at the beginning, so as to achieve accurate positioning of the vehicle.
The specific implementation of each module may refer to the description of the method embodiment, and the embodiment of the present invention will not be repeated.
It should be noted that each step/component described in the present application may be split into more steps/components, or two or more steps/components or part of the operations of the steps/components may be combined into new steps/components, as needed for implementation, to achieve the object of the present invention.
It will be readily appreciated by those skilled in the art that the foregoing description is merely a preferred embodiment of the invention and is not intended to limit the invention, but any modifications, equivalents, improvements or alternatives falling within the spirit and principles of the invention are intended to be included within the scope of the invention.

Claims (10)

1. A positioning method based on vehicle-mounted information, characterized by comprising the following steps:
acquiring original image information from the vehicle's surround-view camera, calculating the camera displacement by comparing the rotation of feature points in the original image information, and then accumulating the camera displacement per unit time to obtain the displacement of the camera in the environment and hence the pose information of the vehicle in the environment, which is taken as the vehicle displacement judged by vehicle vision;
estimating the position and attitude of the vehicle from parameter signals on the vehicle to obtain the displacement information judged from the vehicle physical information;
fusing the vehicle displacement judged by vehicle vision with the displacement information judged from the vehicle physical information, and comparing the result with the initially set origin, thereby achieving accurate positioning of the vehicle.
2. The method according to claim 1, wherein calculating the camera displacement by comparing the rotation of the feature points in the original image information, and then accumulating the camera displacement per unit time to obtain the displacement of the camera in the environment and hence the pose information of the vehicle in the environment, comprises:
calculating, from the pixel coordinates of the feature points in the original image information and using the law of cosines, the three-dimensional positions in the camera coordinate system of the environment points matched with the feature points;
obtaining the pose transformation of the same matched point from its three-dimensional positions in the camera coordinate systems of two consecutive frames, and obtaining the pose transformation of the camera from the conversion relation between the camera coordinate system and the world coordinate system;
obtaining, from the installation position of the camera on the vehicle and the pose transformation of the camera, the pose information of the vehicle in the environment, and comparing it with the initially set origin to obtain the vehicle displacement judged by vehicle vision.
3. The method according to claim 2, wherein calculating, from the pixel coordinates of the feature points in the original image information and using the law of cosines, the three-dimensional positions in the camera coordinate system of the environment points matched with the feature points comprises:
letting a, b, c be the positions of the feature points in the original image information in the image pixel coordinate system, A, B, C be the points in the world coordinate system matched with a, b, c, and O be the center point of the camera coordinate system;
denoting x = OA/OC and y = OB/OC, with u the proportional coefficient of BC² to AB² and ω the proportional coefficient of AC² to AB², and solving
(1-u)y² - ux² - cos(b,c)·y + 2uxy·cos(a,b) + 1 = 0
(1-ω)x² - ωy² - cos(a,c)·x + 2ωxy·cos(a,b) + 1 = 0
by Wu's elimination method to obtain the 3D coordinates of A, B and C in the camera coordinate system.
4. The method according to any one of claims 1 to 3, wherein estimating the position and attitude of the vehicle from parameter signals on the vehicle to obtain the displacement information judged from the vehicle physical information comprises:
obtaining the driving distance increment ΔS and the heading angle increment Δθ from the number of pulses per wheel revolution, the wheel radius, the rear wheel track, the pulse increment of the left rear wheel relative to the previous sampling period and the pulse increment of the right rear wheel relative to the previous sampling period;
obtaining the pose of the vehicle relative to the starting position from the driving distance increment ΔS and the heading angle increment Δθ.
5. The method according to claim 4, characterized in that the driving distance increment is obtained as ΔS = (2πR·K_n/n)·(δRL + δRR)/2 and the heading angle increment as Δθ = (2πR·K_n/n)·(δRR - δRL)/(w_b·K_wb), where n is the number of pulses per wheel revolution, R is the wheel radius, w_b is the rear wheel track, K_n and K_wb are the wheel radius adjustment coefficient and the rear wheel track adjustment coefficient respectively, δRL represents the pulse increment of the left rear wheel relative to the previous sampling period, and δRR represents the pulse increment of the right rear wheel relative to the previous sampling period.
6. The method according to claim 5, characterized in that the pose of the vehicle relative to the starting position is obtained from x_i = x_{i-1} + ΔS_i·sin(ΣΔθ_i), y_i = y_{i-1} + ΔS_i·cos(ΣΔθ_i) and heading_i = ΣΔθ_i, or heading_i = ∫yawrate·dt, where x_{i-1} is the abscissa of the vehicle at the previous moment, x_i is the abscissa of the vehicle at the current moment, y_{i-1} is the ordinate of the vehicle at the previous moment, y_i is the ordinate of the vehicle at the current moment, ΔS_i is the driving distance increment at the current moment, Δθ_i is the heading angle increment at the current moment, heading_i is the heading angle of the vehicle at the current moment, and yawrate is the real-time yaw rate.
7. The method of claim 1, wherein prior to calculating the vehicle displacement for the vehicle visual determination and the displacement information for the vehicle physical information determination, the method further comprises:
the vehicle initial position is set, the initial position is corrected by combining GPS information, and the vehicle-mounted information is time-synchronized.
8. A positioning device based on vehicle-mounted information, characterized by comprising:
the vehicle vision pose determining module, configured to acquire original image information from the vehicle's surround-view camera, calculate the camera displacement by comparing the rotation of feature points in the original image information, and then accumulate the camera displacement per unit time to obtain the displacement of the camera in the environment and hence the pose information of the vehicle in the environment, which is taken as the vehicle displacement judged by vehicle vision;
the vehicle physical pose determining module, configured to estimate the position and attitude of the vehicle from parameter signals on the vehicle to obtain the displacement information judged from the vehicle physical information;
and the fusion module, configured to fuse the vehicle displacement judged by vehicle vision with the displacement information judged from the vehicle physical information, and compare the result with the initially set origin to achieve accurate positioning of the vehicle.
9. A positioning system based on vehicle-mounted information, characterized by comprising: a surround-view camera, four wheel-speed sensors, a wheel pulse counter, an inertial measurement unit and a central controller;
the surround-view camera is used for acquiring original image information of the vehicle;
the wheel-speed sensors are used for acquiring the wheel speed information of the wheels;
the wheel pulse counter is used for counting the gear-tooth pulses of the wheel-speed sensors and converting the pulse signals into distance information;
the inertial measurement unit is used for providing the lateral acceleration, longitudinal acceleration and yaw rate information of the vehicle;
the central controller being adapted to implement the steps of the method of any one of claims 1 to 7.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
CN202310355844.7A 2023-04-04 2023-04-04 Positioning method, device and system based on vehicle-mounted information Pending CN116222558A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310355844.7A CN116222558A (en) 2023-04-04 2023-04-04 Positioning method, device and system based on vehicle-mounted information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310355844.7A CN116222558A (en) 2023-04-04 2023-04-04 Positioning method, device and system based on vehicle-mounted information

Publications (1)

Publication Number Publication Date
CN116222558A true CN116222558A (en) 2023-06-06

Family

ID=86573303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310355844.7A Pending CN116222558A (en) 2023-04-04 2023-04-04 Positioning method, device and system based on vehicle-mounted information

Country Status (1)

Country Link
CN (1) CN116222558A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116499456A (en) * 2023-06-28 2023-07-28 苏州中德睿博智能科技有限公司 Automatic positioning device and method for mobile robot and positioning system for unmanned mower
CN116499456B (en) * 2023-06-28 2023-09-05 苏州中德睿博智能科技有限公司 Automatic positioning device and method for mobile robot and positioning system for unmanned mower

Similar Documents

Publication Publication Date Title
CN113945206B (en) Positioning method and device based on multi-sensor fusion
CN113819914B (en) Map construction method and device
CN109946732B (en) Unmanned vehicle positioning method based on multi-sensor data fusion
US8855848B2 (en) Radar, lidar and camera enhanced methods for vehicle dynamics estimation
Gehrig et al. Dead reckoning and cartography using stereo vision for an autonomous car
CN113819905B (en) Mileage metering method and device based on multi-sensor fusion
EP3029538B1 (en) Vehicle position/bearing estimation device and vehicle position/bearing estimation method
CN112433531A (en) Trajectory tracking method and device for automatic driving vehicle and computer equipment
CN211956223U (en) Lane change track planning system
CN116222558A (en) Positioning method, device and system based on vehicle-mounted information
Wang et al. Vision-based vehicle body slip angle estimation with multi-rate Kalman filter considering time delay
KR20190040818A (en) 3D vehicular navigation system using vehicular internal sensor, camera, and GNSS terminal
CN111413990A (en) Lane change track planning system
CN112078570A (en) Automobile positioning method based on Ackerman steering model
CN113589820A (en) Auxiliary processing method, device and system for remote driving
CN113494910A (en) Vehicle positioning method and device based on UWB positioning and storage medium
CN107901913B (en) The vehicle centroid side drift angle and coefficient of road adhesion estimating system of Multi-source Information Fusion
CN117308972A (en) Vehicle positioning method, device, storage medium and electronic equipment
Katriniok Optimal vehicle dynamics control and state estimation for a low-cost GNSS-based collision avoidance system
Eising et al. 2.5 D vehicle odometry estimation
Kuyt et al. Mixed kinematics and camera based vehicle dynamic sideslip estimation for an rc scaled model
Chaichaowarat et al. Kinematics-based analytical solution for wheel slip angle estimation of a RWD vehicle with drift
CN114705199A (en) Lane-level fusion positioning method and system
US11662454B2 (en) Systems and methods for range-rate dealiasing using position consistency
CN111284496B (en) Lane tracking method and system for autonomous vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination