CN112068567B - Positioning method and positioning system based on ultra-wideband and visual image - Google Patents

Positioning method and positioning system based on ultra-wideband and visual image

Info

Publication number
CN112068567B
Authority
CN
China
Prior art keywords
vehicle
coordinate system
base station
positioning
image
Prior art date
Legal status
Active
Application number
CN202010974013.4A
Other languages
Chinese (zh)
Other versions
CN112068567A (en)
Inventor
吴一稷
黄旭东
欧阳乐
刘春明
王全宇
Current Assignee
Shanghai Zhenghua Heavy Industries Co Ltd
CCCC Highway Long Bridge Construction National Engineering Research Center Co Ltd
Original Assignee
Shanghai Zhenghua Heavy Industries Co Ltd
CCCC Highway Long Bridge Construction National Engineering Research Center Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Zhenghua Heavy Industries Co Ltd, CCCC Highway Long Bridge Construction National Engineering Research Center Co Ltd filed Critical Shanghai Zhenghua Heavy Industries Co Ltd
Priority to CN202010974013.4A
Publication of CN112068567A
Application granted
Publication of CN112068567B
Legal status: Active

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods

Abstract

The invention discloses a positioning method and a positioning system based on ultra-wideband and visual images. The positioning method comprises the following steps: establishing a vehicle coordinate system with the center of the vehicle to be positioned as the origin, and obtaining a first position relationship between the ultra-wideband positioning module and the center of the vehicle to be positioned in the vehicle coordinate system; establishing a global coordinate system of the area where the vehicle to be positioned is located, and obtaining a second position relationship between the center of the vehicle to be positioned and the ultra-wideband positioning module in the global coordinate system; obtaining, from the first position relationship and the second position relationship, a first functional formula satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system; acquiring image information of a lane line with a camera device, and obtaining from the image information a second functional formula satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system; and obtaining the coordinates of the center of the vehicle to be positioned in the global coordinate system from the first functional formula and the second functional formula. The method offers real-time positioning, high positioning accuracy and a simple communication architecture.

Description

Positioning method and positioning system based on ultra-wideband and visual image
Technical Field
The invention relates to the field of automatic driving, in particular to a positioning method and a positioning system based on ultra-wideband and visual images.
Background
Currently, in the field of ultra-wideband (UWB) wireless positioning, whether Time Difference of Arrival (TDOA) positioning or Time of Arrival (TOA) positioning is used, a plurality of base stations serve as reference coordinates and the position of a mobile point is determined from the intersection of several circles or hyperbolas, the base stations being mounted on fixed supports. In practice, however, when the scene provides no fixed supports, fixing a plurality of base stations requires adding extra infrastructure such as uprights, brackets, power supplies and networks, which increases the construction workload and the installation cost.
Besides UWB wireless positioning, visual positioning is also a common positioning method; it mainly uses a visual sensor to identify a lane line and thereby achieve positioning. However, a lane line is a one-dimensional feature: position information in the direction perpendicular to the lane line is easy to resolve, while position information in the direction parallel to the lane line is difficult to resolve. Visual positioning therefore suffers from a geometrically uneven distribution of positioning accuracy, which is unfavourable for applying autonomous vehicles in high-precision scenarios.
Disclosure of Invention
The invention aims to solve the prior-art problems that, in scenes without fixed supports, ultra-wideband wireless positioning requires the additional installation of a plurality of fixed supports and related equipment, leading to high installation cost, and that visual positioning accuracy is unevenly distributed. The invention provides a positioning method that achieves positioning with fewer mounting brackets and similar equipment, thereby reducing installation cost, while also solving the uneven distribution of visual positioning accuracy, so as to achieve real-time positioning, high positioning accuracy and a simple communication architecture.
Based on the above, the embodiment of the invention discloses a positioning method based on ultra-wideband and visual images, which is used for vehicle positioning and comprises the following steps:
a first acquisition step: establishing a vehicle coordinate system with the center of the vehicle to be positioned as an origin, and obtaining a first position relationship between the ultra-wideband positioning module and the center of the vehicle to be positioned in the vehicle coordinate system;
a second acquisition step: establishing a global coordinate system of an area where the vehicle to be positioned is located, and obtaining a second position relationship between the center of the vehicle to be positioned and the ultra-wideband positioning module in the global coordinate system;
ultra-wideband positioning: obtaining a first functional formula which is satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first position relation and the second position relation;
image positioning: acquiring image information of a lane line by using a camera device, and obtaining a second functional formula which is satisfied by coordinates of a vehicle center to be positioned in a global coordinate system according to the image information;
the processing steps are as follows: and obtaining the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first functional formula and the second functional formula.
By adopting the technical scheme, the ultra-wideband positioning information and the visual positioning information are combined, so that the accurate position judgment is finally realized, and the method has the characteristics of real-time positioning, high positioning precision and simple communication architecture.
According to another specific embodiment of the invention, two camera devices are respectively arranged at the head and the tail of the vehicle to be positioned; the image positioning step further comprises: and obtaining a course angle of the vehicle to be positioned according to the image pickup device, wherein the course angle is an included angle between the positive direction of the vehicle to be positioned and the positive direction of the X coordinate axis of the global coordinate system.
According to another embodiment of the present invention, the ultra wideband positioning module includes a plurality of positioning tags and a base station, and the first obtaining step includes:
fixing a plurality of positioning tags on the vehicle to be positioned;
establishing a vehicle coordinate system with the center of the vehicle to be positioned as the origin, and respectively acquiring the coordinates of each positioning tag in the vehicle coordinate system;
establishing a TDOA equation set according to the corresponding time stamp when the positioning tag receives the base station signal and the coordinates of each positioning tag in a vehicle coordinate system;
solving a TDOA equation set to obtain a first base station coordinate of the base station in a vehicle coordinate system;
and obtaining a first base station distance between the base station and the center of the vehicle to be positioned in the vehicle coordinate system according to the first base station coordinate.
According to another embodiment of the invention, the number of base stations is 1, and the TDOA equation set is

\sqrt{(x_i - x_v)^2 + (y_i - y_v)^2 + (z_i - z_v)^2} - \sqrt{(x_1 - x_v)^2 + (y_1 - y_v)^2 + (z_1 - z_v)^2} = C\,(t_i - t_1), \quad i = 2, 3, 4

wherein (x_i, y_i, z_i) represents the coordinates of the i-th positioning tag in the vehicle coordinate system, t_i represents the timestamp at which the i-th positioning tag received the base station signal, i = 1, 2, 3, 4; (x_v, y_v, z_v) represents the first base station coordinates, and C represents the propagation speed of the base station signal.
According to another embodiment of the present invention, the ultra wideband positioning module includes a base station, and the second obtaining step includes:
establishing a global coordinate system of an area where the vehicle to be positioned is located, and acquiring a second base station coordinate of the base station in the global coordinate system;
and obtaining a second base station distance between the base station and the center of the vehicle to be positioned in the global coordinate system according to the second base station coordinates.
According to another embodiment of the invention, the first function is

(x - x_{bs})^2 + (y - y_{bs})^2 = x_v^2 + y_v^2

wherein x and y represent the abscissa and ordinate, respectively, of the center of the vehicle to be positioned in the global coordinate system, x_{bs}, y_{bs} represent the abscissa and ordinate, respectively, of the base station in the global coordinate system, and x_v, y_v represent the abscissa and ordinate, respectively, of the base station in the vehicle coordinate system.
According to another embodiment of the present invention, the image positioning step includes:
acquiring image information of a lane line of a driving lane by using an image pickup device;
performing perspective transformation and image clipping on the image information of the lane lines, and performing image binarization conversion to obtain a binary image;
extracting a lane line area in the binary image and fitting to obtain a curve equation of the lane line in an image coordinate system;
obtaining the distance between the center of the vehicle to be positioned and the lane line according to the curve equation obtained in the last step;
and determining a second functional formula according to the curve equation of the lane line in the global coordinate system and the distance obtained in the last step.
According to another embodiment of the invention, the second function is

\frac{|A x + B y + C|}{\sqrt{A^2 + B^2}} = dis

wherein A, B, C are the parameters of the curve equation of the lane line in the global coordinate system, dis represents the distance between the center of the vehicle to be positioned and the lane line, and x and y represent the abscissa and ordinate, respectively, of the center of the vehicle to be positioned in the global coordinate system.
According to another embodiment of the present invention, in the image locating step, before determining the second functional formula according to the curve equation of the lane line in the global coordinate system and the distance obtained in the previous step, the method further includes:
and acquiring the position information of the lane line in the global coordinate system by using the total station, and constructing a curve equation of the lane line in the global coordinate system according to the position information.
According to another embodiment of the present invention, a method for obtaining a heading angle of a vehicle to be positioned according to an image capturing device includes:
using the camera devices to respectively acquire lane line image information of the driving lane at the head and at the tail of the vehicle to be positioned, and respectively obtaining, from the lane line image information, a first distance between the vehicle head and the lane line and a second distance between the vehicle tail and the lane line;
and calculating a course angle according to the first distance, the second distance and the vehicle body distance between the vehicle head and the vehicle tail.
According to another embodiment of the invention, the processing steps comprise:
constructing a simultaneous equation set according to the first functional formula and the second functional formula;
and solving the equation set, and determining the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the prestored priori coordinates.
According to another embodiment of the invention, after the processing step, it further comprises:
and a storage step of storing the coordinates obtained in the processing step.
Correspondingly, the embodiment of the invention also discloses a positioning system based on the ultra-wideband and the visual image, which comprises the following steps:
the ultra-wideband positioning module is used for positioning the vehicle to be positioned, respectively obtaining a first position relation between the ultra-wideband positioning module and the center of the vehicle to be positioned in a vehicle coordinate system and a second position relation between the ultra-wideband positioning module and the center of the vehicle to be positioned in a global coordinate system, and obtaining a first functional formula which is satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first position relation and the second position relation;
an image positioning module, comprising: the image pick-up device is used for collecting image information of the lane lines; the image resolving engine is connected with the image pick-up device and used for obtaining a second functional formula which is satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the image information collected by the image pick-up device;
and the processing module is used for determining the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first function formula and the second function formula.
According to another specific embodiment of the invention, two camera devices are respectively arranged at the head and the tail of the vehicle to be positioned; the image resolving engine is further used for obtaining a course angle of the vehicle to be positioned according to the image capturing device, wherein the course angle is an included angle between the positive direction of the vehicle to be positioned and the positive direction of the X coordinate axis of the global coordinate system.
According to another embodiment of the present invention, an ultra wideband positioning module includes:
a base station transmitting base station signals at fixed intervals;
the positioning tag is arranged on the vehicle to be positioned and is used for receiving the base station signal;
the wireless positioning engine is used for establishing a TDOA equation set according to the corresponding time stamp when each positioning tag receives the base station signal and the coordinates of each positioning tag in the vehicle coordinate system, solving the TDOA equation set to obtain a first base station coordinate of the base station in the vehicle coordinate system, and obtaining a first base station distance between the base station and the center of the vehicle to be positioned in the vehicle coordinate system according to the first base station coordinate; and obtaining a second base station distance between the base station and the center of the vehicle to be positioned in the global coordinate system according to the second base station coordinates, and establishing a first function formula according to the first base station distance and the second base station distance, wherein the second base station coordinates are coordinates of the base station in the global coordinate system.
According to another specific embodiment of the invention, the number of positioning tags is 4, and they are respectively arranged at the four corners of the vehicle; the number of base stations is 1.
According to another embodiment of the present invention, an image localization engine includes:
the image processing unit is used for performing perspective transformation and image clipping on the image information of the lane lines and performing image binarization conversion to obtain a binary image;
the data fitting unit is used for extracting a lane line area in the binary image and fitting to obtain a curve equation of the lane line in the image coordinate system;
the first calculation unit is used for obtaining the distance between the center of the vehicle to be positioned and the lane line according to the curve equation of the data fitting unit;
and the second calculation unit is used for determining a second function according to the curve equation of the lane line in the global coordinate system and the distance obtained by the first calculation unit.
According to another embodiment of the present invention, the image resolving engine further includes:
the angle calculation unit is used for respectively obtaining a first distance between the head and the lane line and a second distance between the tail and the lane line according to lane line image information of a driving lane of the head and the tail of the vehicle to be positioned, which are respectively acquired by the camera device; and calculating a course angle according to the first distance, the second distance and the vehicle body distance between the vehicle head and the vehicle tail.
According to another embodiment of the invention, the positioning system further comprises:
and the storage module is used for storing the coordinates of the center of the vehicle to be positioned in the global coordinate system.
Compared with the prior art, the invention has the following technical effects:
by adopting the technical scheme, the base station positioning information and the laser radar positioning information are combined, the accurate position is finally judged, equipment such as excessive fixed brackets and the like is not required to be installed, the construction quantity is reduced, the installation cost is also reduced, the problem of uneven distribution of laser positioning precision is solved, and the effects of real-time positioning, high positioning precision and simple communication architecture are achieved.
Drawings
FIG. 1 illustrates a flow chart of an ultra wideband and visual image based positioning method in accordance with an embodiment of the present invention;
fig. 2 shows a schematic structural diagram of an ultra-wideband and visual image based positioning system according to an embodiment of the present invention.
Detailed Description
Further advantages and effects of the present invention will become apparent to those skilled in the art from the disclosure of this specification, which describes embodiments of the invention by way of specific examples. While the invention is described in connection with preferred embodiments, this is not intended to limit its features to those implementations; rather, the description is intended to cover alternatives or modifications that fall within the scope defined by the claims. The following description contains many specific details to provide a thorough understanding of the invention, but the invention may also be practised without these details, and some specific details are omitted so as not to obscure the invention. It should be noted that, where no conflict arises, the embodiments of the invention and the features of the embodiments may be combined with each other.
It should be noted that in this specification, like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present embodiment, it should be noted that the terms "first," "second," and the like are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
In the description of the present embodiment, it should also be noted that, unless explicitly specified and limited otherwise, the terms "disposed", "connected" and "coupled" are to be construed broadly: a connection may, for example, be fixed, detachable or integral; mechanical or electrical; direct, or indirect through an intermediate medium, or a communication between two elements. The specific meaning of the above terms in the present embodiment can be understood by those of ordinary skill in the art on a case-by-case basis.
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in further detail below with reference to the accompanying drawings.
As shown in fig. 1, one embodiment of the present invention provides an ultra-wideband and visual image-based positioning method for vehicle positioning, comprising:
step S1, a first acquisition step: establishing a vehicle coordinate system taking the center of the vehicle to be positioned as an origin, and obtaining a first position relationship between the UWB positioning module and the center of the vehicle to be positioned in the vehicle coordinate system;
step S2, a second acquisition step: establishing a global coordinate system of an area where the vehicle to be positioned is located, and obtaining a second position relationship between the center of the vehicle to be positioned and the UWB positioning module in the global coordinate system;
step S3, ultra-wideband positioning: obtaining a first functional formula which is satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first position relation and the second position relation;
s4, an image positioning step: acquiring image information of a lane line by using a camera device, and obtaining a second functional formula which is satisfied by coordinates of a vehicle center to be positioned in a global coordinate system according to the image information;
step S5, processing steps: and obtaining the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first functional formula and the second functional formula.
Because the coordinates of the center of the vehicle to be positioned in the global coordinate system are unique, the coordinates that satisfy both the first function and the second function are the required coordinates of the vehicle center in the global coordinate system, i.e. they are the solution of the simultaneous equation set formed by the first function and the second function.
According to the positioning method provided by the invention, accurate position determination is finally achieved by combining the UWB positioning information and the image positioning information, with the characteristics of real-time positioning, high positioning accuracy and a simple communication architecture.
Further, the ultra-wideband positioning module may include a plurality of positioning tags and base stations, optionally, the number of base stations is 1, and step S1 (i.e. the first obtaining step) may specifically include:
step S11, fixing a plurality of positioning labels on the vehicle to be positioned.
Optionally, the number of positioning tags is 4; they are mounted on the vehicle to be positioned, typically at its four top corners, and their positions are fixed.
And step S12, establishing a vehicle coordinate system taking the center of the vehicle to be positioned as an origin, and respectively acquiring the coordinates of each positioning label in the vehicle coordinate system.
Specifically, the vehicle coordinate system is established by taking the projection of the center of the vehicle to be positioned vertically onto the ground as the origin, the positive driving direction of the vehicle as the X axis, the direction perpendicular to the driving direction in the ground plane as the Y axis, and the direction perpendicular to the ground, pointing upwards, as the Z axis. Each positioning tag has a unique coordinate in the vehicle coordinate system, which can be obtained by measurement.
And S13, establishing a TDOA equation set according to the corresponding time stamp and the coordinates of each positioning tag in the vehicle coordinate system when the positioning tag receives the base station signal.
Specifically, the base station broadcasts base station signals at fixed intervals, the signals carrying a base station ID that the positioning tags can use to distinguish different base stations. The fixed interval can be chosen according to actual requirements, typically on the order of milliseconds to tens of milliseconds. In the invention, the number of base stations is 1, installed at a fixed position in the area to be positioned; of course, the number of base stations may also be set to 2 or more.
Further, the TDOA equation set is

\sqrt{(x_i - x_v)^2 + (y_i - y_v)^2 + (z_i - z_v)^2} - \sqrt{(x_1 - x_v)^2 + (y_1 - y_v)^2 + (z_1 - z_v)^2} = C\,(t_i - t_1), \quad i = 2, 3, 4    (1)

wherein (x_i, y_i, z_i) represents the coordinates of the i-th positioning tag in the vehicle coordinate system, t_i represents the timestamp at which the i-th positioning tag received the base station signal, i = 1, 2, 3, 4; (x_v, y_v, z_v) represents the first base station coordinates, and C represents the propagation speed of the base station signal. It is noted that the base station signal is received by every positioning tag under the same clock reference, i.e. the corresponding timestamps t_1 to t_4 refer to a common clock.
Step S14, solving the TDOA equation set, i.e. equation (1) above, to obtain the first base station coordinates (x_v, y_v, z_v).
Specifically, z_v is the height of the base station above the ground and can be obtained by measurement.
Step S15, obtaining, from the first base station coordinates, a first base station distance between the base station and the center of the vehicle to be positioned in the vehicle coordinate system.
Further, the first base station distance referred to here means the distance between the base station and the center of the vehicle to be positioned in the horizontal plane, i.e. the distance in the XY plane of the vehicle coordinate system, excluding any contribution from the difference in height. Since the abscissa and ordinate of the center of the vehicle to be positioned are both zero in this coordinate system, the distance can be expressed as

R_1 = \sqrt{x_v^2 + y_v^2}

wherein x_v, y_v represent the abscissa and ordinate, respectively, of the base station in the vehicle coordinate system.
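As a concrete illustration of steps S13 to S15 (not part of the patent), the following Python sketch builds the TDOA residuals of equation (1) from four tag timestamps and solves for the base station position in the vehicle coordinate system with a generic least-squares routine. The tag layout, the base station height z_v and the synthetic timestamps are hypothetical placeholder values chosen only so the sketch runs end to end.

```python
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # propagation speed of the base station signal (m/s)

# Hypothetical tag coordinates at the four top corners of the vehicle (vehicle frame, metres)
tags = np.array([[ 2.0,  1.0, 1.5],
                 [ 2.0, -1.0, 1.5],
                 [-2.0,  1.0, 1.5],
                 [-2.0, -1.0, 1.5]])
z_v = 3.0  # base station height above the ground, obtained by measurement

# Synthetic common-clock timestamps generated from a hypothetical true base station
# position; in practice they come from the tags' shared clock reference.
true_bs = np.array([10.0, 5.0, z_v])
t = np.linalg.norm(tags - true_bs, axis=1) / C

def tdoa_residuals(p):
    """Equation (1): range differences between tag 1 and tags 2..4 minus C times the timestamp differences."""
    x_v, y_v = p
    ranges = np.linalg.norm(tags - np.array([x_v, y_v, z_v]), axis=1)
    return (ranges[1:] - ranges[0]) - C * (t[1:] - t[0])

x_v, y_v = least_squares(tdoa_residuals, x0=[1.0, 1.0]).x
R1 = np.hypot(x_v, y_v)  # first base station distance in the XY plane of the vehicle frame
print(x_v, y_v, R1)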
Further, step S2 (i.e., the second obtaining step) may specifically include:
and S21, establishing a global coordinate system of the region where the vehicle to be positioned is located, and acquiring a second base station coordinate of the base station in the global coordinate system.
Specifically, in the present invention the global coordinate system can be defined according to the specific implementation environment; for example, it can be established with the position of the base station on the ground as the origin, the direction parallel to the lane line as the X axis, the direction perpendicular to the lane line as the Y axis, and the direction perpendicular to the ground, pointing upwards, as the Z axis. Because the position of the base station is fixed, it has unique coordinates in the global coordinate system, and the specific coordinate values can be obtained by measurement, for example with a measuring tool such as a total station.
Step S22, obtaining, from the second base station coordinates, the second base station distance between the base station and the center of the vehicle to be positioned in the global coordinate system, namely

R_2 = \sqrt{(x - x_{bs})^2 + (y - y_{bs})^2}

wherein x, y represent the abscissa and ordinate of the center of the vehicle to be positioned in the global coordinate system, and x_{bs}, y_{bs} represent the abscissa and ordinate, respectively, of the base station in the global coordinate system.
Further, since this distance is the same whether expressed in the global coordinate system or the vehicle coordinate system, the fact that the distance between the base station and the center of the vehicle to be positioned is unchanged (i.e. the first base station distance R1 equals the second base station distance R2) is used to establish the first functional formula satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system, specifically:

(x - x_{bs})^2 + (y - y_{bs})^2 = x_v^2 + y_v^2    (2)

wherein x and y represent the abscissa and ordinate, respectively, of the center of the vehicle to be positioned in the global coordinate system, x_{bs}, y_{bs} represent the abscissa and ordinate, respectively, of the base station in the global coordinate system, and x_v, y_v represent the abscissa and ordinate, respectively, of the base station in the vehicle coordinate system.
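A minimal sketch of equation (2), written as a residual that vanishes at the true coordinates of the vehicle center; x_bs, y_bs (base station in the global frame) and x_v, y_v (base station in the vehicle frame) are assumed to come from the preceding steps.

```python
def first_constraint(x, y, x_bs, y_bs, x_v, y_v):
    """Equation (2): the base station to vehicle-centre distance is the same in both coordinate systems."""
    return (x - x_bs) ** 2 + (y - y_bs) ** 2 - (x_v ** 2 + y_v ** 2)
```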
Further, step S4 (i.e. the image positioning step) may specifically include:
step S41, acquiring image information of a lane line of a driving lane by using an image pickup device.
Specifically, the camera device may be a camera fixed on the vehicle to be positioned. In this embodiment only the position of the vehicle in the direction perpendicular to the lane line needs to be known, so there is no need to acquire excessive environmental information. When the camera device is fixed, the angle between its central axis and the ground is set reasonably so that the field of view contains the lane line.
And step S42, performing perspective transformation and image clipping processing on the image information of the lane lines, and performing image binarization conversion to obtain a binary image.
Specifically, the image information of the lane line acquired by the camera device is received and a perspective transformation is applied to it. Camera imaging produces a perspective effect in which near objects appear large and far objects small; although the two lane lines are parallel in the real world, in the camera image they gradually converge towards a point in the distance. The image can be restored to a top view (i.e. the effect of looking down at the road surface from above) by a perspective transformation, mainly using a four-point transformation method, so that in the transformed image the lane lines are parallel to each other. In practice, the angle between the central axis of the camera device and the ground can also be set relatively large, so that the view is closer to a top view and the transformed image is less distorted.
Further, since only the two lane lines of the current lane are involved in the final position determination, the image can be cropped to reduce the amount of computation: the portions far from the central axis of the camera device, generally beyond about one lane width, are removed.
Further, the image binarisation may specifically include: determining a reasonable threshold from the histogram of the image pixels and performing binarisation to obtain a binary image. Because the colour of the lane line differs markedly from that of the road surface, in the binary image the lane line is generally white and the road surface black, so the lane line stands out clearly and is easy to recognise.
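A minimal OpenCV sketch of step S42, assuming a four-point correspondence (src_pts, dst_pts) chosen during camera calibration; the crop margin and the fixed binarisation threshold are illustrative placeholders rather than values from the patent, which derives the threshold from the pixel histogram.

```python
import cv2
import numpy as np

def birdseye_binary(frame, src_pts, dst_pts, out_size, thresh=200):
    """Warp the camera frame to a top view, crop away regions far from the
    camera axis, and binarise so the bright lane lines stand out from the road."""
    M = cv2.getPerspectiveTransform(np.float32(src_pts), np.float32(dst_pts))
    top = cv2.warpPerspective(frame, M, out_size)
    gray = cv2.cvtColor(top, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    gray = gray[:, w // 4: 3 * w // 4]  # keep roughly one lane width either side of the centre
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    return binary
```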
And S43, extracting a lane line area in the binary image, and fitting to obtain a curve equation of the lane line in the image coordinate system.
Specifically, since there are generally two lane lines, two curve equations are generally obtained in this step. Because the lane lines are parallel, the fitted curve equations should also satisfy this parallel relationship, and this property can be used to cross-check the two fitted lane lines against each other.
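A possible sketch of step S43 (an assumption, not the patent's own algorithm): the binary image is split at its vertical center to separate the two lane lines, which presumes the vehicle drives between them, and each line is fitted as a quadratic in image coordinates.

```python
import numpy as np

def fit_lane_lines(binary):
    """Fit x = a*y**2 + b*y + c to the left and right lane-line pixels of a binary top view."""
    h, w = binary.shape
    ys, xs = np.nonzero(binary)          # pixel coordinates of all lane-line candidates
    left = xs < w // 2                   # crude left/right separation at the image centre
    left_fit = np.polyfit(ys[left], xs[left], 2)
    right_fit = np.polyfit(ys[~left], xs[~left], 2)
    return left_fit, right_fit
```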
And S44, obtaining the distance between the center of the vehicle to be positioned and the lane line according to the curve equation obtained in the last step.
Specifically, the image resolving engine can be used to obtain the fraction of the lane width at which the camera device lies between the two lane lines in the image; using the proportional relationship between the spacing of the lane lines in the image and their actual spacing in the global coordinate system, the actual distances of the camera device, and hence of the center of the vehicle to be positioned, from the two lane lines are then calculated.
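A sketch of step S44 under the proportional-scaling idea described above: the camera axis's pixel offset between the two fitted lane lines is converted to metres using the known real lane spacing. The lane width, the evaluation row and the camera column are assumed inputs, not values from the patent.

```python
import numpy as np

def distance_to_left_line(left_fit, right_fit, row, camera_col, lane_width_m=3.5):
    """Fraction of the lane occupied by the camera axis, scaled by the real lane width."""
    x_left = np.polyval(left_fit, row)
    x_right = np.polyval(right_fit, row)
    frac = (camera_col - x_left) / (x_right - x_left)   # percentage across the lane in the image
    return frac * lane_width_m                           # metric distance to the left lane line
```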
And step S45, determining a second function according to the curve equation of the lane line in the global coordinate system and the distance obtained in the previous step.
Further, before step S45, the image positioning step further includes:
and step S45-1, acquiring the position information of the lane line in the region to be positioned in the global coordinate system by using the total station, and constructing a curve equation of the lane line in the global coordinate system according to the position information. Specifically, a curve equation of the lane line in the global coordinate system may be expressed as ax+by+c=0.
Further, since the heading of the vehicle to be positioned can be regarded as approximately parallel to the lane line in the area to be positioned while the vehicle is driving, a second functional formula satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system can be constructed from the perpendicular distance of the vehicle center to the lane line (i.e. the distance between the center of the vehicle to be positioned and the lane line) and the parameters A, B, C of the lane line curve equation obtained in step S45-1, and can be expressed as:

\frac{|A x + B y + C|}{\sqrt{A^2 + B^2}} = dis    (3)

wherein A, B, C are the parameters of the lane line curve equation in the global coordinate system, dis is the distance from the center of the vehicle to be positioned to the lane line, and x and y represent the abscissa and ordinate, respectively, of the center of the vehicle to be positioned in the global coordinate system.
Further, step S5 (i.e., a processing step) may specifically include:
and S51, constructing a simultaneous equation set according to the first function formula and the second function formula.
Specifically, the system of equations is

(x - x_{bs})^2 + (y - y_{bs})^2 = x_v^2 + y_v^2
\frac{|A x + B y + C|}{\sqrt{A^2 + B^2}} = dis    (4)
And step S52, solving the equation set, and determining the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the prestored priori coordinates.
Specifically, equation set (4) has two sets of solutions. Because the motion of the vehicle to be positioned is continuous, the set of solutions that does not fit the actual situation can be rejected by comparison with the coordinates of the vehicle center in the global coordinate system at the previous moment, so that the unique coordinates (x, y) of the center of the vehicle to be positioned in the global coordinate system at the current moment are determined.
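A possible sketch of steps S51 and S52 (assumed numerical treatment, not the patent's own solver): the circle constraint of equation (2) and the line-distance constraint of equation (3) are solved jointly, and of the generally two intersections the one closest to the stored prior coordinates is kept. The second initial guess, a reflection of the prior about the base station, is only a heuristic for reaching the other root.

```python
import numpy as np
from scipy.optimize import fsolve

def locate_vehicle(x_bs, y_bs, x_v, y_v, A, B, C_line, dis, prior_xy):
    """Solve equation set (4) and disambiguate with the prior coordinates."""
    def equations(p):
        x, y = p
        return [(x - x_bs) ** 2 + (y - y_bs) ** 2 - (x_v ** 2 + y_v ** 2),      # equation (2)
                (A * x + B * y + C_line) ** 2 / (A ** 2 + B ** 2) - dis ** 2]    # equation (3), squared

    guesses = [np.asarray(prior_xy, dtype=float),
               np.array([2 * x_bs - prior_xy[0], 2 * y_bs - prior_xy[1]])]
    candidates = [fsolve(equations, g) for g in guesses]
    # keep the root closest to the previously stored position of the vehicle centre
    return min(candidates, key=lambda c: np.hypot(c[0] - prior_xy[0], c[1] - prior_xy[1]))
```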
Further, two camera devices are respectively arranged at the head and the tail of the vehicle to be positioned; the image positioning step may further include: and obtaining a course angle of the vehicle to be positioned according to the camera device, wherein the course angle is an included angle between the positive direction (namely the direction of the head) of the vehicle to be positioned and the positive direction of the X coordinate axis of the global coordinate system.
Further, the course angle acquisition method specifically comprises the following steps:
the method comprises the steps that a camera device is used for respectively acquiring lane line image information of a driving lane of a head and a tail of a vehicle to be positioned, and respectively acquiring a first distance d1 between the head and the lane line and a second distance d2 between the tail and the lane line according to the lane line image information;
and calculating a course angle A according to the first distance d1, the second distance d2 and the vehicle body distance d3 between the vehicle head and the vehicle tail.
Specifically, the vehicle body distance d3 can be obtained by measurement. Using the geometry of the right triangle formed by these distances, the heading angle is obtained as

A = \arcsin\left(\frac{d_1 - d_2}{d_3}\right)
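The heading-angle formula is a short sketch under the right-triangle construction stated above (d1 − d2 = d3·sin A); the original equation is not reproduced in this extraction, so this form is an assumption consistent with that geometry.

```python
import math

def heading_angle(d1, d2, d3):
    """Heading angle from the front (d1) and rear (d2) lateral offsets to the
    lane line and the longitudinal distance d3 between head and tail."""
    return math.asin((d1 - d2) / d3)
```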
Further, after step S5, the method further includes:
and S6, a storage step, namely storing the coordinates determined in the processing step. So as to facilitate the checking of prior coordinate information the next time the coordinate calculation is performed on the vehicle to be positioned. Optionally, the heading angle of the vehicle to be positioned obtained in the image positioning step may also be stored.
The positioning method based on ultra-wideband and visual images provided by the invention performs fusion calculation on the ultra-wideband wireless positioning data and the image positioning data: the base station broadcasts the base station signal at fixed intervals, several vehicle-mounted tags receive the signal, and this is combined with the image information acquired by the camera device. On the basis of a single base station, the base station positioning information and the image positioning information are thus combined to finally determine the accurate position of the vehicle to be positioned, with the advantages of real-time positioning, high positioning accuracy and a simple communication architecture. Because only one base station is installed, no excess equipment such as fixed supports is required, which reduces both the construction workload and the installation cost.
Correspondingly, as shown in fig. 2, the invention also provides a positioning system for vehicle positioning, which comprises an ultra-wideband positioning module 1, an image positioning module 2 and a processing module 3.
The ultra-wideband positioning module 1 is used for positioning the vehicle to be positioned, respectively obtaining a first position relation between the ultra-wideband positioning module and the center of the vehicle to be positioned in a vehicle coordinate system and a second position relation between the ultra-wideband positioning module and the center of the vehicle to be positioned in a global coordinate system, and obtaining a first functional formula which is satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first position relation and the second position relation.
The image positioning module 2 includes: an imaging device 21 for acquiring image information of a lane line; the image resolving engine 22 is connected to the image capturing device 21, and is configured to obtain a second functional formula that is satisfied by coordinates of the center of the vehicle to be positioned in the global coordinate system according to the image information acquired by the image capturing device 21. Specifically, the image pickup device may be a camera, and is disposed on the vehicle to be positioned.
And the processing module 3 is used for determining the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first function formula and the second function formula.
Further, the ultra wideband positioning module 1 may include a base station 13 transmitting base station signals at fixed intervals, a positioning tag 12 and a wireless positioning engine 11, which are communicatively connected to the base station 13, respectively.
Specifically, the positioning tags 12 may be arranged on the vehicle to be positioned and are used to receive the base station signal transmitted by the base station 13. The wireless positioning engine is fixedly arranged on the vehicle to be positioned and is used to establish a TDOA equation set from the timestamp at which each positioning tag 12 receives the signal of the base station 13 and the coordinates of each positioning tag 12 in the vehicle coordinate system, to solve the TDOA equation set to obtain the first base station coordinates of the base station 13 in the vehicle coordinate system, and to obtain, from the first base station coordinates, a first base station distance between the base station and the center of the vehicle to be positioned in the vehicle coordinate system; it further obtains, from the second base station coordinates, a second base station distance between the base station and the center of the vehicle to be positioned in the global coordinate system, and establishes the first functional formula from the first base station distance and the second base station distance, the second base station coordinates being the coordinates of the base station in the global coordinate system.
Specifically, in the present embodiment the number of base stations 13 is 1, fixedly arranged in the area to be positioned; in other embodiments the number of base stations 13 may also be 2 or more. The number of positioning tags 12 may be 4, fixedly mounted at the four corners of the vehicle to be positioned, and they receive the base station 13 signal under the control of a synchroniser so that they share the same clock reference. From the time differences with which the four positioning tags 12 receive the base station 13 signal, an arrival time difference equation set is constructed, from which the coordinates of the base station 13 in the vehicle coordinate system are obtained.
Further, the base station 13 may include a control unit 131, a power unit 132, an ultra wideband unit 133, a timing unit 134, and a heartbeat backhaul unit 135.
The base station 13 obtains a time reference through the time service unit 134 and performs TDMA broadcast time slot division according to its installation position, so as to avoid interference between neighbouring cells. Depending on the slot division result, the base station 13 alternates between broadcast slots and non-broadcast slots. In a broadcast slot, the base station 13 transmits a broadcast signal through the ultra-wideband unit 133. In a non-broadcast slot, the control unit 131 of the base station 13 switches the ultra-wideband unit 133 into a listening mode; when there are two or more base stations, the ultra-wideband unit 133 is generally used to listen for broadcast signals from neighbouring base stations, which carry no positioning meaning for the current base station and serve only to prove that the neighbouring base station is working normally. This status information is transmitted back to the equipment maintenance unit through the heartbeat backhaul unit 135; a base station from which no heartbeat is returned for a long time can be considered damaged, which facilitates later maintenance. The transmission slot of the heartbeat backhaul can likewise be controlled according to the time service unit 134, the transmission interval generally being on the order of minutes or longer, as required. The equipment maintenance unit generally comprises a LoRa receiving base station and a server: the LoRa base station receives the heartbeat signal, and the server stores the equipment information, judges the equipment state from the heartbeat signal and notifies maintenance personnel. The power source of the power supply unit 132 may be a wired power supply, a storage battery, a solar battery, or the like.
Further, the image resolution engine 22 may include: an image processing unit 221, a data fitting unit 222, a first calculation unit 223, and a second calculation unit 224.
Wherein, the image processing unit 221 performs perspective transformation and image clipping processing on the image information of the lane lines, and then performs image binarization conversion to obtain a binary image; the data fitting unit 222 extracts and fits the lane line areas in the binary image to obtain a curve equation of the lane lines in the image coordinate system; a first calculation unit 223 that obtains a distance between the center of the vehicle to be positioned and the lane line according to the curve equation of the data fitting unit; the second calculation unit 224 determines a second function from the curve equation of the lane line in the global coordinate system and the distance obtained by the first calculation unit.
Further, the image resolving engine 22 may further include an angle calculating unit 225 for respectively obtaining a first distance between the head and the lane line and a second distance between the tail and the lane line according to lane line image information of a driving lane of the head and the tail of the vehicle to be positioned, which are respectively acquired by the image capturing device 21 fixed to the head and the tail positions of the vehicle to be positioned; and calculating a course angle according to the first distance, the second distance and the vehicle body distance between the vehicle head and the vehicle tail, wherein the course angle is an included angle between the positive direction of the vehicle to be positioned and the positive direction of the X coordinate axis of the global coordinate system.
Further, the positioning system may further comprise a storage module 4 for storing the heading angle of the vehicle to be positioned and the coordinates of the center of the vehicle to be positioned in a global coordinate system. Specifically, the storage module 4 and the processing module 3 may be integrated into an industrial personal computer to perform calculation and storage of data.
The positioning system provided by the invention utilizes the ultra-wideband positioning module 1 and the image positioning module 2 to realize fusion positioning, namely, utilizes the base station 13 to broadcast the signals of the base station 13 at fixed intervals, and a plurality of vehicle-mounted tags receive the signals of the base station 13 and combine the image information acquired by the camera device 21 to finally realize the judgment of the accurate position of the vehicle to be positioned, and has the advantages of real-time positioning, high positioning precision and simple communication architecture.
While the invention has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that the foregoing is a further detailed description of the invention with reference to specific embodiments, and it is not intended to limit the practice of the invention to those descriptions. Various changes in form and detail may be made therein by those skilled in the art, including a few simple inferences or alternatives, without departing from the spirit and scope of the present invention.

Claims (19)

1. A positioning method based on ultra-wideband and visual images for vehicle positioning, comprising:
a first acquisition step: establishing a vehicle coordinate system taking the center of a vehicle to be positioned as an origin, and obtaining a first position relationship between an ultra-wideband positioning module and the center of the vehicle to be positioned in the vehicle coordinate system;
a second acquisition step: establishing a global coordinate system of an area where a vehicle to be positioned is located, and obtaining a second position relation between the center of the vehicle to be positioned and the ultra-wideband positioning module in the global coordinate system;
ultra-wideband positioning: obtaining a first function formula which is satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first position relation and the second position relation;
image positioning: acquiring image information of a lane line by using an image pickup device, and obtaining a second function formula satisfied by coordinates of the center of the vehicle to be positioned in the global coordinate system according to the image information;
the processing steps are as follows: and obtaining the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first functional formula and the second functional formula.
2. The method of claim 1, wherein the number of the image pick-up devices is two, and the image pick-up devices are respectively arranged at the head and the tail of the vehicle to be positioned; the image positioning step further includes: and obtaining a course angle of the vehicle to be positioned according to the camera device, wherein the course angle is an included angle between the positive direction of the vehicle to be positioned and the positive direction of the X coordinate axis of the global coordinate system.
3. The method of claim 1, wherein the ultra-wideband positioning module includes a plurality of positioning tags and a base station, the first obtaining step comprising:
fixing the plurality of positioning tags on the vehicle to be positioned;
establishing a vehicle coordinate system taking the center of the vehicle to be positioned as an origin, and respectively acquiring the coordinates of each positioning tag in the vehicle coordinate system;
establishing a TDOA equation set according to the corresponding time stamp when the positioning tag receives the base station signal and the coordinates of each positioning tag in the vehicle coordinate system;
solving the TDOA equation set to obtain a first base station coordinate of the base station in the vehicle coordinate system;
and obtaining a first base station distance between the base station and the center of the vehicle to be positioned in the vehicle coordinate system according to the first base station coordinate.
4. A method as recited in claim 3, wherein the number of base stations is 1, and the TDOA equation set is

\sqrt{(x_i - x_v)^2 + (y_i - y_v)^2 + (z_i - z_v)^2} - \sqrt{(x_1 - x_v)^2 + (y_1 - y_v)^2 + (z_1 - z_v)^2} = C\,(t_i - t_1), \quad i = 2, 3, 4

wherein (x_i, y_i, z_i) represents the coordinates of the i-th positioning tag in the vehicle coordinate system, t_i represents the timestamp at which the i-th positioning tag received the base station signal, i = 1, 2, 3, 4; (x_v, y_v, z_v) represents the first base station coordinates, and C represents the propagation speed of the base station signal.
5. The method of claim 1, wherein the ultra-wideband positioning module comprises a base station, and the second obtaining step comprises:
establishing a global coordinate system of the region where the vehicle to be positioned is located, and acquiring a second base station coordinate of the base station in the global coordinate system;
and obtaining a second base station distance between the base station and the center of the vehicle to be positioned in the global coordinate system according to the second base station coordinates.
6. The method of claim 3 or 5, wherein the first function is

(x - x_{bs})^2 + (y - y_{bs})^2 = x_v^2 + y_v^2

wherein x and y represent the abscissa and ordinate, respectively, of the center of the vehicle to be positioned in the global coordinate system, x_{bs}, y_{bs} represent the abscissa and ordinate, respectively, of the base station in the global coordinate system, and x_v, y_v represent the abscissa and ordinate, respectively, of the base station in the vehicle coordinate system.
7. The method of claim 1, wherein the image locating step comprises:
acquiring image information of the lane line of a driving lane by using the image pickup device;
performing perspective transformation and image clipping on the image information of the lane lines, and performing image binarization conversion to obtain a binary image;
extracting a lane line area in the binary image, and fitting to obtain a curve equation of the lane line in the image coordinate system;
obtaining the distance between the center of the vehicle to be positioned and the lane line according to the curve equation obtained in the last step;
and determining the second function according to a curve equation of the lane line in the global coordinate system and the distance obtained in the last step.
8. The method of claim 7, wherein the second function is

\frac{|A x + B y + C|}{\sqrt{A^2 + B^2}} = dis

wherein A, B, C are the parameters of the curve equation of the lane line in the global coordinate system, dis represents the distance between the center of the vehicle to be positioned and the lane line, and x and y represent the abscissa and ordinate, respectively, of the center of the vehicle to be positioned in the global coordinate system.
9. The method of claim 7, wherein in the image locating step, before the determining the second functional formula from the curve equation of the lane line in the global coordinate system and the distance obtained in the previous step, further comprises:
and acquiring the position information of the lane line in the global coordinate system by using a total station, and constructing a curve equation of the lane line in the global coordinate system according to the position information.
10. The method of claim 2, wherein obtaining the heading angle of the vehicle to be positioned by using the image pickup devices comprises:
acquiring, by using the image pickup devices, lane line image information of the driving lane at the head and at the tail of the vehicle to be positioned, and obtaining a first distance between the vehicle head and the lane line and a second distance between the vehicle tail and the lane line according to the lane line image information;
and calculating the heading angle according to the first distance, the second distance and the vehicle body distance between the vehicle head and the vehicle tail.
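Claim 10 does not spell out the trigonometry. Assuming the lane line runs straight past the vehicle, d_1 and d_2 are the perpendicular distances of head and tail to it, and L is the head-to-tail vehicle body distance, one consistent reading of the angle between the vehicle axis and the lane line is:

    \theta = \arcsin\left(\frac{d_1 - d_2}{L}\right)

The heading angle relative to the global X axis then follows by adding the known orientation of the lane line itself in the global coordinate system.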
11. The method of claim 1, wherein the processing step comprises:
constructing a simultaneous equation set according to the first functional formula and the second functional formula;
and solving the equation set, and determining the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the pre-stored a priori coordinates.
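A minimal sketch of the processing step, assuming the first functional formula is the equal-distance constraint reconstructed after claim 6 and the second is the point-to-line constraint reconstructed after claim 8; all numerical values, including the pre-stored a priori coordinate, are illustrative:

```python
import numpy as np
from sympy import symbols, Eq, solve

x, y = symbols("x y", real=True)

# Assumed example parameters (not values from the patent)
x_bs, y_bs = 30.0, 12.0        # base station in the global coordinate system
x_v, y_v = 4.0, 1.5            # base station in the vehicle coordinate system (from the TDOA solution)
A, B, C = 0.1, -1.0, 10.0      # lane line A*x + B*y + C = 0 in the global coordinate system
dis = 1.2                      # vehicle-centre-to-lane-line distance from the image positioning module
prior = np.array([28.0, 9.0])  # pre-stored a priori coordinate of the vehicle centre

# First functional formula: equal base station distance in the two coordinate systems (assumed form)
eq1 = Eq((x - x_bs)**2 + (y - y_bs)**2, x_v**2 + y_v**2)
# Second functional formula: point-to-line distance to the lane line (assumed form, squared)
eq2 = Eq((A*x + B*y + C)**2, dis**2 * (A**2 + B**2))

candidates = solve([eq1, eq2], [x, y])
# The system generally has several real solutions; the a priori coordinate selects the right one.
best = min(
    ((float(cx), float(cy)) for cx, cy in candidates if cx.is_real and cy.is_real),
    key=lambda p: np.hypot(p[0] - prior[0], p[1] - prior[1]),
)
print("vehicle centre in the global coordinate system:", best)
```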
12. The method of claim 1, further comprising, after the processing step:
a storage step of storing the coordinates obtained in the processing step.
13. A positioning system based on ultra-wideband and visual images for vehicle positioning, comprising:
the ultra-wideband positioning module is used for positioning the vehicle to be positioned, obtaining a first position relation between the ultra-wideband positioning module and the center of the vehicle to be positioned in a vehicle coordinate system and a second position relation between the ultra-wideband positioning module and the center of the vehicle to be positioned in a global coordinate system, respectively, and obtaining a first functional formula which is satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first position relation and the second position relation;
an image positioning module, comprising: an image pickup device for collecting image information of the lane lines; and an image resolving engine connected with the image pickup device and used for obtaining, according to the image information acquired by the image pickup device, a second functional formula which is satisfied by the coordinates of the center of the vehicle to be positioned in the global coordinate system;
and the processing module is used for determining the coordinates of the center of the vehicle to be positioned in the global coordinate system according to the first functional formula and the second functional formula.
14. The system of claim 13, wherein the number of the image pickup devices is two, and the image pickup devices are respectively arranged at the head and the tail of the vehicle to be positioned; the image resolving engine is further used for obtaining a heading angle of the vehicle to be positioned according to the image pickup devices, wherein the heading angle is the included angle between the positive direction of the vehicle to be positioned and the positive direction of the X coordinate axis of the global coordinate system.
15. The system of claim 13, wherein the ultra-wideband positioning module comprises:
a base station transmitting base station signals at fixed intervals;
the positioning tag is arranged on the vehicle to be positioned and is used for receiving the base station signal;
the wireless positioning engine is used for establishing a TDOA equation set according to the timestamp at which each positioning tag receives the base station signal and the coordinates of each positioning tag in a vehicle coordinate system, solving the TDOA equation set to obtain a first base station coordinate of the base station in the vehicle coordinate system, and obtaining a first base station distance between the base station and the center of the vehicle to be positioned in the vehicle coordinate system according to the first base station coordinate; and obtaining a second base station distance between the base station and the center of the vehicle to be positioned in the global coordinate system according to a second base station coordinate, and establishing the first functional formula according to the first base station distance and the second base station distance, wherein the second base station coordinate is the coordinate of the base station in the global coordinate system.
16. The system of claim 15, wherein the number of the positioning tags is 4, and the positioning tags are respectively arranged at four corners of the vehicle; the number of the base stations is 1.
17. The system of claim 13, wherein the image resolving engine comprises:
the image processing unit is used for performing perspective transformation and cropping on the lane line image information, and performing binarization to obtain a binary image;
the data fitting unit is used for extracting a lane line area in the binary image and fitting to obtain a curve equation of the lane line in the image coordinate system;
the first calculation unit is used for obtaining the distance between the center of the vehicle to be positioned and the lane line according to the curve equation obtained by the data fitting unit;
and the second calculation unit is used for determining the second function formula according to a curve equation of the lane line in the global coordinate system and the distance obtained by the first calculation unit.
18. The system of claim 14, wherein the image resolving engine comprises:
the angle calculation unit is used for obtaining a first distance between the vehicle head and the lane line and a second distance between the vehicle tail and the lane line according to the lane line image information of the driving lane at the head and the tail of the vehicle to be positioned, acquired respectively by the two image pickup devices, and calculating the heading angle according to the first distance, the second distance and the vehicle body distance between the vehicle head and the vehicle tail.
19. The system as recited in claim 13, further comprising:
and the storage module is used for storing the coordinates of the center of the vehicle to be positioned in the global coordinate system.
CN202010974013.4A 2020-09-16 2020-09-16 Positioning method and positioning system based on ultra-wideband and visual image Active CN112068567B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010974013.4A CN112068567B (en) 2020-09-16 2020-09-16 Positioning method and positioning system based on ultra-wideband and visual image


Publications (2)

Publication Number Publication Date
CN112068567A CN112068567A (en) 2020-12-11
CN112068567B (en) 2023-11-24

Family

ID=73696100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010974013.4A Active CN112068567B (en) 2020-09-16 2020-09-16 Positioning method and positioning system based on ultra-wideband and visual image

Country Status (1)

Country Link
CN (1) CN112068567B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113569800A (en) * 2021-08-09 2021-10-29 北京地平线机器人技术研发有限公司 Lane recognition and verification method and device, readable storage medium and electronic equipment
CN115546766B (en) * 2022-11-30 2023-04-07 广汽埃安新能源汽车股份有限公司 Lane line generation method, lane line generation device, electronic device, and computer-readable medium
CN116700290B (en) * 2023-07-12 2024-01-26 湖南人文科技学院 Intelligent trolley positioning control system and method based on UWB

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106879067A (en) * 2017-01-17 2017-06-20 广州土圭垚信息科技有限公司 A kind of ultra-wideband wireless positioning method based on Double deference duplex
CN107886143A (en) * 2017-11-01 2018-04-06 阳光凯讯(北京)科技有限公司 Intelligent repository goods and materials orientation management system based on ultra wide band and Image Coding
CN110490936A (en) * 2019-07-15 2019-11-22 杭州飞步科技有限公司 Scaling method, device, equipment and the readable storage medium storing program for executing of vehicle camera
CN110487562A (en) * 2019-08-21 2019-11-22 北京航空航天大学 One kind being used for unpiloted road-holding ability detection system and method
CN111174792A (en) * 2020-01-16 2020-05-19 上海电机学院 UWB-based unmanned aerial vehicle indoor pipeline detection image acquisition method
CN111413970A (en) * 2020-03-18 2020-07-14 天津大学 Ultra-wideband and vision integrated indoor robot positioning and autonomous navigation method
CN111582079A (en) * 2020-04-24 2020-08-25 杭州鸿泉物联网技术股份有限公司 Lane positioning method and device based on computer vision

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2554417C (en) * 2004-02-17 2010-11-23 Jadi, Inc. Ultra wide band navigation system with mobile base stations
WO2008116168A1 (en) * 2007-03-21 2008-09-25 Jadi, Inc. Navigation unit and base station



Similar Documents

Publication Publication Date Title
CN112068567B (en) Positioning method and positioning system based on ultra-wideband and visual image
CN110174093B (en) Positioning method, device, equipment and computer readable storage medium
CA3027921C (en) Integrated sensor calibration in natural scenes
CN107734449B (en) Outdoor auxiliary positioning method, system and equipment based on optical label
CN106019264A (en) Binocular vision based UAV (Unmanned Aerial Vehicle) danger vehicle distance identifying system and method
CN104776849A (en) Vehicle positioning device and method
CN111932627B (en) Marker drawing method and system
CN113989766A (en) Road edge detection method and road edge detection equipment applied to vehicle
JP2019078700A (en) Information processor and information processing system
CN111353453A (en) Obstacle detection method and apparatus for vehicle
CN111538008B (en) Transformation matrix determining method, system and device
CN112598756B (en) Roadside sensor calibration method and device and electronic equipment
CN112040446B (en) Positioning method and positioning system
CN109631841B (en) Method and device for measuring cross section of expressway based on laser projection
CN115083203B (en) Method and system for inspecting parking in road based on image recognition berth
CN112150576A (en) High-precision vector map acquisition system and method
CN113727434B (en) Vehicle-road cooperative auxiliary positioning system and method based on edge computing gateway
CN113561897B (en) Method and system for judging ramp parking position of driving test vehicle based on panoramic all-round view
CN114697858A (en) Inspection vehicle berth positioning device, method and system
CN110887488A (en) Unmanned rolling machine positioning method
CN114241775B (en) Calibration method for mobile radar and video image, terminal and readable storage medium
CN115272490B (en) Method for calibrating camera of road-end traffic detection equipment
CN216771967U (en) Multi-laser radar calibration system and unmanned mining vehicle
Bassani et al. Alignment data collection of highways using mobile mapping and image analysis techniques
CN115440034A (en) Vehicle-road cooperation realization method and system based on camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210913

Address after: 200125 No. 3261 Dongfang Road, Shanghai, Pudong New Area

Applicant after: SHANGHAI ZHENHUA HEAVY INDUSTRIES Co.,Ltd.

Applicant after: CCCC HIGHWAY BRIDGES NATIONAL ENGINEERING RESEARCH CENTER Co.,Ltd.

Address before: 200125 No. 3261 Dongfang Road, Shanghai, Pudong New Area

Applicant before: SHANGHAI ZHENHUA HEAVY INDUSTRIES Co.,Ltd.

GR01 Patent grant
GR01 Patent grant