CN116309814B - Vehicle pose determination method, device, computing equipment and medium - Google Patents


Info

Publication number: CN116309814B
Authority: CN (China)
Prior art keywords: vehicle, determining, coordinates, mounted camera, pose
Legal status: Active
Application number: CN202211515807.XA
Other languages: Chinese (zh)
Other versions: CN116309814A (en)
Inventors: 兰晓松, 何贝, 刘鹤云, 张岩
Current Assignee: Beijing Sinian Zhijia Technology Co., Ltd.
Original Assignee: Beijing Sinian Zhijia Technology Co., Ltd.
Application filed by Beijing Sinian Zhijia Technology Co., Ltd.
Priority to CN202211515807.XA
Publication of CN116309814A
Publication of CN116309814B (application granted)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30248: Vehicle exterior or interior
    • G06T2207/30252: Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256: Lane; Road marking
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/40: Engine management systems


Abstract

The embodiment of the disclosure provides a vehicle pose determination method, a vehicle pose determination device, a computing device, and a medium. The vehicle pose determination method comprises the following steps: acquiring a road image; performing feature extraction on the road image to obtain the straight lines on which two parallel lane lines in the road image lie, determining a vanishing point based on the two straight lines, and determining the angle external parameters of the vehicle-mounted camera based on the pixel coordinates of the vanishing point; determining the space coordinates of the lane lines based on the angle external parameters of the vehicle-mounted camera, the position external parameters of the vehicle-mounted camera, the internal parameters of the vehicle-mounted camera, and the pixel coordinates of the lane lines, wherein the space coordinates are the coordinates of the lane lines in the vehicle coordinate system; and determining the pose of the vehicle according to the space coordinates of the lane lines. With the scheme provided by the embodiment of the disclosure, the real-time angle external parameters of the vehicle-mounted camera can be determined from the real-time road image, and the space coordinates of the lane lines can be computed from these real-time angle external parameters, so that the pose of the vehicle can be determined more accurately.

Description

Vehicle pose determination method, device, computing equipment and medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular to a vehicle pose determination method, a device, computing equipment, and a storage medium.
Background
Because traffic markings can be conveniently painted on port ground, and the large number of buildings in stations degrades navigation-positioning accuracy, image-detection-based automatic driving technology is widely applied in scenarios such as ports and stations. In such scenarios, the vehicle-mounted camera that captures road images is usually installed in the area on top of the cab at the vehicle head. Because the cab is not completely rigidly connected to the vehicle body (a hinged connection is often used), when the vehicle passes over an uneven road surface (for example gantry-crane rails in a port), the external parameters of the vehicle-mounted camera change in real time as the vehicle head jolts. As a result, the vehicle pose calculated from pre-calibrated external parameters deviates substantially from the actual pose, which in turn degrades the rationality of automatic driving decisions.
Disclosure of Invention
To solve the above technical problem, embodiments of the present disclosure provide a vehicle pose determination method, device, computing equipment, and medium.
In a first aspect, an embodiment of the present disclosure provides a vehicle pose determining method, including:
acquiring a road image, wherein the road image is an image formed by shooting a driving road by a vehicle-mounted camera;
extracting features of the road image to obtain straight lines where two parallel lane lines in the road image are located, and determining vanishing points based on the two straight lines, wherein the vanishing points are intersection points of the two straight lines in the road image;
determining an angle external parameter of the vehicle-mounted camera based on the pixel coordinates of the vanishing point;
determining the space coordinates of the lane lines based on the angle external parameters of the vehicle-mounted camera, the position external parameters of the vehicle-mounted camera, the internal parameters of the vehicle-mounted camera and the pixel coordinates of the lane lines, wherein the space coordinates are the coordinates of the lane lines in a vehicle coordinate system;
and determining the pose of the vehicle according to the space coordinates of the lane line, wherein the pose is the pose of the vehicle relative to the lane line.
Optionally, the angle external parameters of the vehicle-mounted camera comprise a pitch angle and a course angle;
the determining the angle external parameter of the vehicle-mounted camera based on the pixel coordinates of the vanishing point comprises the following steps:
calculating an intermediate vector based on the pixel coordinates of the vanishing point and the internal parameters of the vehicle-mounted camera;
the pitch angle and the heading angle are calculated based on elements in the intermediate vector.
Optionally, the calculating the intermediate vector based on the pixel coordinates of the vanishing point and the internal parameters of the in-vehicle camera includes:
constructing a coordinate vector p_∞ = (u_∞, v_∞, 1)^T based on the pixel coordinates of the vanishing point, wherein u_∞ is the pixel abscissa of the vanishing point and v_∞ is the pixel ordinate of the vanishing point;
constructing an internal reference matrix K based on the internal parameters of the vehicle-mounted camera, and calculating the inverse matrix K^(-1) of the internal reference matrix;
computing the intermediate vector (R_xz, R_yz, R_zz)^T by (R_xz, R_yz, R_zz)^T = K^(-1) p_∞ / ||K^(-1) p_∞||.
The calculating the pitch angle and the course angle based on the intermediate vector includes: calculating the pitch angle and the course angle by α = sin^(-1)(R_yz) and β = -tan^(-1)(R_xz / R_zz), wherein α is the pitch angle and β is the course angle.
Optionally, the angle external parameters of the vehicle-mounted camera comprise a pitch angle and a course angle;
the determining the angle external parameter of the vehicle-mounted camera based on the coordinates of the vanishing point comprises the following steps:
searching an angle external reference table based on the pixel coordinates of the vanishing points, and determining the pitch angle and the course angle;
the angle external reference table comprises a corresponding relation between a pixel coordinate and the pitch angle and a corresponding relation between the pixel coordinate and the course angle.
Optionally, the determining the spatial coordinates of the lane line based on the angle external parameter of the vehicle-mounted camera, the position external parameter of the vehicle-mounted camera, the internal parameter of the vehicle-mounted camera, and the pixel coordinates of the lane line includes:
calculating a coordinate conversion relationship between the camera coordinate system and the vehicle coordinate system based on the angle external parameter of the vehicle-mounted camera, the position external parameter of the vehicle-mounted camera, and the internal parameter of the vehicle-mounted camera;
and determining the space coordinates of the lane lines according to the coordinate conversion relation and the pixel coordinates of the lane lines.
Optionally, the determining the pose of the vehicle according to the spatial coordinates of the lane line includes:
determining the space coordinates of a central line according to the space coordinates of the two lane lines, wherein the central line is a virtual line which is positioned between the two lane lines and is parallel to the two lane lines;
determining the distance from the vehicle to the lane line according to the space coordinates of the central line and the space coordinates of the vehicle; or,
and determining the distance from the vehicle to the lane line according to the space coordinate of one lane line and the space coordinate of the vehicle.
Optionally, the determining the pose of the vehicle according to the spatial coordinates of the lane line includes:
determining the extending direction of the lane line in the vehicle coordinate system according to the space coordinates of the lane line;
and determining the course angle of the vehicle relative to the running road according to the extending direction.
In a second aspect, an embodiment of the present disclosure provides a vehicle pose determining apparatus, including:
an image acquisition unit for acquiring a road image, wherein the road image is an image formed by shooting a driving road by a vehicle-mounted camera;
the vanishing point determining unit is used for extracting the characteristics of the road image to obtain straight lines where two parallel lane lines are located in the road image, and determining vanishing points based on the two straight lines, wherein the vanishing points are intersection points of the two straight lines in the road image;
an angle external parameter determining unit for determining an angle external parameter of the vehicle-mounted camera based on the coordinates of the vanishing point;
a space coordinate determining unit, configured to determine a space coordinate of the lane line based on an angle external parameter of the vehicle-mounted camera, a position external parameter of the vehicle-mounted camera, an internal parameter of the vehicle-mounted camera, and a pixel coordinate of the lane line, where the space coordinate is a coordinate of the lane line in a vehicle coordinate system;
and the pose determining unit is used for determining the pose of the vehicle according to the space coordinates of the lane line, wherein the pose is the pose of the vehicle relative to the lane line.
In a third aspect, embodiments of the present disclosure provide a computing device comprising a processor and a memory for storing a computer program; the computer program, when loaded by the processor, causes the processor to perform the vehicle pose determination method as described above.
In a fourth aspect, the disclosed embodiments provide a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the vehicle pose determination method as described above.
Compared with the prior art, the technical scheme provided by the embodiment of the disclosure has the following advantages:
according to the scheme provided by the embodiment of the disclosure, based on the characteristic that the road image comprises two parallel lanes, vanishing points are determined according to pixel coordinates of the two parallel lanes, angle external parameters of the vehicle-mounted camera are determined according to the vanishing points, space coordinates of the lanes are determined according to the angle external parameters and other known parameters, and the pose of the vehicle is determined according to the space coordinates of the lanes. By adopting the scheme provided by the embodiment of the disclosure, the real-time angle external parameters of the vehicle-mounted camera can be determined according to the real-time road image, and the calculation of the space coordinates of the lane lines can be realized according to the real-time angle external parameters, so that the pose of the vehicle can be more accurately determined.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments of the present disclosure or the prior art, the drawings that are used in the description of the embodiments or the prior art will be briefly described below. It will be obvious to those skilled in the art that other figures can be obtained from these figures without inventive effort, in which:
FIG. 1 is a flow chart of a vehicle pose determination method provided by an embodiment of the present disclosure;
fig. 2 is a schematic structural view of a vehicle pose determination apparatus provided in an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a computing device provided by some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and completely. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below. It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of the functions performed by these devices, modules, or units.
It should be noted that references to "one" and "a plurality" in this disclosure are intended to be illustrative rather than limiting, and those of ordinary skill in the art will appreciate that "one" should be understood as "one or more" unless the context clearly indicates otherwise.
The embodiment of the disclosure provides a vehicle pose determining method, which is used for determining an external parameter of a vehicle-mounted camera relative to a vehicle coordinate system in real time by processing a road image shot by the vehicle-mounted camera, so as to more accurately determine the pose of the vehicle relative to a road lane line based on the external parameter.
Fig. 1 is a flowchart of a vehicle pose determination method provided by an embodiment of the present disclosure. As shown in fig. 1, the vehicle pose determining method provided by the embodiment of the present disclosure includes S110 to S150.
The vehicle pose determination method provided by the embodiment of the disclosure is executed by a computing device. The computing device may be a local computing device configured in the vehicle, a remote server, or a virtual device formed by a combination of the local computing device and the remote server.
S110: and acquiring a road image, wherein the road image is an image formed by shooting a driving road by an on-board camera.
In the embodiment of the present disclosure, the vehicle-mounted camera is installed at a high position on the vehicle, such as the top of the cab at the vehicle head, and captures road images while the vehicle travels on the road. The driving road is provided with at least two lane lines that are parallel to each other, so the road image captured by the vehicle-mounted camera contains pixels representing at least two parallel lane lines.
S120: and extracting the characteristics of the road image to obtain straight lines where two parallel lane lines in the road image are located, and determining vanishing points based on the two straight lines.
In the embodiment of the disclosure, the computing device may perform feature extraction on the road image using various existing image processing methods, determine the pixel coordinates of at least two parallel lane lines in the road image, and use those pixel coordinates to determine the straight lines on which the lane lines lie. The aforementioned straight lines are lines in the road image.
Because only two parallel lane lines are needed to determine the vanishing point, in implementations the computing device may select, from among the pixel coordinates of the at least two parallel lane lines, the pixel coordinates characterizing two of the lane lines, and determine the corresponding straight lines from them. The pixel coordinates used are preferably those from the middle region of the road image; that is, the chosen lane lines are preferably the lane lines in the middle region of the road image.
In a specific implementation, the computing device may obtain the straight lines on which two parallel lane lines in the road image lie by one of the following methods: (1) processing the road image with a pre-trained lane-line recognition network model; (2) processing the road image with classical image processing methods such as the Hough transform or edge detection. Considering that the driving road is an open road, and that weather or occlusion by road obstacles may cause errors in straight-line recognition, in a specific implementation a lane-line recognition network model is preferably used to process the road image to obtain the straight lines on which the two parallel lane lines lie.
After obtaining the straight lines where the two parallel lane lines in the road image are located, the computing device can calculate a function formula of the two straight lines, determine an intersection point of the two straight lines based on the function formula, and take the intersection point as a vanishing point.
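Purely as an illustrative sketch (not taken from the patent), once the two straight lines have been fitted, their intersection, i.e. the vanishing point, can be computed with the homogeneous cross product of the line coefficients; the coefficient form (a, b, c) of each line is an assumed interface for the feature-extraction output:

```python
def vanishing_point(line1, line2):
    """Intersection of two image lines, each given as coefficients (a, b, c)
    of a*u + b*v + c = 0, via the cross product of homogeneous line vectors."""
    a1, b1, c1 = line1
    a2, b2, c2 = line2
    # cross product (a1, b1, c1) x (a2, b2, c2) gives the homogeneous point
    u = b1 * c2 - c1 * b2
    v = c1 * a2 - a1 * c2
    w = a1 * b2 - b1 * a2
    if abs(w) < 1e-12:
        raise ValueError("the two lines are parallel in the image")
    return u / w, v / w
```

If the two fitted lines are (numerically) parallel in the image, no finite vanishing point exists, which the sketch reports as an error rather than returning a point at infinity.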
The vanishing point is the visual intersection of the parallel lane lines in the road image. When the parallel lane lines are photographed from different angles and positions, the pixel coordinates of the vanishing point in the road image differ, so the shooting angle can be determined from the pixel coordinates of the vanishing point in the road image. The shooting angle corresponds to the angle external parameters of the vehicle-mounted camera, and the shooting position corresponds to the position external parameters of the vehicle-mounted camera in the vehicle.
S130: and determining the angle external parameters of the vehicle-mounted camera based on the pixel coordinates of the vanishing point.
Because the pixel coordinates of the vanishing point in the road image have an association relationship with the angle external parameter of the vehicle-mounted camera, the angle external parameter of the vehicle-mounted camera can be reversely determined according to the pixel coordinates of the vanishing point.
The specific manner of determining the angle external parameters of the vehicle-mounted camera based on the pixel coordinates of the vanishing point is analyzed in detail below.
S140: and determining the space coordinates of the lane lines based on the angle external parameters of the vehicle-mounted camera, the position external parameters of the vehicle-mounted camera, the internal parameters of the vehicle-mounted camera and the pixel coordinates of the lane lines, wherein the space coordinates are the coordinates of the lane lines in a vehicle coordinate system.
As mentioned above, the pixel coordinates of the vanishing point in the road image are also related to the position external parameters of the vehicle-mounted camera. However, since the position of the vehicle-mounted camera in the vehicle changes little, that is, the position external parameters have little influence on the pixel coordinates of the vanishing point, this scheme treats the position external parameters of the vehicle-mounted camera as fixed values and does not consider their variation. In a specific implementation, the position external parameter of the vehicle-mounted camera is the position coordinate of the vehicle-mounted camera in the vehicle coordinate system.
Intrinsic parameters of an in-vehicle camera are parameters characterizing the imaging characteristics of the camera interior, including the lateral focal length f_u, the longitudinal focal length f_v, the principal point coordinates (u_0, v_0), and an optical-axis skew parameter s. In practice, the internal parameters of the onboard camera may be represented by an internal parameter matrix K = [[f_u, s, u_0], [0, f_v, v_0], [0, 0, 1]].
in a specific implementation of the embodiment of the disclosure, determining the spatial coordinates of the lane line may be divided into two steps based on the angle external parameter of the vehicle-mounted camera, the position external parameter of the vehicle-mounted camera, the internal parameter of the vehicle-mounted camera, and the pixel coordinates of the lane line. Firstly, determining a conversion relation between an imaging plane coordinate system of the vehicle-mounted camera and a vehicle coordinate system according to an angle external parameter, a position external parameter and an internal parameter of the vehicle-mounted camera, and then determining a space coordinate of a lane line according to the conversion relation and a pixel coordinate of the lane line.
If the pixel coordinates of a lane-line point in the road image are (u_l, v_l) and its coordinates in the vehicle coordinate system are (x, y, z)^T, the space coordinates can be obtained by back-projecting the pixel ray onto the road plane: with the ray direction d = (d_x, d_y, d_z)^T = R_cr K^(-1) (u_l, v_l, 1)^T, the space coordinates are (x, y, z)^T = (0, 0, h)^T - (h / d_z) d. Here h is the position external parameter of the vehicle-mounted camera, specifically the height of the camera above the origin of the vehicle coordinate system (in this embodiment the lateral coordinate of the vehicle-mounted camera is considered to be the same as that of the vehicle-coordinate-system origin, so it is not introduced into the formula), and R_cr is the rotation between the road-surface frame and the vehicle body, composed of the pitch angle and heading angle determined above together with a roll angle. In a specific implementation, since the vehicle-mounted camera does not roll relative to the vehicle coordinate system, the roll angle roll in the above formula may be set directly to the initially calibrated value, that is, the roll angle roll is a set fixed value.
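A minimal numeric sketch of such a ground-plane back-projection, under assumed conventions that are illustrative rather than fixed by the patent (vehicle frame with x lateral, y forward, z up; camera optical centre at (0, 0, h); road plane z = 0):

```python
import numpy as np

def backproject_to_ground(uv, K, R_cr, h):
    """Back-project a lane-line pixel onto the road plane z = 0.

    Assumed conventions (illustrative):
      - vehicle frame: x lateral, y forward, z up, origin on the road plane
      - camera optical centre at (0, 0, h) in the vehicle frame
      - K    : 3x3 intrinsic matrix
      - R_cr : 3x3 rotation taking camera-frame directions to the vehicle frame
    """
    ray_cam = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    d = R_cr @ ray_cam              # viewing ray in the vehicle frame
    lam = -h / d[2]                 # scale at which the ray meets z = 0
    return np.array([0.0, 0.0, h]) + lam * d
```

Any rotation convention works as long as the same R_cr is used consistently for projection and back-projection; points that actually lie on the road plane are then recovered exactly.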
The spatial coordinates of the lane line may be coordinates of a lane line middle line or coordinates of a lane line edge, and the embodiment of the present disclosure is not particularly limited. It should be noted, however, that if the coordinates of the lane line are coordinates of a lane line middle line, the width of the lane line should be known.
S150: and determining the pose of the vehicle according to the space coordinates of the lane lines.
In the disclosed embodiment, the external dimensions of the vehicle are known, and therefore the spatial coordinates of the vehicle edge profile in the vehicle coordinate system are known. After the spatial coordinates of the lane lines are determined, the pose of the vehicle may then be determined from the spatial coordinates of the lane lines and the spatial coordinates of the vehicle edge profile. The aforementioned pose is the pose of the vehicle with respect to the lane line.
The pose of the vehicle is the pose of the vehicle relative to the lane line, and includes the distance of the vehicle from the lane line and the heading angle of the vehicle relative to the lane line; the latter may also be understood as the heading angle of the vehicle relative to the driving road. How the pose of the vehicle is specifically determined is analyzed in detail below.
According to the vehicle pose determination method provided by the embodiment of the disclosure, based on the fact that the road image contains two parallel lane lines, the vanishing point is determined from the pixel coordinates of the two parallel lane lines, the angle external parameters of the vehicle-mounted camera are determined from the vanishing point, the space coordinates of the lane lines are determined from the angle external parameters and other known parameters, and the pose of the vehicle is determined from the space coordinates of the lane lines. With the method provided by the embodiment of the disclosure, the real-time angle external parameters of the vehicle-mounted camera can be determined from the real-time road image, and the lane-line space coordinates can be computed from those real-time angle external parameters, so the pose of the vehicle can be determined more accurately.
As before, in the presently disclosed embodiments, the angular outlier of the onboard camera may be determined based on the pixel coordinates of the vanishing point. In practical application, the angle external parameters of the vehicle-mounted camera comprise a pitch angle and a course angle.
In some embodiments, determining the angular profile of the onboard camera from the pixel coordinates of the vanishing point includes S131-S132.
S131: an intermediate vector is calculated based on the pixel coordinates of the vanishing point and the internal parameters of the onboard camera.
The intermediate vector is a vector that directly relates the vanishing-point pixel coordinates to the angle external parameters of the vehicle-mounted camera, and may be denoted (R_xz, R_yz, R_zz)^T. Calculating the intermediate vector based on the pixel coordinates of the vanishing point and the internal parameters of the in-vehicle camera may specifically include: first, constructing a coordinate vector p_∞ = (u_∞, v_∞, 1)^T based on the pixel coordinates of the vanishing point, wherein u_∞ is the pixel abscissa and v_∞ the pixel ordinate of the vanishing point; then constructing an internal reference matrix K based on the internal parameters of the vehicle-mounted camera and calculating its inverse matrix K^(-1); and subsequently computing the intermediate vector as (R_xz, R_yz, R_zz)^T = K^(-1) p_∞ / ||K^(-1) p_∞||.
S132: the pitch angle and heading angle are calculated based on the elements in the intermediate vector.
After the intermediate vector is obtained, the pitch angle and the heading angle can be calculated from it, yielding the angle external parameters of the vehicle-mounted camera. In particular implementations, the computing device may calculate α = sin^(-1)(R_yz) and β = -tan^(-1)(R_xz / R_zz), wherein α is the pitch angle and β is the heading angle.
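The two formulas above can be sketched directly in code. The intrinsic values used in the example are invented for illustration; only the formulas themselves come from the text (np.arctan2 is used here in place of tan^(-1), which agrees with it when R_zz > 0, i.e. a forward-facing camera):

```python
import numpy as np

def angles_from_vanishing_point(u_inf, v_inf, K):
    """Pitch and heading of the camera from the lane-line vanishing point:
      (R_xz, R_yz, R_zz)^T = K^-1 p_inf / ||K^-1 p_inf||
      pitch   alpha = asin(R_yz)
      heading beta  = -atan(R_xz / R_zz)
    """
    r = np.linalg.inv(K) @ np.array([u_inf, v_inf, 1.0])
    r = r / np.linalg.norm(r)          # intermediate vector (R_xz, R_yz, R_zz)^T
    R_xz, R_yz, R_zz = r
    alpha = float(np.arcsin(R_yz))
    beta = float(-np.arctan2(R_xz, R_zz))
    return alpha, beta
```

With the vanishing point at the principal point both angles are zero; as the vanishing point moves vertically away from the principal point, the recovered pitch grows as the arctangent of the normalized offset.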
In still other embodiments, determining the angular outlier of the onboard camera from the pixel coordinates of the vanishing point includes S133.
S133: and searching an angle external reference table based on the pixel coordinates of the vanishing point, and determining a pitch angle and a course angle.
The angle external reference table comprises a correspondence between pixel coordinates and the pitch angle and a correspondence between pixel coordinates and the heading angle. The correspondences in the table can be determined by experimental calibration: road images are captured under various combinations of angle external parameters and vehicle poses, the pixel coordinates of the vanishing point are determined, and the correspondence between the pixel coordinates and the pitch angle and heading angle is established.
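As a rough sketch of such a lookup: the patent does not specify the table layout, and a real table would presumably be indexed by both pixel coordinates, so this 1-D linearly interpolated lookup over one coordinate is only illustrative:

```python
import bisect

def lookup_angle(table_keys, table_vals, coord):
    """Linearly interpolated lookup in a pre-calibrated angle table.

    table_keys: sorted vanishing-point coordinates from calibration
    table_vals: corresponding calibrated angles (e.g. pitch)
    coord     : vanishing-point coordinate observed at run time
    Values outside the calibrated range are clamped to the table ends.
    """
    i = bisect.bisect_left(table_keys, coord)
    if i <= 0:
        return table_vals[0]
    if i >= len(table_keys):
        return table_vals[-1]
    k0, k1 = table_keys[i - 1], table_keys[i]
    t = (coord - k0) / (k1 - k0)
    return table_vals[i - 1] * (1 - t) + table_vals[i] * t
```

Interpolating between calibrated entries keeps the table small while still returning a smooth angle estimate between calibration points.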
As before, the relative position of the vehicle includes the distance of the vehicle from the lane line and the heading angle of the vehicle with respect to the running road.
In particular implementations, determining the relative position of the vehicle in the driving road from the spatial coordinates of the lane lines by the computing device includes S151-S152.
S151: and determining the space coordinates of a central line according to the space coordinates of the two lane lines, wherein the central line is a virtual line which is positioned between the two lane lines and is parallel to the two lane lines.
S152: and determining the distance from the vehicle to the lane line according to the space coordinates of the central line and the space coordinates of the vehicle.
In a specific implementation, the two lane lines are preferably adjacent lane lines, and the center line determined by the space coordinates of the two corresponding lane lines is the center line of the lane. That is, S152 is to determine the distance of the vehicle with respect to the lane line based on the spatial coordinates of the lane center line and the spatial coordinates of the vehicle.
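A minimal sketch of S151-S152, assuming (this axis convention and the sampled-point interface are illustrative) that each lane line is given as vehicle-frame samples at matching forward distances, with x lateral and y forward:

```python
def centerline(left_pts, right_pts):
    """Centre line as the point-wise mean of two lane lines sampled at the
    same longitudinal (forward) positions in the vehicle frame."""
    return [((xl + xr) / 2.0, (yl + yr) / 2.0)
            for (xl, yl), (xr, yr) in zip(left_pts, right_pts)]

def distance_to_centerline(left_pts, right_pts):
    """Lateral distance from the vehicle origin (0, 0) to the lane centre
    line, taken at the sample closest to the vehicle in forward distance."""
    cx, cy = min(centerline(left_pts, right_pts), key=lambda p: abs(p[1]))
    return abs(cx)
```

Using the sample nearest the vehicle keeps the estimate local; a smoother variant would interpolate the centre line to y = 0 before taking the lateral offset.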
In other embodiments, the computing device may also determine the distance of the longitudinal centerline of the vehicle relative to the lane line based on the spatial coordinates of the lane line, and thus the distance of the vehicle from the lane line.
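Steps S151-S152 can be sketched as follows, assuming lane-line points are already expressed in a vehicle frame with x forward, y to the left, and the vehicle origin at (0, 0); the function name and point layout are assumptions for illustration:

```python
import math
import numpy as np

def distance_to_lane_center(left_line, right_line):
    """S151-S152 sketch: average two lane lines into a center line, fit it
    as y = a*x + b in the vehicle frame, and return the perpendicular
    distance from the vehicle origin (0, 0) to that center line."""
    center = (np.asarray(left_line, float) + np.asarray(right_line, float)) / 2.0
    a, b = np.polyfit(center[:, 0], center[:, 1], 1)  # slope and intercept
    return abs(b) / math.hypot(1.0, a)  # point-to-line distance from the origin
```

For a straight lane with the left line at y = 2.0 m and the right line at y = -1.0 m, the center line sits at y = 0.5 m, so the vehicle is 0.5 m off the lane center.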
In particular implementations, determining the heading angle of the vehicle relative to the driving road based on the spatial coordinates of the lane lines may include S153-S154.
S153: determining the extending direction of the lane line in the vehicle coordinate system from the spatial coordinates of the lane line.
S154: determining the heading angle of the vehicle relative to the lane line from the extending direction.
Determining the extending direction of the lane line in the vehicle coordinate system from the spatial coordinates of the lane line means determining, from the coordinates of the lane line in the vehicle coordinate system, the included angle of the lane line relative to a coordinate axis of that system. After this included angle is determined, the included angle between the vehicle head direction and the extending direction of the lane line is obtained, and this included angle is taken as the heading angle of the vehicle. In practical applications, since the vehicle lies in the plane of the driving road, the extending direction is determined from the lane-line coordinates projected onto that plane and the longitudinal axis of the vehicle.
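Steps S153-S154 can be sketched as below; the sign convention (positive heading when the vehicle points to the left of the lane direction) is an assumption, as is the function name:

```python
import math

def heading_angle_from_lane_line(lane_points):
    """S153-S154 sketch: take the lane line's extending direction in the
    vehicle frame (x along the vehicle's longitudinal axis, y to the left)
    and return the included angle between the vehicle heading and it."""
    (x0, y0) = lane_points[0]
    (x1, y1) = lane_points[-1]
    lane_direction = math.atan2(y1 - y0, x1 - x0)  # lane angle vs. vehicle x-axis
    return -lane_direction  # vehicle heading relative to the road direction
```

A lane line running straight ahead gives a heading angle of zero; a lane line bearing to the left means the vehicle is headed to the right of the road, yielding a negative angle under this convention.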
In addition to providing the aforementioned vehicle pose determination method, embodiments of the present disclosure also provide a vehicle pose determination device. Fig. 2 is a schematic structural view of a vehicle pose determination apparatus provided in an embodiment of the present disclosure. As shown in fig. 2, the vehicle pose determining apparatus 200 includes an image acquisition unit 201, a vanishing point determining unit 202, an angle outlier determining unit 203, a spatial coordinate determining unit 204, and a pose determining unit 205.
The image acquisition unit 201 is configured to acquire a road image, which is an image formed by capturing a traveling road with an in-vehicle camera.
The vanishing point determining unit 202 is configured to perform feature extraction on the road image to obtain the straight lines on which two parallel lane lines in the road image lie, and to determine a vanishing point based on the two straight lines, where the vanishing point is the intersection point of the two straight lines in the road image.
The angle external parameter determining unit 203 is configured to determine the angle external parameter of the vehicle-mounted camera based on the pixel coordinates of the vanishing point.
The spatial coordinate determining unit 204 is configured to determine spatial coordinates of the lane line based on the angle external parameter of the vehicle-mounted camera, the position external parameter of the vehicle-mounted camera, the internal parameter of the vehicle-mounted camera, and the pixel coordinates of the lane line, where the spatial coordinates are coordinates of the lane line in a vehicle coordinate system.
The pose determining unit 205 is configured to determine a pose of the vehicle according to the spatial coordinates of the lane line, where the pose is a pose of the vehicle relative to a road where the lane line is located.
In some embodiments, the angle external parameter of the vehicle-mounted camera includes a pitch angle and a heading angle. The angle external parameter determining unit 203 includes an intermediate vector calculation subunit and an angle external parameter calculation subunit. The intermediate vector calculation subunit calculates an intermediate vector based on the pixel coordinates of the vanishing point and the internal parameters of the vehicle-mounted camera; the angle external parameter calculation subunit calculates the pitch angle and the heading angle based on elements of the intermediate vector.
In some embodiments, the intermediate vector calculation subunit constructs the coordinate vector p∞ = (u∞, v∞, 1)^T based on the pixel coordinates of the vanishing point, where u∞ is the pixel abscissa of the vanishing point and v∞ is the pixel ordinate of the vanishing point; constructs the internal parameter matrix K based on the internal parameters of the vehicle-mounted camera and calculates its inverse matrix K⁻¹; and then computes the intermediate vector (Rxz, Ryz, Rzz)^T = K⁻¹p∞ / ||K⁻¹p∞||. Correspondingly, the angle external parameter calculation subunit computes α = sin⁻¹(Ryz) and β = −tan⁻¹(Rxz/Rzz), where α is the pitch angle and β is the heading angle.
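The computation in these subunits can be sketched directly from the formulas above; K is the 3×3 internal parameter matrix, and the function name is an assumption:

```python
import numpy as np

def angles_from_vanishing_point(u_inf, v_inf, K):
    """Compute pitch (alpha) and heading (beta) from the vanishing point:
    normalise K^-1 * p_inf to get (Rxz, Ryz, Rzz)^T, then
    alpha = asin(Ryz) and beta = -atan(Rxz / Rzz)."""
    p_inf = np.array([u_inf, v_inf, 1.0])        # homogeneous pixel vector
    d = np.linalg.inv(K) @ p_inf                 # back-projected viewing ray
    r_xz, r_yz, r_zz = d / np.linalg.norm(d)     # unit intermediate vector
    alpha = np.arcsin(r_yz)                      # pitch angle
    beta = -np.arctan2(r_xz, r_zz)               # heading angle
    return alpha, beta
```

When the vanishing point coincides with the principal point, the ray is the optical axis and both angles are zero; a vanishing point below the principal point indicates a positive pitch under this convention.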
In some embodiments, the angle external parameter determining unit 203 searches an angle external parameter lookup table based on the pixel coordinates of the vanishing point and determines the pitch angle and the heading angle; the lookup table records the correspondence between pixel coordinates and the pitch angle and between pixel coordinates and the heading angle.
In some embodiments, the spatial coordinate determining unit 204 first calculates the coordinate conversion relationship between the camera coordinate system and the vehicle coordinate system based on the angle external parameter, the position external parameter, and the internal parameters of the vehicle-mounted camera, and then determines the spatial coordinates of the lane line from this conversion relationship and the pixel coordinates of the lane line.
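A sketch of this two-step conversion under the flat-road assumption is given below. The axis conventions (camera: x right, y down, z forward; vehicle: x forward, y left, z up), the rotation composition, and the function name are assumptions chosen for illustration; cam_pos is the camera's position external parameter in the vehicle frame, with its z component the mounting height:

```python
import numpy as np

def lane_pixel_to_vehicle(u, v, K, pitch, heading, cam_pos):
    """Back-project lane-line pixel (u, v) through the camera's internal and
    external parameters onto the road plane z = 0 of the vehicle frame."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # viewing ray, camera frame
    # Base alignment: camera z -> vehicle x, camera x -> -vehicle y, camera y -> -vehicle z.
    R0 = np.array([[0.0, 0.0, 1.0],
                   [-1.0, 0.0, 0.0],
                   [0.0, -1.0, 0.0]])
    ca, sa = np.cos(pitch), np.sin(pitch)
    cb, sb = np.cos(heading), np.sin(heading)
    Ry = np.array([[ca, 0.0, sa], [0.0, 1.0, 0.0], [-sa, 0.0, ca]])  # pitch about vehicle y
    Rz = np.array([[cb, -sb, 0.0], [sb, cb, 0.0], [0.0, 0.0, 1.0]])  # heading about vehicle z
    ray_veh = Rz @ Ry @ R0 @ ray_cam                     # ray in the vehicle frame
    t = -cam_pos[2] / ray_veh[2]                         # intersect road plane z = 0
    point = np.asarray(cam_pos, float) + t * ray_veh
    return point[0], point[1]                            # lane point (x, y) in vehicle frame
```

For example, with a level camera 1.5 m above the road and focal length 1000 px, a pixel 100 px below the principal point back-projects to a road point 15 m ahead of the vehicle.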
In some embodiments, the pose determining unit 205 includes a position determining subunit that determines the spatial coordinates of a center line from the spatial coordinates of two lane lines, where the center line is a virtual line located between and parallel to the two lane lines, and then determines the distance from the vehicle to the lane line from the spatial coordinates of the center line and the spatial coordinates of the vehicle; alternatively, the distance from the vehicle to a lane line is determined from the spatial coordinates of that lane line and the spatial coordinates of the vehicle.
In some embodiments, the pose determining unit 205 includes a heading angle determining subunit, which determines the extending direction of the lane line in the vehicle coordinate system from the spatial coordinates of the lane line and determines the heading angle of the vehicle relative to the lane line from that extending direction.
The embodiment of the disclosure also provides a computing device for realizing the vehicle pose determining method. Fig. 3 is a schematic structural diagram of a computing device provided by some embodiments of the present disclosure. Referring now in particular to FIG. 3, a schematic diagram of a computing device 300 suitable for use in implementing embodiments of the present disclosure is shown. The computing device illustrated in fig. 3 is merely an example and should not be taken as limiting the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 3, the computing device 300 may include a processing means (e.g., a central processing unit, a graphics processing unit, etc.) 301, which may perform various suitable actions and processes according to a program stored in a read-only memory (ROM) 302 or loaded from a storage means 308 into a random access memory (RAM) 303. The RAM 303 also stores various programs and data required for the operation of the computing device 300. The processing means 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.
In general, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, a touch pad, an in-vehicle camera, a microphone, an accelerometer, a gyroscope, and the like; output devices 307 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, and the like; storage devices 308 including, for example, magnetic tape, hard disk, and the like; and communication means 309. The communication means 309 may allow the computing device 300 to communicate wirelessly or by wire with other devices to exchange data. While fig. 3 illustrates a computing device 300 having various means, it should be understood that not all of the illustrated means are required to be implemented or provided; more or fewer means may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via a communication device 309, or installed from a storage device 308, or installed from a ROM 302. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing means 301.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, clients and computing devices may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the computing device; or may exist alone without being assembled into the computing device.
The computer readable medium carries one or more programs which, when executed by the computing device, cause the computing device to: acquiring a road image, wherein the road image is an image formed by shooting a driving road by a vehicle-mounted camera; extracting features of the road image to obtain straight lines where two parallel lane lines in the road image are located, and determining vanishing points based on the two straight lines, wherein the vanishing points are intersection points of the two straight lines in the road image; determining an angle external parameter of the vehicle-mounted camera based on the pixel coordinates of the vanishing point; determining the space coordinates of the lane lines based on the angle external parameters of the vehicle-mounted camera, the position external parameters of the vehicle-mounted camera, the internal parameters of the vehicle-mounted camera and the pixel coordinates of the lane lines, wherein the space coordinates are the coordinates of the lane lines in a vehicle coordinate system; and determining the pose of the vehicle according to the space coordinates of the lane lines, wherein the pose is the pose of the vehicle relative to the lane lines.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or computing device. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection according to one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The embodiments of the present disclosure further provide a computer readable storage medium, where a computer program is stored, where the computer program, when executed by a processor, may implement a method according to any one of the foregoing method embodiments, and the implementation manner and beneficial effects of the method are similar, and are not described herein again.
Embodiments of the present disclosure also provide a vehicle including the foregoing computing device. The specific vehicle may be a fuel vehicle, a pure electric vehicle, etc., and the embodiments of the present disclosure are not limited.
It should be noted that in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely a specific embodiment of the disclosure to enable one skilled in the art to understand or practice the disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown and described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (7)

1. A vehicle pose determination method, characterized by comprising:
acquiring a road image, wherein the road image is an image formed by shooting a driving road by a vehicle-mounted camera;
extracting features of the road image to obtain straight lines where two parallel lane lines in the road image are located, and determining vanishing points based on the two straight lines, wherein the vanishing points are intersection points of the two straight lines in the road image;
determining an angle external parameter of the vehicle-mounted camera based on the pixel coordinates of the vanishing point, including:
constructing a coordinate vector p∞ = (u∞, v∞, 1)^T based on the pixel coordinates of the vanishing point, wherein u∞ is the pixel abscissa of the vanishing point and v∞ is the pixel ordinate of the vanishing point;
constructing an internal parameter matrix based on the internal parameters of the vehicle-mounted camera, and calculating an inverse matrix K⁻¹ of the internal parameter matrix;
calculating an intermediate vector (Rxz, Ryz, Rzz)^T = K⁻¹p∞ / ||K⁻¹p∞||;
calculating a pitch angle and a heading angle as α = sin⁻¹(Ryz) and β = −tan⁻¹(Rxz/Rzz), wherein α is the pitch angle and β is the heading angle;
determining the space coordinates of the lane lines based on the angle external parameters of the vehicle-mounted camera, the position external parameters of the vehicle-mounted camera, the internal parameters of the vehicle-mounted camera and the pixel coordinates of the lane lines, wherein the space coordinates are the coordinates of the lane lines in a vehicle coordinate system;
and determining the pose of the vehicle according to the space coordinates of the lane line, wherein the pose is the pose of the vehicle relative to the lane line.
2. The method of claim 1, wherein the determining the spatial coordinates of the lane line based on the angle external parameter of the vehicle-mounted camera, the position external parameter of the vehicle-mounted camera, the internal parameters of the vehicle-mounted camera, and the pixel coordinates of the lane line comprises:
calculating a coordinate conversion relation between the camera coordinate system and the vehicle coordinate system based on the angle external parameter of the vehicle-mounted camera, the position external parameter of the vehicle-mounted camera and the internal parameter of the vehicle-mounted camera;
and determining the space coordinates of the lane lines according to the coordinate conversion relation and the pixel coordinates of the lane lines.
3. The method of claim 1, wherein the determining the pose of the vehicle from the spatial coordinates of the lane lines comprises:
determining the space coordinates of a central line according to the space coordinates of the two lane lines, wherein the central line is a virtual line which is positioned between the two lane lines and is parallel to the two lane lines;
determining the distance from the vehicle to the lane line according to the space coordinates of the central line and the space coordinates of the vehicle; or,
and determining the distance from the vehicle to the lane line according to the space coordinate of one lane line and the space coordinate of the vehicle.
4. The method of claim 1, wherein the determining the pose of the vehicle from the spatial coordinates of the lane lines comprises:
determining the extending direction of the lane line in the vehicle coordinate system according to the space coordinates of the lane line;
and determining the heading angle of the vehicle relative to the driving road according to the extending direction.
5. A vehicle pose determining apparatus, characterized by comprising:
an image acquisition unit for acquiring a road image, wherein the road image is an image formed by shooting a driving road by a vehicle-mounted camera;
the vanishing point determining unit is used for extracting the characteristics of the road image to obtain straight lines where two parallel lane lines are located in the road image, and determining vanishing points based on the two straight lines, wherein the vanishing points are intersection points of the two straight lines in the road image;
an angle external parameter determining unit, configured to determine an angle external parameter of the vehicle-mounted camera based on the pixel coordinates of the vanishing point, the angle external parameter determining unit being specifically configured to:
construct a coordinate vector p∞ = (u∞, v∞, 1)^T based on the pixel coordinates of the vanishing point, wherein u∞ is the pixel abscissa of the vanishing point and v∞ is the pixel ordinate of the vanishing point;
construct an internal parameter matrix based on the internal parameters of the vehicle-mounted camera, and calculate an inverse matrix K⁻¹ of the internal parameter matrix;
calculate an intermediate vector (Rxz, Ryz, Rzz)^T = K⁻¹p∞ / ||K⁻¹p∞||;
calculate a pitch angle and a heading angle as α = sin⁻¹(Ryz) and β = −tan⁻¹(Rxz/Rzz), wherein α is the pitch angle and β is the heading angle;
a space coordinate determining unit, configured to determine a space coordinate of the lane line based on an angle external parameter of the vehicle-mounted camera, a position external parameter of the vehicle-mounted camera, an internal parameter of the vehicle-mounted camera, and a pixel coordinate of the lane line, where the space coordinate is a coordinate of the lane line in a vehicle coordinate system;
and the pose determining unit is used for determining the pose of the vehicle according to the space coordinates of the lane line, wherein the pose is the pose of the vehicle relative to the lane line.
6. A computing device comprising a processor and a memory, the memory for storing a computer program;
the computer program, when loaded by the processor, causes the processor to perform the vehicle pose determination method according to any of claims 1-4.
7. A computer-readable storage medium, characterized in that the storage medium stores a computer program, which when executed by a processor causes the processor to implement the vehicle pose determination method according to any of claims 1-4.
CN202211515807.XA 2022-11-29 2022-11-29 Vehicle pose determination method, device, computing equipment and medium Active CN116309814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211515807.XA CN116309814B (en) 2022-11-29 2022-11-29 Vehicle pose determination method, device, computing equipment and medium


Publications (2)

Publication Number Publication Date
CN116309814A CN116309814A (en) 2023-06-23
CN116309814B true CN116309814B (en) 2024-03-08

Family

ID=86817374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211515807.XA Active CN116309814B (en) 2022-11-29 2022-11-29 Vehicle pose determination method, device, computing equipment and medium

Country Status (1)

Country Link
CN (1) CN116309814B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109345593A (en) * 2018-09-04 2019-02-15 海信集团有限公司 A kind of detection method and device of video camera posture
CN112927303A (en) * 2021-02-22 2021-06-08 中国重汽集团济南动力有限公司 Lane line-based automatic driving vehicle-mounted camera pose estimation method and system
CN113012226A (en) * 2021-03-22 2021-06-22 浙江商汤科技开发有限公司 Camera pose estimation method and device, electronic equipment and computer storage medium
CN113781562A (en) * 2021-09-13 2021-12-10 山东大学 Lane line virtual and real registration and self-vehicle positioning method based on road model
CN114241062A (en) * 2021-12-27 2022-03-25 智道网联科技(北京)有限公司 Camera external parameter determination method and device for automatic driving and computer readable storage medium
CN114550042A (en) * 2022-02-18 2022-05-27 深圳海星智驾科技有限公司 Road vanishing point extraction method, vehicle-mounted sensor calibration method and device
CN114549654A (en) * 2022-01-19 2022-05-27 福思(杭州)智能科技有限公司 External parameter calibration method, device, equipment and storage medium for vehicle-mounted camera
CN114913500A (en) * 2022-07-12 2022-08-16 福思(杭州)智能科技有限公司 Pose determination method and device, computer equipment and storage medium
CN115031758A (en) * 2022-04-02 2022-09-09 腾讯科技(深圳)有限公司 Live-action navigation method, device, equipment, storage medium and program product


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Vehicle speed measurement based on vanishing points in driving video and inverse perspective transformation; Guan Chuang; China Safety Science Journal; Vol. 31, No. 6; pp. 128-135 *
Online camera external parameter calibration by matching high-precision maps on open roads; Liao Wenlong; Journal of Image and Graphics; Vol. 26, No. 1; pp. 208-217 *


Similar Documents

Publication Publication Date Title
CN110148185B (en) Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
CN112419385B (en) 3D depth information estimation method and device and computer equipment
EP3845927B1 (en) Merging multiple lidar point cloud data using an iterative closest point (icp) algorithm with weighting factor
CN114399588B (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN115761702B (en) Vehicle track generation method, device, electronic equipment and computer readable medium
CN116164770B (en) Path planning method, path planning device, electronic equipment and computer readable medium
CN116182878B (en) Road curved surface information generation method, device, equipment and computer readable medium
CN114663529B (en) External parameter determining method and device, electronic equipment and storage medium
CN115540894A (en) Vehicle trajectory planning method and device, electronic equipment and computer readable medium
CN114445597B (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN111382695A (en) Method and apparatus for detecting boundary points of object
CN111340880B (en) Method and apparatus for generating predictive model
CN116309814B (en) Vehicle pose determination method, device, computing equipment and medium
CN112649011A (en) Vehicle obstacle avoidance method, device, equipment and computer readable medium
CN114724116B (en) Vehicle traffic information generation method, device, equipment and computer readable medium
WO2020132965A1 (en) Method and apparatus for determining installation parameters of on-board imaging device, and driving control method and apparatus
CN116311155A (en) Obstacle information generation method, obstacle information generation device, electronic device, and computer-readable medium
CN107452230B (en) Obstacle detection method and device, terminal equipment and storage medium
CN115565158A (en) Parking space detection method and device, electronic equipment and computer readable medium
CN111383337B (en) Method and device for identifying objects
CN116563817B (en) Obstacle information generation method, obstacle information generation device, electronic device, and computer-readable medium
CN116563818B (en) Obstacle information generation method, obstacle information generation device, electronic device, and computer-readable medium
CN116740682B (en) Vehicle parking route information generation method, device, electronic equipment and readable medium
CN115848358B (en) Vehicle parking method, device, electronic equipment and computer readable medium
US20230068375A1 (en) Method and system for detecting a three-dimensional object in a two-dimensional image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant