WO2023066719A1 - Method for judging free space between vehicles and apparatus therefor - Google Patents

Method for judging free space between vehicles and apparatus therefor Download PDF

Info

Publication number
WO2023066719A1
WO2023066719A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
feature point
information
free space
blind zone
Prior art date
Application number
PCT/EP2022/078233
Other languages
English (en)
French (fr)
Inventor
Lennard LIN
Original Assignee
Robert Bosch Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch Gmbh filed Critical Robert Bosch Gmbh
Publication of WO2023066719A1

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Definitions

  • the present invention relates to Internet of Vehicles (IoV) technology, in particular to a method for judging free space between vehicles and an apparatus for judging free space between vehicles.
  • IoV technology acquires state information about the vehicle itself and perception information about the surrounding environment (e.g. perception information from roadside sensors) by means of high-precision GPS and onboard sensors (such as millimetre wave radar, onboard cameras, etc.).
  • Information about a large number of vehicles can be analysed and processed by conducting wireless communication and information exchange between one vehicle and another vehicle, between vehicles and the roadside, and between vehicles and the internet, and used for different application scenarios.
  • Fig. 1 shows a schematic drawing of a blind zone in the prior art. As shown in Fig. 1, the roadside sensor has already been mounted at a high position, but a blind zone (represented as a shaded part) still exists between vehicles.
  • the aim of the present invention is to propose a method for judging free space between vehicles that is able to judge a blind zone between vehicles more precisely, and an apparatus for judging free space between vehicles.
  • a method for judging free space between vehicles is characterized in that the method is used for judging a space that exists between a first vehicle and a second vehicle, the method comprising: a blind zone determining step: determining a blind zone between the first vehicle and the second vehicle; an edge information computing step: at least acquiring current feature point information of the first vehicle and second vehicle, and computing edge information in the blind zone between the first vehicle and second vehicle according to the current feature point information of the first vehicle and second vehicle; a closed space generating step: generating a closed space according to the edge information in the blind zone between the first vehicle and second vehicle; and a free space judgement step: judging whether the closed space is a free space based on monitoring of the closed space.
  • the free space judgement step comprises: monitoring whether the first vehicle and second vehicle are continuously adjacent within a specified time and the first vehicle and second vehicle are located in the same lane or adjacent lanes, wherein the closed space is judged to be a free space if the first vehicle and second vehicle are continuously adjacent within the specified time and the first vehicle and second vehicle are located in the same lane or adjacent lanes, and the free space is otherwise judged to be a non-free space.
  • in the edge information computing step, if historical feature point information of the first vehicle and second vehicle is further acquired, edge information in the blind zone between the first vehicle and second vehicle is computed based on the current feature point information of the first vehicle and second vehicle and the historical feature point information of the first vehicle and second vehicle.
  • the edge information computing step comprises: matching the current feature point information of the first vehicle with the historical feature point information of the first vehicle, to obtain a first relative positional relationship of a feature point of the first vehicle relative to a reference point of the first vehicle, and matching the current feature point information of the second vehicle with the historical feature point information of the second vehicle, to obtain a second relative positional relationship of a feature point of the second vehicle relative to a reference point of the second vehicle; based on the first relative positional relationship, the second relative positional relationship, and absolute position information of the reference point of the first vehicle and absolute position information of the reference point of the second vehicle obtained from the outside, computing the vehicle feature point of the first vehicle and the feature point of the second vehicle in the blind zone between the first vehicle and second vehicle; and obtaining edge information in the blind zone between the first vehicle and second vehicle based on the vehicle feature point of the first vehicle and the feature point of the second vehicle.
  • the feature point information comprises one or more of the following: basic feature point information of the vehicle and in-depth feature point information of the vehicle.
  • in the edge information computing step, if a pre-stored common vehicle model is further acquired, edge information in the blind zone between the first vehicle and second vehicle is computed based on the current feature point information of the first vehicle and second vehicle and the pre-stored common vehicle model.
  • the edge information computing step comprises: matching the current feature point information of the first vehicle with the common vehicle model to obtain a first relative positional relationship of a feature point of the first vehicle relative to a reference point of the first vehicle, and matching the current feature point information of the second vehicle with the common vehicle model to obtain a second relative positional relationship of a feature point of the second vehicle relative to a reference point of the second vehicle; based on the first relative positional relationship, the second relative positional relationship, and absolute position information of the reference point of the first vehicle and absolute position information of the vehicle reference point of the second vehicle obtained from the outside, obtaining the vehicle feature point of the first vehicle and the feature point of the second vehicle in the blind zone between the first vehicle and second vehicle; and obtaining edge information in the blind zone between the first vehicle and second vehicle based on the vehicle feature point of the first vehicle and the feature point of the second vehicle.
  • the feature point information comprises one or more of the following: basic feature point information of the vehicle, in-depth feature point information of the vehicle and appearance feature point information of the vehicle.
  • the appearance feature point information comprises one or more of the following: colour feature point information of the vehicle, pattern feature point information of the vehicle and contour feature point information of the vehicle.
  • the first relative positional relationship and the second relative positional relationship are obtained by any one of the following computing methods: Euclidean distance, Mahalanobis distance, Manhattan distance, Chebyshev distance and Minkowski distance.
  • an apparatus for judging free space between vehicles is characterized in that the apparatus is used for judging a space that exists between a first vehicle and a second vehicle, the apparatus comprising: a blind zone determining module, for determining a blind zone between the first vehicle and the second vehicle; an edge information computing module, for at least acquiring current feature point information of the first vehicle and second vehicle, and computing edge information in the blind zone between the first vehicle and second vehicle according to the current feature point information of the first vehicle and second vehicle; a closed space generating module, for generating a closed space according to the edge information in the blind zone between the first vehicle and second vehicle; and a free space judgement module, for judging whether the closed space is a free space based on monitoring of the closed space.
  • the free space judgement module is used for monitoring whether the first vehicle and second vehicle are continuously adjacent within a specified time and the first vehicle and second vehicle are located in the same lane or adjacent lanes.
  • edge information in the blind zone between the first vehicle and second vehicle is computed based on the current feature point information of the first vehicle and second vehicle and additionally acquired historical feature point information of the first vehicle and second vehicle.
  • the edge information computing module comprises: a matching submodule, for matching the current feature point information of the first vehicle with the historical feature point information of the first vehicle, to obtain a first relative positional relationship of a feature point of the first vehicle relative to a reference point of the first vehicle, and matching the current feature point information of the second vehicle with the historical feature point information of the second vehicle, to obtain a second relative positional relationship of a feature point of the second vehicle relative to a reference point of the second vehicle; and a first computing submodule for, based on the first relative positional relationship, the second relative positional relationship, and absolute position information of the reference point of the first vehicle and absolute position information of the reference point of the second vehicle obtained from the outside, computing the vehicle feature point of the first vehicle and the feature point of the second vehicle in the blind zone between the first vehicle and second vehicle; and a second computing submodule, for obtaining edge information in the blind zone between the first vehicle and second vehicle based on the vehicle feature point of the first vehicle and the feature point of the second vehicle.
  • in the edge information computing module, if a pre-stored common vehicle model is further acquired, edge information in the blind zone between the first vehicle and second vehicle is computed based on the current feature point information of the first vehicle and second vehicle and the pre-stored common vehicle model.
  • the edge information computing module comprises: a matching submodule, for matching the current feature point information of the first vehicle with the common vehicle model to obtain a first relative positional relationship of a feature point of the first vehicle relative to a reference point, and matching the current feature point information of the second vehicle with the common vehicle model to obtain a second relative positional relationship of a feature point of the second vehicle relative to a reference point; a first computing submodule for, based on the first relative positional relationship, the second relative positional relationship, and absolute position information of the reference point of the first vehicle and absolute position information of the vehicle reference point of the second vehicle obtained from the outside, obtaining the vehicle feature point of the first vehicle and the feature point of the second vehicle in the blind zone between the first vehicle and second vehicle; and a second computing submodule, for obtaining edge information in the blind zone between the first vehicle and second vehicle based on the vehicle feature point of the first vehicle and the feature point of the second vehicle.
  • a computer device comprises a storage module, a processor, and a computer program stored on the storage module and capable of being run on the processor, characterized in that the processor, upon executing the computer program, realizes the method for judging free space between vehicles.
  • the method for judging free space between vehicles and the apparatus for judging free space between vehicles according to the present invention can judge space in a blind zone between vehicles more precisely and can thereby make a larger amount of usable space available.
  • Fig. 1 shows a schematic drawing of a blind zone in the prior art.
  • Fig. 2 is a schematic flow chart showing the method of the present invention for judging free space between vehicles.
  • Fig. 3 is a schematic flow chart showing the method for judging free space between vehicles in a first embodiment of the present invention.
  • Fig. 4 is a structural block diagram showing the apparatus of the present invention for judging free space between vehicles.
  • Fig. 2 is a schematic flow chart showing the method of the present invention for judging free space between vehicles.
  • the method of the present invention for judging free space between vehicles comprises the following steps: a blind zone determining step S100: determining a blind zone between a first vehicle and a second vehicle; an edge information computing step S200: at least acquiring current feature point information of the first vehicle and second vehicle, and computing edge information in the blind zone between the first vehicle and second vehicle according to the current feature point information of the first vehicle and second vehicle; a closed space generating step S300: generating a closed space according to the edge information in the blind zone between the first vehicle and second vehicle; and a free space judgement step S400: judging whether the closed space is a free space based on monitoring of the closed space.
  • the free space judgement step S400 is implemented for example in the following way: monitoring whether the first vehicle and second vehicle are continuously adjacent within a specified time and the first vehicle and second vehicle are located in the same lane or adjacent lanes, wherein the closed space is judged to be a free space if the first vehicle and second vehicle are continuously adjacent within the specified time and the first vehicle and second vehicle are located in the same lane or adjacent lanes, and the free space is otherwise judged to be a non-free space.
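  • as a non-authoritative illustration of this judgement rule, the following Python sketch (all class and function names are hypothetical, not taken from the patent) checks whether two tracked vehicles remain continuously adjacent and in the same or adjacent lanes over a monitored time window:

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    vehicle_id: int
    lane: int          # lane index reported by the roadside sensor
    position_m: float  # longitudinal position along the road, in metres

def is_free_space(history: list[tuple[VehicleState, VehicleState]],
                  max_gap_m: float = 15.0) -> bool:
    """Judge the closed space between two vehicles to be a free space.

    `history` holds (first_vehicle, second_vehicle) state pairs sampled
    over the specified monitoring time; the space is judged free only if,
    in every sample, the two vehicles are adjacent (closer than
    `max_gap_m`, an assumed threshold) and in the same or adjacent lanes.
    """
    if not history:
        return False
    for first, second in history:
        if abs(first.lane - second.lane) > 1:            # not same/adjacent lane
            return False
        if abs(first.position_m - second.position_m) > max_gap_m:
            return False                                 # no longer adjacent
    return True
```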
  • in this way, part of the blind zone can be judged to be a free space, i.e. a safe zone, and used accordingly.
  • Fig. 3 is a schematic flow chart showing the method for judging free space between vehicles in a first embodiment of the present invention.
  • the method for judging free space between vehicles in the first embodiment of the present invention is used to judge whether a space between a first vehicle and a second vehicle is a free space; as shown in Fig. 3, the method for judging free space between vehicles in the first embodiment comprises the following:
  • Step S1: determining a blind zone between the first vehicle and second vehicle; this is mainly achieved by a roadside sensor sensing the positions of the first vehicle and second vehicle.
  • Step S2: determining whether historical feature point information exists, and proceeding to step S3 if it exists, otherwise proceeding to step S4; situations in which no historical feature point information exists generally comprise: a target vehicle having just entered a tracking list of the sensor, or having re-entered the tracking list after being lost for a period of time.
  • Step S3: if historical feature point information exists, computing edge information in the blind zone between the first vehicle and second vehicle based on current feature point information of the first vehicle and second vehicle and historical feature point information of the first vehicle and second vehicle.
  • Step S4: if no historical feature point information exists, computing edge information in the blind zone between the first vehicle and second vehicle based on current feature point information of the first vehicle and second vehicle and a pre-stored common vehicle model; furthermore, because historical feature point information has not been obtained and a pre-stored common vehicle model has been used instead to compute edge information, the confidence level can additionally be downgraded in step S4.
  • the confidence level is used to characterize the credibility of the finally acquired relative position (i.e. the edge information); because this position is not fully measured and may involve a matching error, the confidence level is set to indicate the credibility of the speculatively computed edge information.
  • in step S4 the confidence level can be set to "downgraded" (conversely, it can be set to "non-downgraded" if historical feature point information exists); furthermore, the size of the closed space obtained below can be adjusted based on the confidence level.
  • Step S5: forming a closed space based on the edge information in the blind zone between the first vehicle and second vehicle obtained in step S3 or step S4.
  • Step S6 (optional step): further adjusting the size of the closed space according to the confidence level, e.g. reducing the size of the closed space if the confidence level is lower than a preset threshold.
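  • a minimal sketch of this optional adjustment, assuming the closed space is represented as a polygon and shrunk towards its centroid when the confidence level is below the threshold (the polygon representation, threshold and shrink factor are illustrative assumptions, not fixed by the patent):

```python
def shrink_closed_space(vertices: list[tuple[float, float]],
                        confidence: float,
                        threshold: float = 0.5,
                        factor: float = 0.8) -> list[tuple[float, float]]:
    """Scale the closed space towards its centroid if confidence is low."""
    if confidence >= threshold:
        return vertices                      # keep the full size
    cx = sum(x for x, _ in vertices) / len(vertices)
    cy = sum(y for _, y in vertices) / len(vertices)
    return [(cx + factor * (x - cx), cy + factor * (y - cy))
            for x, y in vertices]
```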
  • Step S7: judging whether an object enters the closed space, and proceeding to step S8 if no object enters, otherwise proceeding to step S9.
  • Step S8: judging the closed space to be a free space, wherein “free space” means that the space is free and relatively safe, e.g. may be used for a lane-change cut-in.
  • Step S9: judging the closed space to be a non-free space, and proceeding to step S10.
  • Step S10: maintaining monitoring of the closed space, and if the object that has entered leaves the closed space, proceeding to step S11.
  • Step S11: twice-confirming whether the object that entered has now left; specifically, confirming for example whether the number and features of objects entering the closed space (the features being as described previously) are consistent with the number and features of objects leaving the space, and if so, deeming that the non-visible region of the closed space is now empty and returning to step S8, otherwise returning to step S10.
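  • the twice-confirmation in step S11 can be illustrated with the following sketch, which compares the multiset of object features seen entering against those seen leaving (the feature tuple used for matching is an assumption for illustration):

```python
from collections import Counter

def blind_region_now_empty(entered: list[tuple], left: list[tuple]) -> bool:
    """Confirm that every object that entered the closed space has left it.

    Each object is described by a hashable feature tuple, e.g.
    (object_class, colour_value, approximate_length_m); the non-visible
    region is deemed empty only if entering and leaving objects match in
    both number and features.
    """
    return Counter(entered) == Counter(left)

# Example: one car entered and the same car left -> region empty again
assert blind_region_now_empty([("car", "red", 4.5)], [("car", "red", 4.5)])
```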
  • step S3 represents a situation in which historical feature point information exists; here, edge information in the blind zone between the first vehicle and second vehicle is computed based on current feature point information of the first vehicle and second vehicle and historical feature point information of the first vehicle and second vehicle.
  • the following steps may be specifically included for example: matching current feature point information of the first vehicle with historical feature point information of the first vehicle, to obtain a first relative positional relationship of a feature point of the first vehicle relative to a reference point of the first vehicle, and matching current feature point information of the second vehicle with historical feature point information of the second vehicle, to obtain a second relative positional relationship of a feature point of the second vehicle relative to a reference point of the second vehicle; based on the first relative positional relationship, the second relative positional relationship, and absolute position information of the reference point of the first vehicle and absolute position information of the reference point of the second vehicle obtained from the outside, computing the vehicle feature point of the first vehicle and the feature point of the second vehicle in the blind zone between the first vehicle and second vehicle; and obtaining edge information in the blind zone between the first vehicle and second vehicle based on the vehicle feature point of the first vehicle and the feature point of the second vehicle.
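  • one way to read this chain: the matching identifies each currently visible feature point, the historical record supplies that point's offset from the vehicle's reference point, and the externally obtained absolute position of the reference point anchors those offsets in world coordinates; a minimal Python sketch under these assumptions (the data layout and function names are hypothetical):

```python
import math

Point = tuple[float, float]

def relative_to_reference(current: list[Point],
                          historical: dict[Point, Point]) -> list[Point]:
    """Match each current feature point to its nearest historical feature
    point; `historical` maps each historical feature point to its known
    offset relative to the vehicle's reference point, so the result is a
    relative positional relationship for every current point."""
    offsets = []
    for pt in current:
        nearest = min(historical, key=lambda h: math.dist(pt, h))
        offsets.append(historical[nearest])
    return offsets

def feature_points_in_blind_zone(offsets: list[Point],
                                 reference_abs: Point) -> list[Point]:
    """Apply the relative positional relationships to the absolute
    position of the reference point obtained from the outside."""
    rx, ry = reference_abs
    return [(rx + dx, ry + dy) for dx, dy in offsets]
```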
  • the feature point information of the vehicle for example comprises: basic feature point information of the vehicle (length/width/height, centre point position or edge midpoints), and in-depth feature points of the vehicle (sensing information of key points of the target, e.g. lorry box edges, relative positions of wheels, roof edge and corner lines, or feature points such as midpoints and turning points), and these are presented in the form of, for example, pixel coordinate positions or local feature maps.
  • current feature point information refers to real-time feature point information at the current moment; historical feature point information refers to feature point information at an earlier moment within a period of time prior to the current moment.
  • a relative positional relationship of a feature point relative to a reference point is obtained, and then non-visible edge information located in the blind zone is computed according to the relative positional relationship and position information of the reference point (position information of the reference point is obtained from position information in basic features of the vehicle).
  • because the coordinates (relative or absolute) of the reference point can be obtained, the endpoint coordinates of any edge can be computed from the relative positional relationship between a feature point and the reference point.
  • the above computation is performed separately for the first vehicle and second vehicle; two endpoints are obtained for each of the two vehicles, and a region can then be enclosed, i.e. the closed space mentioned above.
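  • for illustration, a sketch that encloses the four computed endpoints (two per vehicle) into a closed space by ordering them counter-clockwise around their centroid (a simple convex ordering, assumed for this example):

```python
import math

Point = tuple[float, float]

def enclose_region(first_vehicle_ends: list[Point],
                   second_vehicle_ends: list[Point]) -> list[Point]:
    """Build the closed space from the two edge endpoints of each vehicle
    by sorting the four points counter-clockwise around their centroid."""
    pts = first_vehicle_ends + second_vehicle_ends
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    return sorted(pts, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

# Example: the rear edge of a leading vehicle and the front edge of a
# trailing vehicle enclose a quadrilateral between them.
region = enclose_region([(0.0, 0.0), (0.0, 2.0)], [(8.0, 0.0), (8.0, 2.0)])
```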
  • in step S3, the matching may use, for example, any of the following distance measures: Euclidean distance, Mahalanobis distance, Manhattan distance, Chebyshev distance and Minkowski distance.
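  • as a brief illustration, the listed distances between two feature vectors can be computed as follows (the Mahalanobis distance additionally needs a covariance matrix, here assumed to be given):

```python
import numpy as np

def euclidean(a, b):   return float(np.linalg.norm(a - b))
def manhattan(a, b):   return float(np.abs(a - b).sum())
def chebyshev(a, b):   return float(np.abs(a - b).max())
def minkowski(a, b, p=3):
    return float((np.abs(a - b) ** p).sum() ** (1 / p))
def mahalanobis(a, b, cov):
    d = a - b
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

a, b = np.array([1.0, 2.0]), np.array([4.0, 6.0])
print(euclidean(a, b), manhattan(a, b), chebyshev(a, b),
      minkowski(a, b), mahalanobis(a, b, np.eye(2)))
```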
  • in step S4, in the absence of historical feature point information, edge information in the blind zone between the first vehicle and second vehicle is computed based on current feature point information of the first vehicle and second vehicle and a pre-stored common vehicle model.
  • the following may for example be specifically included: matching current feature point information of the first vehicle with the common vehicle model to obtain a first relative positional relationship of a feature point of the first vehicle relative to a reference point of the first vehicle, and matching current feature point information of the second vehicle with the common vehicle model to obtain a second relative positional relationship of a feature point of the second vehicle relative to a reference point of the second vehicle; based on the first relative positional relationship, the second relative positional relationship, and absolute position information of the reference point of the first vehicle and absolute position information of the reference point of the second vehicle obtained from the outside, computing the vehicle feature point of the first vehicle and the feature point of the second vehicle in the blind zone between the first vehicle and second vehicle; and obtaining edge information in the blind zone between the first vehicle and second vehicle based on the vehicle feature point of the first vehicle and the feature point of the second vehicle.
  • Step S4 describes how to compute edge information in the blind zone between the first vehicle and second vehicle in the absence of historical feature point information; in this case, because historical feature point information is lacking, a greater number of feature points or feature values can be adopted and retained for real-time data or multiple frames of real-time data; i.e. in addition to comprising basic feature point information of the vehicle and in-depth feature point information of the vehicle, the feature point information may further comprise appearance feature point information of the vehicle, for example colour extraction values (extraction values may be understood as new values obtained after precision reduction and activation that can characterize current features; this method reduces the amount of computation), pattern extraction values and contour information; a greater number of turning points can thus be obtained by adding these feature points.
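  • one plausible reading of a colour extraction value, sketched under the assumption that precision reduction means quantization and activation means a simple bounded nonlinearity (purely illustrative; the patent does not fix these operations):

```python
import numpy as np

def colour_extraction_value(rgb_pixels: np.ndarray, levels: int = 8) -> np.ndarray:
    """Reduce the precision of the mean RGB colour, then apply an activation.

    Quantizing to `levels` values per channel shrinks the value space,
    which reduces downstream matching computation while still
    characterizing the vehicle's current colour feature.
    """
    mean_rgb = rgb_pixels.reshape(-1, 3).mean(axis=0) / 255.0     # scale to [0, 1]
    quantized = np.round(mean_rgb * (levels - 1)) / (levels - 1)  # precision reduction
    return np.tanh(quantized)                                     # activation

patch = np.full((4, 4, 3), 200, dtype=np.uint8)  # uniform grey image patch
print(colour_extraction_value(patch))
```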
  • the common vehicle model pre-stored in step S4 may be features of typical vehicles (such as trucks, coaches, SUVs, etc.) held locally in a roadside computation unit, or principal vehicle types included in a library maintained in the cloud.
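  • a sketch of matching observed dimensions against a small pre-stored model library to pick the closest typical vehicle (the model entries and the squared-error similarity are illustrative assumptions):

```python
# Pre-stored common vehicle models: hypothetical (length, width, height) in metres
VEHICLE_MODELS = {
    "truck": (12.0, 2.5, 3.8),
    "coach": (10.5, 2.5, 3.5),
    "suv":   (4.7, 1.9, 1.7),
}

def closest_model(observed: tuple[float, float, float]) -> str:
    """Pick the pre-stored model whose dimensions best match the
    currently observed feature values."""
    return min(VEHICLE_MODELS,
               key=lambda name: sum((o - m) ** 2
                                    for o, m in zip(observed, VEHICLE_MODELS[name])))

print(closest_model((4.6, 1.8, 1.65)))  # -> "suv"
```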
  • in step S4, likewise, the matching may use Euclidean distance, Mahalanobis distance, Manhattan distance, Chebyshev distance or Minkowski distance.
  • in step S7, judging whether an object enters the closed space may be implemented as monitoring whether the first vehicle and second vehicle are continuously adjacent within a specified time and located in the same lane or adjacent lanes; if so, the method proceeds to step S8.
  • each perceived vehicle maintains an ID in the algorithm; if a second vehicle B is at all times close to a first vehicle A (e.g. their IDs are not separated by the ID of another vehicle C) and the two remain in an adjacent lane or the same lane, the condition for proceeding to step S8 is met.
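  • a sketch of this ID-based adjacency check over a tracking history (the per-frame ordering of vehicle IDs along the road is an assumed input):

```python
def continuously_adjacent(frames: list[list[int]], id_a: int, id_b: int) -> bool:
    """Return True if vehicles A and B are neighbours in every frame.

    Each frame lists vehicle IDs in their along-road order within the
    monitored lane group; A and B are adjacent when no other ID sits
    between them.
    """
    for ordered_ids in frames:
        ia, ib = ordered_ids.index(id_a), ordered_ids.index(id_b)
        if abs(ia - ib) != 1:   # some vehicle C separates them
            return False
    return True

# A (id=1) and B (id=2) stay adjacent across three frames
print(continuously_adjacent([[1, 2, 3], [1, 2], [4, 1, 2]], 1, 2))  # True
```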
  • Fig. 4 is a structural block diagram showing the apparatus of the present invention for judging free space between vehicles.
  • the apparatus of the present invention for judging free space between vehicles comprises: a blind zone determining module 100, for determining a blind zone between a first vehicle and a second vehicle; an edge information computing module 200, for at least acquiring current feature point information of the first vehicle and second vehicle, and computing edge information in the blind zone between the first vehicle and second vehicle according to the current feature point information of the first vehicle and second vehicle; a closed space generating module 300, for generating a closed space according to the edge information in the blind zone between the first vehicle and second vehicle; and a free space judgement module 400, for judging whether the closed space is a free space based on monitoring of the closed space.
  • the free space judgement module 400 is used for monitoring whether the first vehicle and second vehicle are continuously adjacent within a specified time and the first vehicle and second vehicle are located in the same lane or adjacent lanes.
  • edge information in the blind zone between the first vehicle and second vehicle is computed based on the current feature point information of the first vehicle and second vehicle and additionally acquired historical feature point information of the first vehicle and second vehicle.
  • the edge information computing module 200 comprises: a matching submodule 210, for matching current feature point information of the first vehicle with historical feature point information of the first vehicle, to obtain a first relative positional relationship of a feature point of the first vehicle relative to a reference point, and matching current feature point information of the second vehicle with historical feature point information of the second vehicle, to obtain a second relative positional relationship of a feature point of the second vehicle relative to a reference point; and a first computing submodule 220 for, based on the first relative positional relationship, the second relative positional relationship, and absolute position information of the reference point of the first vehicle and absolute position information of the reference point of the second vehicle obtained from the outside, computing the vehicle feature point of the first vehicle and the feature point of the second vehicle in the blind zone between the first vehicle and second vehicle; and a second computing submodule 230, for obtaining edge information in the blind zone between the first vehicle and second vehicle based on the vehicle feature point of the first vehicle and the feature point of the second vehicle.
  • in the edge information computing module 200, if a pre-stored common vehicle model is further acquired, edge information in the blind zone between the first vehicle and second vehicle is computed based on current feature point information of the first vehicle and second vehicle and the pre-stored common vehicle model.
  • the edge information computing module 200 comprises: a matching sub-module 210, for matching current feature point information of the first vehicle with the common vehicle model to obtain a first relative positional relationship of a feature point of the first vehicle relative to a reference point, and matching current feature point information of the second vehicle with the common vehicle model to obtain a second relative positional relationship of a feature point of the second vehicle relative to a reference point; and a first computing submodule 220 for, based on the first relative positional relationship, the second relative positional relationship, and absolute position information of the reference point of the first vehicle and absolute position information of the vehicle reference point of the second vehicle obtained from the outside, obtaining the vehicle feature point of the first vehicle and the feature point of the second vehicle in the blind zone between the first vehicle and second vehicle; and a second computing submodule 230, for obtaining edge information in the blind zone between the first vehicle and second vehicle based on the vehicle feature point of the first vehicle and the feature point of the second vehicle.
  • the present invention further provides a computer readable medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, realizes the method described above for judging free space between vehicles.
  • the present invention further provides a computer device, comprising a storage module, a processor, and a computer program stored on the storage module and capable of being run on the processor, characterized in that the processor, upon executing the computer program, realizes the method described above for judging free space between vehicles.
  • the method for judging free space between vehicles and the apparatus for judging free space between vehicles according to the present invention can judge space in a blind zone between vehicles more precisely and can thereby make a larger amount of usable space available.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
PCT/EP2022/078233 2021-10-22 2022-10-11 Method for judging free space between vehicles and apparatus therefor WO2023066719A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111231800.0A CN116030432A (zh) 2021-10-22 2021-10-22 Method for judging free space between vehicles and apparatus therefor
CN202111231800.0 2021-10-22

Publications (1)

Publication Number Publication Date
WO2023066719A1 true WO2023066719A1 (en) 2023-04-27

Family

ID=84330197

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/078233 WO2023066719A1 (en) 2021-10-22 2022-10-11 Method for judging free space between vehicles and apparatus therefor

Country Status (2)

Country Link
CN (1) CN116030432A (zh)
WO (1) WO2023066719A1 (zh)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100322476A1 (en) * 2007-12-13 2010-12-23 Neeraj Krantiveer Kanhere Vision based real time traffic monitoring
EP3404638A1 (en) * 2017-05-18 2018-11-21 Panasonic Intellectual Property Corporation of America Vehicle system, method of processing vehicle information, recording medium storing a program, traffic system, infrastructure system, and method of processing infrastructure information
US20210110711A1 (en) * 2019-06-06 2021-04-15 Verizon Patent And Licensing Inc. Monitoring a scene to analyze an event using a plurality of image streams
CN113077620A (zh) * 2020-01-06 2021-07-06 丰田自动车株式会社 移动体识别系统、移动体识别方法及程序

Also Published As

Publication number Publication date
CN116030432A (zh) 2023-04-28

Similar Documents

Publication Publication Date Title
Andrade et al. A novel strategy for road lane detection and tracking based on a vehicle’s forward monocular camera
JP6978491B2 (ja) 地面マーキングを認識するための画像処理方法、および地面マーキングを検出するためのシステム
CN110979313B (zh) 一种基于空间地图的自动泊车定位方法及系统
CN107392139B (zh) 一种基于霍夫变换的车道线检测方法及终端设备
EP3121761A1 (en) A system and method for verifying road position information for a motor vehicle
CN110930459A (zh) 灭点提取方法、相机标定方法以及存储介质
CN108197590B (zh) 一种路面检测方法、装置、终端及存储介质
US20200300967A1 (en) Sensor verification
EP3739361A1 (en) Method and system for fusing occupancy maps
US11796331B2 (en) Associating perceived and mapped lane edges for localization
CN108319931B (zh) 一种图像处理方法、装置及终端
CN110555801A (zh) 一种航迹推演的校正方法、终端和存储介质
Suhr et al. Dense stereo-based robust vertical road profile estimation using Hough transform and dynamic programming
US20090099767A1 (en) Light stripe detection method for indoor navigation and parking assist apparatus using the same
CN111376902B (zh) 一种自动驾驶的车道保持方法及系统
WO2023066719A1 (en) Method for judging free space between vehicles and apparatus therefor
CN110631577B (zh) 服务机器人导航路径跟踪方法及服务机器人
CN114475780B (zh) 一种自动泊车方法、装置、设备及存储介质
US20230281872A1 (en) System for calibrating extrinsic parameters for a camera in an autonomous vehicle
Oniga et al. A fast ransac based approach for computing the orientation of obstacles in traffic scenes
CN114973203A (zh) 不完整车位识别方法、装置和自动泊车方法
US10664997B1 (en) Method, camera system, computer program product and computer-readable medium for camera misalignment detection
CN115320637A (zh) 一种自动驾驶方法、装置及存储介质
CN115222779A (zh) 一种车辆切入检测方法、装置及存储介质
CN113643359A (zh) 一种目标对象定位方法、装置、设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22801124

Country of ref document: EP

Kind code of ref document: A1