WO2017116134A1 - Système de radar et de fusion d'images pour l'application des règlements de la circulation routière - Google Patents

Système de radar et de fusion d'images pour l'application des règlements de la circulation routière

Info

Publication number
WO2017116134A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
speed
radar
cropping
Prior art date
Application number
PCT/KR2016/015387
Other languages
English (en)
Korean (ko)
Inventor
최광호
이상만
심광호
류승기
조영태
Original Assignee
건아정보기술 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020160173720A external-priority patent/KR101925293B1/ko
Application filed by 건아정보기술 주식회사 filed Critical 건아정보기술 주식회사
Publication of WO2017116134A1 publication Critical patent/WO2017116134A1/fr

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 Systems of measurement based on relative movement of target
    • G01S13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/24 Aligning, centring, orientation detection or correction of the image
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/052 Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene

Definitions

  • the present invention relates to a radar and image fusion vehicle enforcement system.
  • More particularly, the present invention relates to a radar and image fusion vehicle enforcement system that uses a radar together with an image captured by a single camera for vehicles on a multi-lane road.
  • In a domestic ITS system, vehicle information and speed are measured by embedding a loop detector in the road, by using a laser sensor, or by acquiring information solely from camera images.
  • A loop detector has the disadvantage that the road must be cut open during installation, and the construction work can obstruct the flow of traffic.
  • Laser detectors are also affected by weather and climate (snow, rain, fog, dust, etc.) and have a narrow detection width.
  • Image detectors are likewise strongly affected by weather and climate, and their detection rate drops sharply at night.
  • Radar detectors, by contrast, are less affected by weather and climate than the other detectors and, being non-contact, cause no damage to the road.
  • A radar detector outputs the speed, distance, and angle of a vehicle, from which the exact position and information of the vehicle can be extracted.
  • However, since radar also relies on radio waves, errors may occur due to the radio-wave environment, and the present invention therefore provides a method of using the radar and the image simultaneously.
  • The present invention provides a system that accurately calculates the speed of a vehicle on a multi-lane road by using a radar together with an image from a camera.
  • The present invention also provides a system that reduces the required data-processing capacity by using a radar and a single camera when calculating the speed of vehicles on a multi-lane road.
  • According to an embodiment, the system includes radar transmitting and receiving means that emits radio waves toward a multi-lane road and receives reflected waves from vehicles traveling on the multi-lane road;
  • image capturing means that is triggered by the presence of a target vehicle sensed as speeding by the radar transmitting and receiving means and photographs a full frame image including the multi-lane road and the target vehicle;
  • a radar speed calculator configured to generate driving information for the target vehicle by analyzing the transmitted and received radar signals; and
  • a control unit including an image processing unit, which analyzes the driving information of the target vehicle, crops the area in which the vehicle is located and stores it as a first cropping image, and crops the area in which the license plate is located and stores it as a second cropping image, and a number recognition unit, which recognizes the number of the vehicle license plate in the second cropping image.
  • The control unit may further include an image speed calculator configured to receive the images captured at a predetermined time interval from the image processing unit and to calculate a vehicle speed.
  • The image speed calculator converts a first coordinate value occupied by the target vehicle in the first cropping image coordinate system into a second coordinate value in the full frame image coordinate system, and calculates the speed of the vehicle from the distance the second coordinate value moves over time.
  • The traveling speed of the target vehicle measured by the radar transmitting and receiving means, as output by the radar speed calculator, is defined as a first vehicle speed, and the position of the license plate determined by the image processing unit is received for the images taken at a predetermined time interval.
  • The number of pixels by which the license plate has moved between those images is counted and converted into a second vehicle speed, and an enforcement determination unit determines that the vehicle is subject to enforcement when both the first vehicle speed and the second vehicle speed exceed the threshold speed.
  • The control unit defines the first vehicle speed as the average speed when the deviation between the first vehicle speed and the second vehicle speed is less than a first threshold value, defines the average of the first vehicle speed and the second vehicle speed as the average speed when the deviation falls within the range of a second threshold value, and may discard the speed of the target vehicle when the deviation exceeds the range of the second threshold value.
  • The image photographing means may vary the time difference according to the first vehicle speed when capturing the images of the target vehicle.
  • The embodiment emits radar waves to measure the speed of all vehicles traveling in the detection zone and photographs a specific vehicle among the multi-lane traffic with one camera, so the number of cameras required can be reduced.
  • The data-processing load is likewise reduced, which improves computation speed and lowers the amount of computation.
  • The embodiment may calculate the speed using only the image of the recognition area, compensate for the horizontal and vertical errors that depend on the coordinates, and compensate for errors according to the characteristics of the license plate, thereby improving the reliability of the speed calculated from the image.
  • When the speed detected by the radar and the speed obtained from the image analysis do not coincide and their difference is not within the error range, the case is treated as an error and the image-analysis speed is sent to the control center, thereby eliminating the speed error.
  • FIG. 1 is an overall configuration diagram of a multi-lane vehicle speed measurement system according to an embodiment of the present invention.
  • FIG. 2 is a detailed block diagram of a multi-lane vehicle speed measuring system according to an embodiment of the present invention of FIG. 1.
  • FIG. 3 is a block diagram illustrating an exemplary embodiment of the controller of FIG. 2.
  • FIG. 4 is a flowchart illustrating a speed calculating method of the controller.
  • FIG. 5 is a diagram illustrating a step of acquiring a recognition region of an image velocity calculating unit.
  • FIG. 6 is a flowchart illustrating correction of the recognition region of FIG. 4.
  • FIG. 7 is a diagram illustrating obtaining the coordinates of FIG. 6.
  • FIG. 8 is a diagram illustrating obtaining a reverse vehicle reference coordinate of FIG. 6.
  • FIG. 9 is a detailed flowchart illustrating license plate correction of FIG. 6.
  • FIG. 10 is a diagram illustrating an operation of extracting a license plate feature of FIG. 9.
  • FIG. 11 is a diagram illustrating a vehicle model analysis of FIG. 9.
  • FIG. 12 is a view showing the license plate height measurement of FIG. 9.
  • Terms such as first, second, A, and B may be used to describe various components, but the components should not be limited by these terms; the terms are used only for the purpose of distinguishing one component from another.
  • the first component may be referred to as the second component, and similarly, the second component may also be referred to as the first component.
  • FIG. 1 is an overall configuration diagram of a multi-lane vehicle speed measurement system 10 according to an embodiment of the present invention,
  • FIG. 2 is a detailed block diagram of the multi-lane vehicle speed measurement system 10 of FIG. 1, and
  • FIG. 3 is a block diagram illustrating an embodiment of the controller 100 of FIG. 2.
  • The multi-lane vehicle speed measuring system 10 includes a radar 300 that emits radio waves toward a multi-lane road and receives reflected waves from a vehicle 20 traveling on the multi-lane road,
  • one camera 200 that captures images of the vehicle 20 sensed by the radar 300, and
  • a controller 100 that calculates a first vehicle speed from the radar 300 and a second vehicle speed from the camera 200, corrects and compares them, and calculates the final speed of the vehicle 20.
  • The radar 300 emits radio waves at short intervals, and the camera 200 is triggered when a speeding vehicle 20 is recognized as a target from the waves obtained by the radar 300; photographing is then performed twice based on the lane, among the multiple lanes, in which the target vehicle 20 is located.
  • The control unit 100 includes a radar speed calculator 110 that receives the reflected wave from the radar 300 and calculates the first vehicle speed of the vehicle 20, an image speed calculator 120 that receives an image from the camera 200, processes and corrects it, and calculates the second vehicle speed of the vehicle 20, and a final speed calculator 130 that receives the first vehicle speed and the second vehicle speed and compares them to obtain the final speed.
  • The radar speed calculator 110 obtains, from the radar 300, the emission time of the radiated wave and the reception time of the reflected wave to calculate the distance between the radar 300 and the vehicle 20, extracts speed information as the first vehicle speed, and recognizes the vehicle 20 as a speeding vehicle when the first vehicle speed exceeds the threshold speed (the enforcement speed).
  • the camera 200 is triggered to capture a full frame image including the target vehicle 20 twice.
  • The controller 100 may further include an image processor (not shown) that analyzes the driving information of the target vehicle 20 to determine the position of the target vehicle 20 in the full frame image of the multi-lane road, crops the area in which the vehicle is located and stores it as a first cropping image, recognizes the license plate of the target vehicle 20 from the first cropping image, and crops the area in which the license plate is located and stores it as a second cropping image.
  • The image processor crops the area in which the target vehicle 20 is located from each of the two images to obtain two first cropping images as regions of interest (ROI).
  • The image processor then recognizes the vehicle license plate from the two first cropping images and determines the number and position of the license plate.
  • The license plate position may be defined by a reference point within the license plate of the target vehicle 20.
  • the reference point may be the center of the license plate, and when the license plate is rectangular, it may be one of four corners.
  • The control unit 100 defines the traveling speed of the target vehicle measured by the radar transmitting and receiving means, as output by the radar speed calculator, as the first vehicle speed, and receives the position of the license plate of the target vehicle determined by the image processing unit.
  • The number of pixels by which the license plate has moved between the images photographed at the predetermined time interval is counted and converted into the second vehicle speed, and the control unit 100 may further include an enforcement decision unit (not shown) that determines the vehicle to be subject to enforcement when both the first vehicle speed and the second vehicle speed exceed the threshold speed (see the enforcement-decision sketch after this list).
  • the apparatus may further include a number recognition unit (not shown) that recognizes the number of the vehicle license plate in the second cropping image.
  • The image speed calculator 120 converts the first coordinate value occupied by the target vehicle 20 in the first cropping image coordinate system into a second coordinate value in the full frame image coordinate system, and calculates the speed of the vehicle from the distance the second coordinate value travels over time (see the coordinate-conversion sketch after this list).
  • The image speed calculator 120 may include an image corrector 125 that corrects the coordinates according to the position of the recognition area within the entire image and further corrects them according to the features of the license plate serving as the reference point.
  • the image speed calculator 120 may calculate a second vehicle speed based on the corrected second coordinate values from the image corrector 125.
  • the second vehicle speed may be calculated based on a distance between two corrected second coordinate values for a time difference between two images.
  • The final speed calculator 130 compares the first vehicle speed with the second vehicle speed, determines whether the first vehicle speed obtained by the radar 300 and the second vehicle speed obtained by the camera 200 match each other or lie within the error range, and calculates the final speed accordingly.
  • The control unit 100 includes a communication unit 150 that transmits the calculated final speed and vehicle information to the control center 400 in real time, and a memory 140 that stores the captured image, vehicle number, average speed, date, and time data.
  • The control center 400 includes a server that receives the vehicle information and final speed information of the speeding vehicle 20 from the multi-lane vehicle speed measuring system 10, and displays and stores the vehicle number, average speed, date, and time data in a defined area of the image screen of the vehicle 20.
  • FIG. 4 is a flowchart illustrating a speed calculating method of the controller 100
  • FIG. 5 is a diagram illustrating a step of obtaining a recognition region of the image speed calculating unit 120.
  • The radar 300 periodically receives the reflected wave and processes it to generate the first vehicle speed (S110). If the first vehicle speed of the vehicle 20 calculated by the radar speed calculator 110 is determined to exceed the threshold speed (the enforcement speed) (S120), the vehicle 20 is defined as the target vehicle 20.
  • The threshold speed may be the enforcement speed of the corresponding multi-lane road.
  • When the target vehicle 20 is defined, the lane in which the target vehicle 20 is located is recognized (S130), and the single camera 200 disposed over the multi-lane road is triggered with an adjusted time difference so that the target vehicle 20 is included; photographing is performed twice (S140).
  • The two images may be obtained with a time difference of 80 msec, as shown in FIGS. 5A and 5B.
  • The time difference may vary according to the first vehicle speed: 160 msec at 60 km/h or less, 120 msec from 60 km/h to 80 km/h, 80 msec from 80 km/h to 100 km/h, and 40 msec at 100 km/h or more (see the capture-interval sketch after this list).
  • The image processor obtains the vehicle number information of the target vehicle 20 from the two images and crops the area in which the target vehicle 20 is located as the recognition area (S150).
  • An area defining the position of the target vehicle 20 is obtained from the distance information of the radar 300, indicated by the red line, and the recognition area containing it is cut out and defined as the signal-processing target.
  • the size of the recognition area may be adjusted according to the size of the area defining the location of the target vehicle 20 and the size of the vehicle 20 in the image.
  • The image corrector 125 of the image speed calculator 120 performs vertical and horizontal correction according to the position of the recognition area within the entire image, performs correction with respect to the license plate, and generates corrected coordinates of the reference point (S160).
  • The vertical and horizontal correction compensates for the error caused by differences in perspective when the three-dimensional space is projected onto the two-dimensional image: a single pixel represents a different real-world distance depending on whether it lies in the upper or lower part of the image.
  • the image speed calculator 120 calculates a second vehicle speed of the target vehicle 20 from the image based on the corrected coordinate value (S170).
  • the final speed calculator 130 calculates the final speed by comparing the first vehicle speed with the second vehicle speed (S180).
  • the final speed calculator 130 calculates the speed deviation to define the final speed.
  • The final speed calculator defines the first vehicle speed as the speed of the target vehicle when the deviation between the first vehicle speed and the second vehicle speed is less than a first threshold value, defines the average of the first vehicle speed and the second vehicle speed as the speed of the target vehicle when the deviation falls within the range of a second threshold value, and may discard the speed of the target vehicle when the deviation exceeds the second threshold value.
  • For example, when the deviation is less than 3%, the final speed of the target vehicle is the first vehicle speed; when the deviation is 3% to 7%, the final speed of the target vehicle is the average of the first vehicle speed and the second vehicle speed; and when the deviation exceeds 7%, the speed of the target vehicle is treated as an error value that cannot be used (see the speed-fusion sketch after this list).
  • The number of cameras 200 can be reduced by measuring the speed of the vehicles 20 on the multi-lane road with one radar 300 and one camera 200.
  • Cropping the recognition area, which is the data-processing target in the captured image, reduces the data throughput and speeds up processing.
  • the multi-lane vehicle speed measuring system 10 of the present invention can calculate a more reliable speed by correcting errors that may occur in processing data by cropping the recognition area.
  • FIG. 6 is a flowchart illustrating correction of the recognition region of FIG. 4
  • FIG. 7 is a diagram illustrating obtaining the coordinates of FIG. 6
  • FIG. 8 is a diagram illustrating obtaining the inverse vehicle reference coordinates of FIG. 6.
  • The image corrector 125 obtains the coordinate values, in the full frame image coordinate system, of the region corresponding to the first cropping region (S161).
  • the coordinate values in the full frame image coordinate system of the first cropping region are basic data for correcting a distance error generated according to where the first cropping region is located in the full frame image.
  • the vehicle reference coordinate value of the target vehicle 20 is obtained in the first cropping area coordinate system (S163).
  • the vehicle reference coordinate value may be a coordinate value of the license plate and may be one point or a plurality of points of the license plate. For example, as shown in FIG. 7, the coordinate values of the upper left corner and the lower right corner may be recognized as the vehicle reference coordinate.
  • The vehicle reference coordinate value in the first cropping area is then converted into a coordinate value in the full frame image coordinate system (S167).
  • For example, in the two images having the time difference, the y-axis coordinate difference between the lower right corner reference points of the two first cropping images is 74 pixels, while the corresponding coordinate difference in the full frame image coordinate system is 124 pixels.
  • the data processing is performed only by recognizing the coordinates of the recognition area, thereby increasing the computation speed and ensuring the accuracy of the data.
  • The vertical correction and the horizontal correction correct the moving distance of the real three-dimensional vehicle 20, as derived from the two-dimensional image, by weighting the coordinate values according to the region of the full frame image in which they lie.
  • Although every pixel in the image nominally represents the same distance, the real-world distance corresponding to a pixel differs depending on whether it lies in the upper or lower part of the two-dimensional image, and likewise differs depending on whether it lies toward the left or the right.
  • The moving distance of the vehicle 20 may therefore be calculated from the corrected full frame coordinate values by multiplying a reference value by weights that depend on the position of each pixel.
  • the image correction unit 125 may further perform license plate correction after performing vertical and horizontal correction to the full frame image coordinate value (S169).
  • FIG. 9 is a detailed flowchart illustrating license plate correction of FIG. 6,
  • FIG. 10 is a diagram illustrating an operation of extracting license plate features of FIG. 9,
  • FIG. 11 is a diagram illustrating a vehicle model analysis of FIG. 9, and
  • FIG. 12 is a diagram illustrating the license plate height measurement of FIG. 9.
  • the image corrector 125 extracts features of the license plate of the target vehicle 20 from two captured images (S200).
  • the license plate features may be the size, color, aspect ratio, etc. of the license plate.
  • vehicle type analysis of the target vehicle 20 is performed from the two captured images (S210).
  • The vehicle model analysis classifies the vehicle 20 into large, medium, and small according to its size, and then performs a detailed vehicle classification.
  • The vehicle 20 may be classified as a passenger car, a bus, a truck, or another vehicle type, and the speed correction may be skipped for buses and performed only for the other vehicle types.
  • The license plate height d may be defined as the shortest distance from the bottom of the vehicle 20, that is, from the bottom of the tire of the vehicle 20, to a reference point of the license plate, for example its lower right corner. As shown in FIGS. 11A to 11D, the height differs depending on the vehicle type, and as shown in FIGS. 12A and 12B it may differ even within the same vehicle type.
  • Headlight position analysis of the vehicle 20 is then performed (S230).
  • The headlight analysis may estimate the width of the vehicle 20 from the interval between the headlights to distinguish large vehicles 20, and may analyze the position of the license plate reference point relative to the headlights of the vehicle 20.
  • License plate coordinate correction is then performed based on the previously obtained license plate features, vehicle type, license plate height, and headlight analysis results (S240).
  • The apparent size of the license plate may be enlarged by 5 to 15% as the vehicle 20 approaches. Accordingly, the coordinate correction of the pixel may be performed by weighting with the enlargement ratio, and the coordinate may be a reference point, for example a corner or the center of the license plate (see the license-plate correction sketch after this list).
  • The height d of the license plate from the ground varies with the vehicle model and the arrangement of the license plate, as shown in FIGS. 11 and 12, which gives rise to the vertical error described above.
  • Reference-point coordinates that compensate for this error can be generated by applying a weight according to the vehicle model and the license plate height.
  • the weight may be given only when the height d of the license plate is greater than or equal to a threshold.
  • the speed of the vehicle 20 may be calculated from the coordinates of the corrected reference point on which the license plate correction is performed, thereby obtaining data having improved reliability.
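
The capture-interval sketch below expresses the speed-dependent time difference described in the embodiment (160, 120, 80, or 40 msec depending on the first vehicle speed) as a simple lookup. It is illustrative only: the function name is invented here, and treating the band boundaries (60, 80, 100 km/h) as inclusive on the lower side is an assumption, since the description does not specify how exact boundary values are handled.

```python
def capture_interval_ms(first_vehicle_speed_kmh: float) -> int:
    """Return the time difference (msec) between the two photographs,
    chosen from the speed bands described in the embodiment.

    Boundary handling (<=) is an assumption; the description only gives bands.
    """
    if first_vehicle_speed_kmh <= 60:
        return 160
    elif first_vehicle_speed_kmh <= 80:
        return 120
    elif first_vehicle_speed_kmh <= 100:
        return 80
    else:
        return 40


# Example: a vehicle the radar measured at 92 km/h would be photographed
# twice, 80 msec apart.
print(capture_interval_ms(92.0))  # -> 80
```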
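
The coordinate-conversion sketch below illustrates how a second vehicle speed can be derived: the plate reference point is mapped from the first cropping image into the full frame coordinate system, the pixel displacement between the two captures is weighted for perspective, and the result is divided by the capture interval. The function names, the per-row scale table, and the default scale value are assumptions for illustration; the description specifies position-dependent weights but not their values.

```python
from typing import Dict, Tuple

Point = Tuple[float, float]  # (x, y) in pixels


def crop_to_full_frame(pt: Point, crop_origin: Point) -> Point:
    """Convert a reference point from first-cropping-image coordinates to
    full-frame coordinates by adding the crop's offset within the full frame."""
    return (pt[0] + crop_origin[0], pt[1] + crop_origin[1])


def metres_per_pixel_at(pt: Point, scale_table: Dict[int, float]) -> float:
    """Hypothetical perspective weight: a pixel near the top of the frame
    (far from the camera) covers more road than one near the bottom.
    scale_table maps a full-frame row index to a metres-per-pixel factor."""
    return scale_table.get(int(pt[1]), 0.05)  # assumed default scale


def second_vehicle_speed_kmh(p1: Point, crop1_origin: Point,
                             p2: Point, crop2_origin: Point,
                             interval_ms: float,
                             scale_table: Dict[int, float]) -> float:
    """Image-based (second) vehicle speed from the displacement of the plate
    reference point between two captures taken interval_ms apart."""
    f1 = crop_to_full_frame(p1, crop1_origin)
    f2 = crop_to_full_frame(p2, crop2_origin)
    # Weight the pixel displacement by the local scale (perspective correction),
    # evaluated at the midpoint of the motion.
    mid = ((f1[0] + f2[0]) / 2.0, (f1[1] + f2[1]) / 2.0)
    pixels = ((f2[0] - f1[0]) ** 2 + (f2[1] - f1[1]) ** 2) ** 0.5
    metres = pixels * metres_per_pixel_at(mid, scale_table)
    return metres / (interval_ms / 1000.0) * 3.6  # m/s -> km/h
```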
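
The speed-fusion sketch below expresses the final-speed rule: the first vehicle speed is used when the two measurements agree within the first threshold, their average is used within the second threshold, and the result is discarded beyond it. The 3% and 7% values come from the example in the description; measuring the deviation relative to the first vehicle speed, and the function name, are assumptions.

```python
from typing import Optional


def final_speed_kmh(first_speed: float, second_speed: float,
                    first_threshold: float = 0.03,
                    second_threshold: float = 0.07) -> Optional[float]:
    """Fuse the radar (first) and image (second) speeds.

    Returns the radar speed when the deviation is below the first threshold,
    the average of the two when it falls within the second threshold, and
    None (treated as an error) when it exceeds the second threshold.
    """
    deviation = abs(first_speed - second_speed) / first_speed
    if deviation < first_threshold:
        return first_speed
    if deviation <= second_threshold:
        return (first_speed + second_speed) / 2.0
    return None  # out of range: discard / report as an error


# Example: 102 km/h by radar and 106 km/h by image -> ~3.9% deviation,
# so the average (104 km/h) is used.
print(final_speed_kmh(102.0, 106.0))
```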
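
The enforcement-decision sketch below captures the condition that a vehicle is subject to enforcement only when both the radar-derived and the image-derived speeds exceed the enforcement (threshold) speed of the road; the function name and signature are assumptions.

```python
def is_enforcement_target(first_speed: float, second_speed: float,
                          enforcement_speed: float) -> bool:
    """True only when both independently measured speeds exceed the
    enforcement speed, which guards against a single-sensor error
    triggering a citation."""
    return first_speed > enforcement_speed and second_speed > enforcement_speed


print(is_enforcement_target(112.0, 109.0, enforcement_speed=100.0))  # True
print(is_enforcement_target(112.0, 96.0, enforcement_speed=100.0))   # False
```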
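
The license-plate correction sketch below illustrates the idea behind the plate corrections: the reference point is de-scaled by the apparent enlargement of the plate (roughly 5 to 15% as the vehicle approaches), and a height weight is applied only when the plate height d is at or above a threshold. The specific formula and the weight value are assumptions; the description states only that weights are applied.

```python
def corrected_reference_point(ref_pt, plate_width_first, plate_width_second,
                              plate_height_d, height_threshold,
                              height_weight=0.9):
    """Apply the two license-plate corrections sketched in the embodiment.

    - Scale correction: the plate appears larger in the second capture
      because the vehicle is closer, so the reference point is de-scaled
      by the observed enlargement ratio (linear scaling assumed).
    - Height correction: when the plate is mounted high (d >= threshold),
      a weight is applied to offset the vertical error (0.9 is assumed).
    """
    enlargement = plate_width_second / plate_width_first  # e.g. 1.05 to 1.15
    x, y = ref_pt
    x, y = x / enlargement, y / enlargement
    if plate_height_d >= height_threshold:
        y *= height_weight
    return (x, y)


# Example: a plate that grew from 120 to 132 pixels wide (10%) and sits
# 0.9 m above the road with a 0.8 m height threshold.
print(corrected_reference_point((640.0, 410.0), 120.0, 132.0, 0.9, 0.8))
```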

Abstract

In one embodiment, the invention provides a system for measuring the speed of vehicles traveling on multiple lanes, comprising: a radar that emits radio waves toward a multi-lane road and receives the waves reflected by vehicles traveling on the multi-lane road; a camera that is triggered by the presence of a speeding vehicle detected by the radar and captures images of that vehicle, as the target vehicle, with a time difference based on the lane in which the target vehicle is located; and a control unit that calculates a first vehicle speed on the basis of information about the target vehicle from the radar, calculates a second vehicle speed on the basis of information from the camera, and compares the first vehicle speed with the second vehicle speed to calculate the average speed of the target vehicle. Consequently, images of a specific vehicle among the vehicles on the multiple lanes are captured by means of a single camera, which reduces the number of cameras required. In addition, a case in which the speed detected by the radar and the speed obtained from image analysis do not match, and the difference between them is not within an error range, is treated as an error, and the speed obtained from image analysis is transmitted to a control center, thereby eliminating the speed error.
PCT/KR2016/015387 2015-12-30 2016-12-28 Système de radar et de fusion d'images pour l'application des règlements de la circulation routière WO2017116134A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20150190070 2015-12-30
KR10-2015-0190070 2015-12-30
KR10-2016-0173720 2016-12-19
KR1020160173720A KR101925293B1 (ko) 2015-12-30 2016-12-19 레이더 및 영상 융합 차량 단속시스템

Publications (1)

Publication Number Publication Date
WO2017116134A1 (fr) 2017-07-06

Family

ID=59225279

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/015387 WO2017116134A1 (fr) 2015-12-30 2016-12-28 Système de radar et de fusion d'images pour l'application des règlements de la circulation routière

Country Status (1)

Country Link
WO (1) WO2017116134A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040252193A1 (en) * 2003-06-12 2004-12-16 Higgins Bruce E. Automated traffic violation monitoring and reporting system with combined video and still-image data
KR20100003381A (ko) * 2008-07-01 2010-01-11 서울시립대학교 산학협력단 교통정보 수집장치
KR101288264B1 (ko) * 2012-01-20 2013-07-26 이구형 다차로 속도감지 시스템 및 그 방법
KR101291301B1 (ko) * 2013-02-28 2013-07-30 심광호 영상 및 레이더를 이용한 차량 속도 측정시스템
KR20140126188A (ko) * 2013-04-22 2014-10-30 오성레이저테크 (주) 레이저 방식 과속 단속 장치

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109444916A (zh) * 2018-10-17 2019-03-08 上海蔚来汽车有限公司 一种无人驾驶可行驶区域确定装置及方法
CN109444916B (zh) * 2018-10-17 2023-07-04 上海蔚来汽车有限公司 一种无人驾驶可行驶区域确定装置及方法
CN109886308B (zh) * 2019-01-25 2023-06-23 中国汽车技术研究中心有限公司 一种基于目标级别的双传感器数据融合方法和装置
CN109886308A (zh) * 2019-01-25 2019-06-14 中国汽车技术研究中心有限公司 一种基于目标级别的双传感器数据融合方法和装置
CN110444026A (zh) * 2019-08-06 2019-11-12 北京万集科技股份有限公司 车辆的触发抓拍方法及系统
CN111177297A (zh) * 2019-12-31 2020-05-19 信阳师范学院 一种基于视频和gis的动态目标速度计算优化方法
CN111177297B (zh) * 2019-12-31 2022-09-02 信阳师范学院 一种基于视频和gis的动态目标速度计算优化方法
CN111634290A (zh) * 2020-05-22 2020-09-08 华域汽车系统股份有限公司 高级驾驶辅助的前向融合系统及方法
CN111634290B (zh) * 2020-05-22 2023-08-11 华域汽车系统股份有限公司 高级驾驶辅助的前向融合系统及方法
CN112419712A (zh) * 2020-11-04 2021-02-26 同盾控股有限公司 道路断面车速检测方法及系统
CN113658427A (zh) * 2021-08-06 2021-11-16 深圳英飞拓智能技术有限公司 基于视觉与雷达的路况监控方法及系统、设备
CN114220285A (zh) * 2021-12-14 2022-03-22 中国电信股份有限公司 超速车辆的定位与示警方法、装置、电子设备及可读介质
CN115331457A (zh) * 2022-05-17 2022-11-11 重庆交通大学 一种车速管理方法及系统
CN115331457B (zh) * 2022-05-17 2024-03-29 重庆交通大学 一种车速管理方法及系统

Similar Documents

Publication Publication Date Title
WO2017116134A1 (fr) Système de radar et de fusion d'images pour l'application des règlements de la circulation routière
KR101925293B1 (ko) 레이더 및 영상 융합 차량 단속시스템
KR102267335B1 (ko) 객체와 감지 카메라의 거리차를 이용한 속도 검출 방법
US9886649B2 (en) Object detection device and vehicle using same
US10015394B2 (en) Camera-based speed estimation and system calibration therefor
KR101898051B1 (ko) 다차선 차량 속도 측정 시스템
KR101999993B1 (ko) 레이더 및 카메라를 이용한 무인 단속시스템
US10699567B2 (en) Method of controlling a traffic surveillance system
KR101446546B1 (ko) 위치기반 실시간 차량정보 표시시스템
US11367349B2 (en) Method of detecting speed using difference of distance between object and monitoring camera
US10163341B2 (en) Double stereoscopic sensor
EP3432265A1 (fr) Dispositif de traitement d'image, système de commande d'appareil, dispositif de capture d'image, procédé de traitement d'image, et programme
KR20160100788A (ko) 이동체의 이동속도 측정장치 및 그 방법
WO2020235734A1 (fr) Procédé destiné à estimer la distance à un véhicule autonome et sa position au moyen d'une caméra monoscopique
WO2020101071A1 (fr) Système de surveillance de circulation utilisant un lidar capable de fournir une notification d'obstacle routier et véhicule suivi
WO2022114455A1 (fr) Dispositif pour corriger un signal de position d'un véhicule autonome en utilisant des informations d'image de surface de roulement
KR102484688B1 (ko) 카메라와 레이더를 이용한 구간 단속 방법 및 구간 단속 시스템
WO2013022153A1 (fr) Appareil et procédé de détection de voie
KR102062579B1 (ko) 영상 보정을 통해 그림자 및 빛 반사로 훼손된 차량번호판을 인식하는 차량번호판 인식 시스템
CN117197779A (zh) 一种基于双目视觉的轨道交通异物检测方法、装置及系统
JPH07244717A (ja) 車両用走行環境認識装置
WO2020130209A1 (fr) Procédé et appareil de mesure de vitesse de véhicule à l'aide d'un traitement d'images
KR102418344B1 (ko) 교통정보 분석장치 및 방법
CN112406700B (zh) 一种基于上下双目视觉分析测距的盲区预警系统
KR102385907B1 (ko) 자율주행 차량 항법 장치 및 항법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16882086

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16882086

Country of ref document: EP

Kind code of ref document: A1