WO2022196884A1 - Vehicle detection system and vehicle detection method using a stereo camera and radar - Google Patents


Info

Publication number
WO2022196884A1
WO2022196884A1 (application PCT/KR2021/016266)
Authority
WO
WIPO (PCT)
Prior art keywords
radar
image
vehicle
camera
stereo camera
Prior art date
Application number
PCT/KR2021/016266
Other languages
English (en)
Korean (ko)
Inventor
김병성
박세경
김병철
최승운
Original Assignee
주식회사 바이다
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 바이다
Publication of WO2022196884A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/62 Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/625 License plates
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/133 Equalising the characteristics of different image components, e.g. their average brightness or colour balance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance

Definitions

  • Embodiments of the present invention relate to a vehicle detection system and a vehicle detection method using a stereo camera and radar.
  • In conventional systems, a camera or a radar is used to detect an object.
  • Camera-based object detection offers good detection performance, but the precision of distance and speed measurement for an object is low.
  • Radar-based object detection measures distance and speed accurately, but two adjacent objects may be mistaken for one, or an object may be falsely detected where none exists due to disturbance of the radar signal (electromagnetic wave).
  • Embodiments of the present invention may provide a vehicle detection system and vehicle detection method using a stereo camera and radar.
  • embodiments of the present invention provide a vehicle detection system and a vehicle detection method capable of accurately detecting a vehicle in a dynamic road environment in which vehicle speed, vehicle distance, and the spacing between vehicles constantly vary.
  • embodiments of the present invention may provide a vehicle detection system and a vehicle detection method capable of accurately recognizing and reading a desired item by variously controlling a stereo camera.
  • Embodiments of the present invention provide a vehicle detection system using a stereo camera and a radar, comprising: an image sensing unit that generates image-based detection result data for a vehicle using at least one of a first image and a second image obtained through the stereo camera (e.g., including a depth map), while additionally using information obtained from radar data generated by the radar; a radar detection unit that generates radar-based detection result data for the vehicle using the radar data generated by the radar, while additionally using information obtained from the at least one image; and a fusion processing unit that generates final vehicle detection result data using the image-based detection result data and the radar-based detection result data.
  • Embodiments of the present invention may provide a vehicle detection system using a stereo camera and radar.
  • the vehicle detection system may include: a stereo camera configured to acquire at least one of a first image and a second image of the surroundings; a radar that transmits a radar signal forward and generates radar data based on a signal received according to the transmission of the radar signal; and a fusion detection controller configured to detect a vehicle through mutual analysis between at least one image and radar data.
  • the fusion detection control unit may include: an image sensing unit configured to generate image-based detection result data for a vehicle using at least one image but additionally using information obtained from radar data; a radar detector for generating radar-based detection result data for a vehicle using radar data but additionally using information obtained from at least one image; and a fusion processing unit that corrects the image-based detection result data and the radar-based detection result data to generate final vehicle detection result data.
  • the image sensing unit extracts a primary depth map from the at least one image, sets a region of interest in the primary depth map based on moving-object position information provided from the radar detection unit, detects a moving object within the region of interest, extracts a secondary depth map reflecting the detection result of the moving object, recognizes the object type of the moving object based on the secondary depth map, recognizes the moving object as a vehicle according to the object-type recognition result, recognizes the license plate of the vehicle to read the license plate number from it, measures an image-based speed for the vehicle, and generates image-based detection result data including the license plate number and the image-based speed.
  • the radar detection unit detects one track ID or two or more track IDs using the radar data, determines the one track ID as a valid track ID or determines a valid track ID among the two or more track IDs based on per-location distance information provided from the image detection unit, detects a moving object based on the valid track ID, calculates a radar-based speed for the moving object, and generates radar-based detection result data including the valid track ID and the radar-based speed.
  • the fusion processing unit extracts an image for each valid track ID, finally determines the vehicle speed based on the image-based speed and the radar-based speed, and generates and outputs final vehicle detection result data including the image for each valid track ID, the vehicle number, the vehicle speed, and the vehicle type.
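As an illustrative sketch only (the publication does not disclose concrete data structures or thresholds), the fusion step described above might combine the two detection results as follows; all class and field names, and the 5 km/h agreement threshold, are assumptions:

```python
from dataclasses import dataclass

@dataclass
class ImageResult:
    plate_number: str        # read from the license plate
    image_speed_kph: float   # image-based speed

@dataclass
class RadarResult:
    track_id: int            # valid track ID
    radar_speed_kph: float   # radar-based speed

def fuse(img: ImageResult, rad: RadarResult) -> dict:
    # Prefer the radar speed (more precise per the description) when the
    # two measurements roughly agree; otherwise average them.
    if abs(img.image_speed_kph - rad.radar_speed_kph) < 5.0:
        final_speed = rad.radar_speed_kph
    else:
        final_speed = (img.image_speed_kph + rad.radar_speed_kph) / 2.0
    return {
        "track_id": rad.track_id,
        "plate_number": img.plate_number,
        "speed_kph": round(final_speed, 1),
    }
```

A real implementation would also attach the per-track image and the recognized vehicle type to the output record.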
  • the stereo camera includes a first camera for acquiring a first image and a second camera for acquiring a second image; the first camera, the second camera, and the radar are installed on a road structure, the first camera being installed on one side of the radar and the second camera on the other side of the radar.
  • the system may further include a first light and a second light installed on the road structure, wherein the first light is located between the first camera and the radar, and the second light is located between the second camera and the radar.
  • the imaging range of the first camera may overlap the detection range of the radar, and the imaging range of the second camera may overlap the detection range of the radar.
  • the vehicle detection system may further include a first pan-tilt device that provides a pan, tilt-up, or tilt-down operation of the first camera and a second pan-tilt device that provides a pan, tilt-up, or tilt-down operation of the second camera, according to a camera behavior control signal provided from the fusion detection control unit, and may further include a pan-tilt device that provides a pan, tilt-up, or tilt-down operation of the radar according to a radar behavior control signal provided from the fusion detection control unit.
  • the vehicle detection system may further include a pan-tilt device that is installed on a road structure and provides a pan operation, a tilt-up operation, or a tilt-down operation of the first camera, the second camera, and the radar.
  • the first camera, the second camera and the radar are all mounted on the pan-tilt device, so they can move at the same time.
  • the fusion detection controller independently controls the first camera and the second camera, outputting a first control signal to the first camera to control at least one of whether the first camera operates, image color, image brightness, and shooting range, and outputting a second control signal to the second camera to control at least one of whether the second camera operates, image color, image brightness, and shooting range.
  • the first camera may operate in the first time zone or day time zone to obtain a color image as the first image.
  • the second camera may operate in the second time zone or night time zone to obtain a black-and-white image as the second image.
  • the first camera and the second camera may operate together in the first time zone or the day time zone to acquire a first image and a second image that are both color images, and may operate together in the second time zone or the night time zone to acquire a first image and a second image that are both black-and-white images.
  • the image sensing unit in the fusion detection control unit may acquire a vehicle image for vehicle recognition from a color image, recognize a vehicle license plate from a black-and-white image, and read the vehicle number from the vehicle license plate.
  • the first camera may acquire a first image according to a first brightness setting value.
  • the second camera may acquire a second image darker than the first image according to a second brightness setting value lower than the first brightness setting value.
  • the image sensor may acquire a vehicle image for vehicle recognition from the first image, recognize the vehicle license plate from the second image, and read the vehicle number from the vehicle license plate.
  • the image sensing unit compares the license plate recognized from the first image with the license plate recognized from the second image to recognize a final license plate and read the vehicle number from it, or compares the vehicle number read from the license plate recognized from the first image with the vehicle number read from the license plate recognized from the second image to determine the final vehicle number.
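The two-reading comparison above could be sketched as follows; the publication does not specify a merging rule, so the character-by-character agreement logic and the `(plate, confident)` return shape are assumptions for illustration only:

```python
def resolve_plate(plate_a: str, plate_b: str):
    """Merge two license-plate readings from the two cameras.

    Returns (final_plate_or_None, confident).
    """
    if plate_a == plate_b:
        return plate_a, True          # both cameras agree: accept
    if len(plate_a) != len(plate_b):
        return None, False            # ambiguous; defer to a later frame
    # Mark disagreeing positions with '?'; a real system would instead pick
    # the character with the higher per-character OCR confidence.
    merged = "".join(a if a == b else "?" for a, b in zip(plate_a, plate_b))
    return merged, False
```

The low-confidence partial result lets the system retry on a subsequent frame rather than discard the detection.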
  • the first camera may acquire a first image of a first angle of view.
  • the second camera may acquire a second image of a second angle of view that is narrower than the first angle of view.
  • the image sensor may acquire a vehicle image for vehicle recognition from the first image, recognize the vehicle license plate from the second image, and read the vehicle number from the vehicle license plate.
  • the acquisition target (e.g., a vehicle image or a license plate image) and the type of detection information (e.g., vehicle information such as vehicle type, vehicle shape, vehicle color, vehicle number, etc.) may be controlled.
  • for each of the first camera 111a and the second camera 111b, the following may be controlled: whether it operates (whether it shoots), operation time (daytime, night, first time zone, second time zone, etc.), image color (color image or black-and-white image), image brightness (bright image or dark image), and angle of view (wide angle or narrow angle).
  • the vehicle detection system may further include a failure diagnosis unit that determines whether each of the stereo camera and the radar has failed, based on whether the stereo camera acquires an image and whether the radar detects a signal, and, when at least one of the stereo camera and the radar is determined to have failed, transmits a failure report message including the failure information using preset receiver information.
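The failure diagnosis just described can be sketched as a small health check; the message format and field names are assumptions, since the publication only states that a failure report is sent using preset receiver information:

```python
import json
import time

def diagnose(camera_acquires_image: bool, radar_detects_signal: bool,
             receiver: str):
    """Return a failure report message, or None if both sensors are healthy."""
    failed = []
    if not camera_acquires_image:
        failed.append("stereo_camera")
    if not radar_detects_signal:
        failed.append("radar")
    if not failed:
        return None
    # Build the report for the preset receiver (JSON shape is illustrative).
    return json.dumps({
        "receiver": receiver,
        "failed_sensors": failed,
        "timestamp": int(time.time()),
    })
```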
  • Embodiments of the present invention may provide a vehicle detection method using a stereo camera and radar.
  • the vehicle detection method includes: an image acquisition step of acquiring at least one of a first image and a second image of the surroundings through a stereo camera; a radar data acquisition step of acquiring radar data based on a signal received according to the transmission of the radar signal through the radar; and a vehicle detection step of detecting a vehicle through mutual analysis between at least one image and radar data.
  • the vehicle detection step may include: an image detection step of generating image-based detection result data for a vehicle using at least one image but additionally using information obtained from radar data; a radar detection step of generating radar-based detection result data for a vehicle using radar data but additionally using information obtained from at least one image; and a fusion processing step of generating final vehicle detection result data by correcting the image-based detection result data and the radar-based detection result data.
  • the image sensing step may include: extracting a primary depth map from the at least one image; setting a region of interest in the primary depth map based on moving-object position information obtained from the radar data; detecting a moving object within the region of interest; extracting a secondary depth map reflecting the detection result of the moving object; and recognizing the object type of the moving object based on the secondary depth map, recognizing the moving object as a vehicle according to the object-type recognition result, recognizing the license plate of the vehicle to read the vehicle number from it, measuring an image-based speed for the vehicle, and generating image-based detection result data including the vehicle number and the image-based speed.
  • the radar detection step may include: detecting one track ID or two or more track IDs using the radar data; determining the one track ID as a valid track ID or determining a valid track ID among the two or more track IDs based on per-location distance information obtained from the at least one image; detecting a moving object based on the valid track ID; and calculating a radar-based speed for the moving object and generating radar-based detection result data including the valid track ID and the radar-based speed.
  • in the fusion processing step, images for each valid track ID are extracted, the vehicle speed is finally determined based on the image-based speed and the radar-based speed, and final vehicle detection result data including the images for each valid track ID, the vehicle number, the vehicle speed, and the vehicle type can be generated and output.
  • a vehicle detection system and a vehicle detection method can be provided that accurately detect a vehicle in a dynamic road environment in which vehicle speed, vehicle distance, and the spacing between vehicles constantly vary.
  • FIG. 1 is a block diagram of a vehicle detection system according to embodiments of the present invention.
  • FIG. 2 is a flowchart of a vehicle detection method according to embodiments of the present invention.
  • FIG. 3 is a detailed flowchart of a vehicle detection method according to embodiments of the present invention.
  • FIG. 4 is an exemplary view in which a vehicle detection system according to embodiments of the present invention is installed on a road structure.
  • FIG. 5 is an exemplary view illustrating installation of a stereo camera and radar included in a vehicle detection system according to embodiments of the present invention.
  • FIG. 6 is another exemplary installation view of a stereo camera and radar included in a vehicle detection system according to embodiments of the present invention.
  • FIG. 7 is a diagram for explaining independent control of a stereo camera and a radar by a fusion detection controller in a vehicle detection system according to embodiments of the present invention.
  • FIGS. 8 to 12 are diagrams for explaining a method of using a stereo camera of a vehicle detection system according to embodiments of the present invention.
  • FIG. 13 is a diagram for explaining a malfunction diagnosis function of a stereo camera and a radar included in a vehicle detection system according to embodiments of the present invention.
  • when a temporal precedence relationship such as "after", "subsequent to", "next", or "before" is described, it may include non-continuous cases unless "immediately" or "directly" is used.
  • likewise, when a flow precedence relationship is described, it may include non-continuous cases unless "immediately" or "directly" is used.
  • FIG. 1 is a block diagram of a vehicle detection system 100 according to embodiments of the present invention.
  • a vehicle detection system 100 may include a heterogeneous sensor system 110 and a fusion detection controller 120 .
  • the heterogeneous sensor system 110 may include a stereo camera 111 and a radar 112 .
  • the stereo camera 111 may acquire at least one of a first image and a second image of the surroundings.
  • the radar 112 may transmit a radar signal forward and generate radar data based on a signal received according to the transmission of the radar signal.
  • the fusion detection controller 120 may detect a vehicle through mutual analysis between at least one image and radar data.
  • the fusion detection controller 120 may include an image detector 121 , a radar detector 122 , a fusion processor 123 , and a controller 124 .
  • the image sensing unit 121 may generate image-based detection result data of the vehicle using at least one image but additionally using information obtained from radar data.
  • the radar detector 122 may generate radar-based detection result data for a vehicle using radar data but additionally using information obtained from at least one image.
  • the fusion processing unit 123 may generate final vehicle detection result data by correcting the image-based detection result data and the radar-based detection result data.
  • the controller 124 may control the operation of one or more of the image detector 121 , the radar detector 122 , and the fusion processor 123 .
  • the controller 124 may control the respective operations of the stereo camera 111 and the radar 112 .
  • the image sensing unit 121 extracts a primary depth map from at least one image, and sets a region of interest in the primary depth map based on the moving object position information provided from the radar sensing unit 122 .
  • the image sensing unit 121 may acquire the first image and the second image, from which a depth map is implemented through the stereo camera and an image processing algorithm.
  • the image sensing unit 121 detects a moving object within the set region of interest, extracts a secondary depth map reflecting the detection result of the moving object, recognizes the object type of the moving object based on the secondary depth map, recognizes the moving object as a vehicle according to the object-type recognition result, reads the vehicle number by recognizing the vehicle's license plate, and measures the image-based speed of the vehicle.
  • the image sensor 121 may generate image-based detection result data including a vehicle number and an image-based speed.
  • the above-described image detection unit 121 may include an image tracking unit that detects a vehicle (object) based on the image, tracks the vehicle, and determines the center of the vehicle, and a vehicle number recognition unit that recognizes the vehicle license plate and reads the vehicle number from the license plate.
  • the vehicle number recognition unit may generate and output enforcement data through vehicle number plate recognition and vehicle number reading.
  • the radar detection unit 122 detects one track ID or two or more track IDs using the radar data and, based on the per-location distance information provided from the image detection unit 121, determines the one track ID as a valid track ID or determines a valid track ID among the two or more track IDs.
  • the radar detector 122 may detect a moving object based on the valid track ID and calculate a radar-based speed of the moving object.
  • the radar detector 122 may generate radar-based detection result data including the valid track ID and the radar-based speed.
  • the fusion processing unit 123 extracts an image for each valid track ID, finally determines the vehicle speed based on the image-based speed and the radar-based speed, and generates and outputs final vehicle detection result data including the image for each valid track ID, the vehicle number, the vehicle speed, and the vehicle type.
  • the image detector 121 may be included in the stereo camera 111 .
  • the radar detector 122 may be included in the radar 112 .
  • the fusion processing unit 123 may be included in the stereo camera 111 or the radar 112 .
  • the vehicle detection system 100 detects a vehicle using the stereo camera 111 and the radar 112 , and utilizes characteristics of the stereo camera 111 and the radar 112 , respectively. This will be described below.
  • the stereo camera 111 may exhibit excellent performance in object detection, but there is a limit to increasing the precision in measuring distance and speed to an object in an outdoor environment due to resolution and separation distance between the cameras.
  • the radar 112 shows excellent performance in measuring an accurate distance and speed.
  • with the radar 112, however, two adjacent vehicles may be detected as one, or a vehicle may be falsely detected where none exists due to disturbance of the radar signal (electromagnetic wave).
  • the vehicle detection system 100 normally operates the stereo camera 111 at a low resolution to reduce system load while detecting a vehicle by the vision method, and operates the stereo camera 111 at a high resolution only during real-time detection, concentrating the system load where it is needed.
  • when vision-based detection using the stereo camera 111 and detection using the radar 112 are performed simultaneously, the vehicle detection system 100 captures a high-resolution image, reads the vehicle number from that image, and can use the captured high-resolution image as evidence for the vehicle number.
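The resolution-switching behavior above can be sketched as a tiny controller; the specific resolution values and class name are illustrative assumptions, not from the publication:

```python
class StereoCameraController:
    # Resolutions are illustrative; the publication does not specify values.
    LOW = (640, 480)     # normal operation: low system load
    HIGH = (1920, 1080)  # real-time detection: evidence-quality images

    def __init__(self):
        self.resolution = self.LOW

    def on_detection(self, vehicle_present: bool):
        # Switch to high resolution only while a vehicle is being detected,
        # then drop back to low resolution to keep the system load down.
        self.resolution = self.HIGH if vehicle_present else self.LOW
```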
  • FIG. 2 is a flowchart of a vehicle detection method according to embodiments of the present invention.
  • a vehicle detection method according to embodiments is a vehicle detection method using the stereo camera 111 and the radar 112, and may include an image acquisition step (S210) of acquiring at least one of a first image and a second image of the surroundings through the stereo camera 111, a radar data acquisition step (S220) of acquiring radar data based on a signal received according to the transmission of a radar signal through the radar 112, and a vehicle detection step (S230) of detecting a vehicle through mutual analysis between the at least one image and the radar data.
  • the image acquisition step S210 and the radar data acquisition step S220 may be performed simultaneously. Alternatively, the image acquisition step S210 may be performed first, and the radar data acquisition step S220 may be performed later. Alternatively, the radar data acquisition step S220 may be performed first, and the image acquisition step S210 may be performed later.
  • the above-described vehicle detection step S230 may be performed by the fusion detection controller 120 .
  • after the vehicle detection step (S230), the method may further include a fault diagnosis step (S240) of determining whether each of the stereo camera 111 and the radar 112 has failed, based on whether an image is acquired from the stereo camera 111 and whether a signal is detected from the radar 112, and, when at least one of the stereo camera 111 and the radar 112 is determined to have failed, transmitting a failure report message including the failure information using preset receiver information.
  • FIG. 3 is a detailed flowchart of a vehicle detection method according to embodiments of the present invention.
  • the vehicle detection step S230 may include an image detection step S310 , a radar detection step S320 , and a fusion processing step S330 .
  • in the image detection step (S310), the fusion detection controller 120 generates image-based detection result data for the vehicle using the at least one image while additionally using information obtained from the radar data.
  • in the radar detection step (S320), the fusion detection control unit 120 may generate radar-based detection result data for the vehicle using the radar data while additionally using information obtained from the at least one image.
  • in the fusion processing step (S330), the fusion detection control unit 120 may generate final vehicle detection result data by correcting the image-based detection result data and the radar-based detection result data.
  • the image sensing step (S310) may include: extracting a primary depth map from the at least one image (S311); setting a region of interest in the primary depth map based on the moving-object position information obtained from the radar data (S313); detecting a moving object within the region of interest (S315); extracting a secondary depth map reflecting the detection result of the moving object (S317); and recognizing the object type of the moving object based on the secondary depth map, recognizing the moving object as a vehicle according to the object-type recognition result, recognizing the license plate of the vehicle to read the vehicle number from it, measuring the image-based speed of the vehicle, and generating image-based detection result data including the vehicle number and the image-based speed (S319).
  • in step S313, the image detection unit 121 sets the region of interest in the primary depth map based on the moving-object position information obtained from the radar data by the radar detection unit 122, which can significantly reduce the amount of computation required for setting the region of interest.
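The computation saving in step S313 comes from restricting later processing to a small window around the radar-reported position. A minimal sketch, in which the window size and the pixel-coordinate projection are assumptions (the publication does not specify them):

```python
def set_roi(map_shape, radar_xy, half_size=64):
    """Clamp a square region of interest around the radar-reported position.

    map_shape: (height, width) of the primary depth map.
    radar_xy:  (x, y) pixel position projected from the radar track.
    half_size: half the ROI side length in pixels (assumed value).
    """
    h, w = map_shape
    x, y = radar_xy
    top, bottom = max(0, y - half_size), min(h, y + half_size)
    left, right = max(0, x - half_size), min(w, x + half_size)
    # Moving-object detection then scans only this window instead of the
    # full depth map, which is where the computation saving comes from.
    return top, bottom, left, right
```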
  • the radar detection step ( S320 ) includes: detecting one track ID or two or more track IDs using radar data ( S321 ); determining one track ID as a valid track ID or determining a valid track ID among two or more track IDs based on distance information for each location obtained from at least one image (S323); detecting a moving object based on a valid track ID (S325); and calculating the radar-based speed for the moving object, and generating radar-based detection result data including the valid track ID and the radar-based speed (S327).
  • in step S323, the radar detection unit 122 determines the valid track ID based on the per-location distance information obtained from the at least one image by the image detection unit 121; this can significantly reduce the amount of computation required for determining the valid track ID, prevent false detection, and separate large-object track IDs exceeding the lane width.
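The validity check above can be sketched as cross-checking each radar track's range against the image-derived range; the 2 m tolerance and the dictionary shapes are assumptions for illustration, not values from the publication:

```python
def validate_tracks(radar_ranges, image_ranges, tol_m=2.0):
    """Keep only track IDs whose radar range matches the image-derived range.

    radar_ranges: {track_id: range in metres from the radar}
    image_ranges: {track_id: range in metres from the depth map}
    tol_m:        agreement tolerance in metres (assumed value).
    """
    valid = []
    for track_id, radar_m in radar_ranges.items():
        image_m = image_ranges.get(track_id)
        # Tracks with no image counterpart, or with a large range mismatch
        # (e.g., ghost tracks from signal disturbance), are discarded.
        if image_m is not None and abs(radar_m - image_m) <= tol_m:
            valid.append(track_id)
    return valid
```

A fuller version would also compare the track's lateral extent against the lane width to split merged tracks, as the text mentions.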
  • in the fusion processing step (S330), the fusion processing unit 123 in the fusion detection control unit 120 extracts images for each valid track ID, finally determines the vehicle speed based on the image-based speed and the radar-based speed, and can generate and output final vehicle detection result data including the image for each valid track ID, the vehicle number, the vehicle speed, and the vehicle type.
  • FIG. 4 is an exemplary view in which the vehicle detection system 100 according to embodiments of the present invention is installed on a road structure 400 .
  • the vehicle detection system 100 may be installed on a road structure 400 installed on or around a road.
  • a heterogeneous sensor system 110 including the stereo camera 111 and the radar 112 is installed on the road structure 400 at a position capable of detecting vehicles traveling on the road.
  • the road structure 400 may include pillars 410 and 420 erected on one or both sides of the road and an upper bar 430 connected to the tops of the pillars 410 and 420.
  • the upper bar 430 may be installed at the top of the road and cross all lanes of the road.
  • the heterogeneous sensor system 110 may be installed on the upper bar 430 of the road structure 400, and the fusion detection control unit 120 may be installed on one of the pillars 410 and 420 of the road structure 400.
  • the heterogeneous sensor system 110 and the fusion detection control unit 120 may be connected to each other in a wired or wireless manner.
  • the fusion detection control unit 120 may be connected to the control center device 450 .
  • the control center device 450 may include a computer and a recording medium, receives various information or messages and various data (including images) provided by the fusion detection control unit 120, and can store and manage the received data in the recording medium.
  • FIG. 5 is an exemplary installation view of the stereo camera 111 and the radar 112 included in the vehicle detection system 100 according to embodiments of the present invention.
  • FIG. 6 is another exemplary installation view of the stereo camera 111 and the radar 112 included in the vehicle detection system 100.
  • the stereo camera 111 in the heterogeneous sensor system 110 may include a first camera 111a for acquiring a first image and a second camera 111b for acquiring a second image.
  • the first camera 111a, the second camera 111b, and the radar 112 may be installed on the upper bar 430 of the road structure 400 .
  • the first camera 111a may be installed on one side of the radar 112, and the second camera 111b may be installed on the other side of the radar 112. That is, the radar 112 is located in the center, with the first camera 111a and the second camera 111b located on either side of the radar 112.
  • the vehicle detection system 100 may further include a first light 501 and a second light 502 installed on the upper bar 430 of the road structure 400 .
  • the first illumination 501 may be positioned between the first camera 111a and the radar 112 .
  • the second light 502 may be positioned between the second camera 111b and the radar 112 .
  • the first light 501 and the second light 502 may illuminate the photographing direction of the first camera 111a and the second camera 111b.
  • the first camera 111a and the second camera 111b may be able to acquire images of good quality even at night.
  • the first light 501 and the second light 502 may illuminate the crosswalk to help the safety of pedestrians.
  • a photographing range of the first camera 111a may overlap a detection range of the radar 112 . Accordingly, vision sensing using the first camera 111a and sensing using the radar 112 can be efficiently fused.
  • the photographing range of the second camera 111b may overlap the detection range of the radar 112 . Accordingly, vision sensing using the second camera 111b and sensing using the radar 112 can be efficiently fused.
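The overlap between the cameras' shooting ranges and the radar's detection range is what makes the fusion efficient: a detection of a vehicle from one sensor can be associated with a detection of the same vehicle from the other. The patent does not specify an association rule, so the following is only an illustrative sketch; the nearest-neighbour matching, the field names, and the 1.5 m gating threshold are all assumptions:

```python
# Illustrative camera-radar fusion for overlapping fields of view.
# Matching rule and threshold are assumptions, not taken from the patent.

def fuse_detections(camera_boxes, radar_targets, max_offset_m=1.5):
    """Pair each radar target with the nearest camera detection.

    camera_boxes:  list of dicts with estimated ground position {'x': m, 'y': m}
    radar_targets: list of dicts {'x': m, 'y': m, 'speed': m/s}
    Returns fused tracks combining the vision position with the radar speed.
    """
    fused = []
    for target in radar_targets:
        best, best_dist = None, max_offset_m
        for box in camera_boxes:
            # Euclidean distance between the two position estimates
            dist = ((target['x'] - box['x']) ** 2 +
                    (target['y'] - box['y']) ** 2) ** 0.5
            if dist < best_dist:
                best, best_dist = box, dist
        if best is not None:
            fused.append({'position': (best['x'], best['y']),
                          'speed': target['speed']})
    return fused
```

A radar target with no camera detection within the gate is simply dropped from the fused output in this sketch; a real system would track it separately.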
  • the vehicle detection system 100 may include a first pan-tilt device 511a for providing a pan operation, a tilt-up operation, or a tilt-down operation of the first camera 111a, and a second pan-tilt device 511b for providing a pan operation, a tilt-up operation, or a tilt-down operation of the second camera 111b, each according to the camera behavior control signal provided from the fusion detection controller 120 .
  • the vehicle detection system 100 may also include a pan-tilt device 512 for providing a pan operation, a tilt-up operation, or a tilt-down operation of the radar 112 according to the radar behavior control signal provided from the fusion detection control unit 120 .
  • the first camera 111a , the second camera 111b, and the radar 112 can each perform a pan operation, a tilt-up operation, or a tilt-down operation individually and independently by means of the respective pan-tilt devices 511a , 511b , and 512 .
  • the vehicle detection system 100 may further include a pan-tilt device 600 installed on the upper bar 430 of the road structure 400 for providing a pan operation, a tilt-up operation, or a tilt-down operation of the first camera 111a , the second camera 111b, and the radar 112 together.
  • the first camera 111a , the second camera 111b , and the radar 112 are all mounted on the pan-tilt device 600 and can move at the same time.
  • the first light 501 may move in synchronization with a pan operation, a tilt-up operation, or a tilt-down operation of the first camera 111a. That is, the first light 501 may perform a pan operation, a tilt-up operation, or a tilt-down operation by the pan-tilt device to illuminate the photographing direction of the first camera 111a.
  • the second light 502 may move in synchronization with a pan operation, a tilt-up operation, or a tilt-down operation of the second camera 111b. That is, the second light 502 may perform a pan operation, a tilt-up operation, or a tilt-down operation by the pan-tilt device to illuminate the photographing direction of the second camera 111b.
  • FIG. 7 is a view for explaining independent control of the stereo camera 111 and the radar 112 by the fusion detection controller 120 in the vehicle detection system 100 according to embodiments of the present invention.
  • the fusion detection controller 120 may independently control the first camera 111a and the second camera 111b.
  • the fusion detection control unit 120 may output a first control signal CS1 to the first camera 111a to control one or more of: whether the first camera 111a operates, its image color, its image brightness, and its shooting range (angle of view).
  • the fusion detection control unit 120 may output a second control signal CS2 to the second camera 111b to control one or more of: whether the second camera 111b operates, its image color, its image brightness, and its shooting range (angle of view).
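The per-camera control signals CS1 and CS2 can be pictured as small settings messages. The message fields below are assumptions for illustration; the patent names only the controllable items (operation on/off, image color, image brightness, shooting range / angle of view):

```python
# Hedged sketch of the independent control signals CS1 / CS2 issued by the
# fusion detection control unit. Field names and default values are assumed.

def make_control_signal(enabled=True, color='color',
                        brightness=128, view_angle_deg=60):
    """Build one per-camera control message covering the four listed items."""
    return {'enabled': enabled, 'color': color,
            'brightness': brightness, 'view_angle_deg': view_angle_deg}

# Example: a wide bright view for camera 111a, a darker narrow view for 111b
cs1 = make_control_signal(view_angle_deg=90)
cs2 = make_control_signal(brightness=64, view_angle_deg=30)
```

Because the two signals are built and sent independently, the two cameras can be driven with entirely different settings at the same time, which is the basis of the brightness and angle-of-view schemes described below.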
  • the first image and the second image obtained from the first camera 111a and the second camera 111b included in the stereo camera 111 may be configured in various ways.
  • the first image and the second image may be composed of a color image and a color image, respectively, or the first image and the second image may be composed of a color image and a black-and-white image, respectively.
  • the first image and the second image may be composed of two high-resolution images, of a high-resolution image and a low-resolution image, or of two low-resolution images, respectively.
  • the first image and the second image may be respectively composed of a wide-angle image captured with a wide angle of view and a narrow-angle image captured with a narrow angle of view.
  • the first image and the second image may be configured as a normal image and a thermal image, respectively.
  • FIGS. 8 to 12 are diagrams for explaining a method of using the stereo camera 111 of the vehicle detection system 100 according to embodiments of the present invention.
  • FIGS. 8 and 9 are diagrams illustrating the operation and shooting control of the stereo camera 111 for each time zone.
  • the first camera 111a may acquire a color image as a first image by operating in a first time zone or a day time zone.
  • the second camera 111b may obtain a black-and-white image as the second image by operating in the second time zone or the night time zone.
  • the image sensing unit 121 in the fusion detection control unit 120 may acquire a vehicle image for vehicle recognition from the first image, which is a color image.
  • the image detection unit 121 in the fusion detection control unit 120 may recognize the vehicle license plate from the first image, which is a color image, and read the vehicle number from the license plate.
  • the image detection unit 121 in the fusion detection control unit 120 may recognize the vehicle license plate from the second image, which is a black-and-white image, and read the vehicle number from the license plate.
  • the first camera 111a and the second camera 111b may operate together during the first time zone or the day time zone to obtain a first image and a second image that are both color images.
  • the first camera 111a and the second camera 111b may operate together in the second time zone or the night time zone to acquire a first image that is a black-and-white image and a second image that is a black-and-white image, respectively.
  • the image sensing unit 121 in the fusion detection control unit 120 may acquire a vehicle image for vehicle recognition from both the first image and the second image, which are color images.
  • the image detection unit 121 in the fusion detection control unit 120 may recognize the license plate from both the first image and the second image, which are black-and-white images, and read the vehicle number from the recognized license plate.
  • the image sensing unit 121 may acquire a vehicle image from a color image.
  • the image sensing unit 121 may acquire an image for license plate recognition and license plate reading from a low-light black-and-white image, while simultaneously acquiring the shape, the color, or the license plate image of the vehicle.
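The per-time-zone operation described above can be sketched as a simple mode schedule. This is a hedged illustration of one of the described variants (both cameras operating in both time zones, in color during the day and in black-and-white at night); the 06:00–18:00 boundary times are assumptions not stated in the patent:

```python
# Illustrative day/night mode schedule for the two cameras, as one possible
# realisation of the time-zone control described in the embodiments.

from datetime import time

def camera_modes(now, day_start=time(6, 0), day_end=time(18, 0)):
    """Return (first-camera mode, second-camera mode) for the given time."""
    if day_start <= now < day_end:
        # first time zone (day): both cameras capture color images
        return ('color', 'color')
    # second time zone (night): both cameras capture black-and-white images
    return ('mono', 'mono')
```

The alternative variant in which camera 111a runs only by day and camera 111b only by night would simply return an enabled/disabled flag per camera instead of two modes.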
  • FIGS. 10 and 11 are diagrams illustrating image brightness control of the stereo camera 111 .
  • the first camera 111a may acquire a first image according to a first brightness setting value.
  • the second camera 111b may acquire a second image darker than the first image according to a second brightness setting value lower than the first brightness setting value. That is, the first image acquired by the first camera 111a may have a first brightness, and the second image acquired by the second camera 111b may have a second brightness darker than the first brightness.
  • the image sensing unit 121 may acquire a vehicle image for vehicle recognition from a bright first image having a first brightness.
  • the image sensing unit 121 may recognize the license plate from the dark second image having the second brightness and read the vehicle number from the license plate.
  • the image sensing unit 121 may compare the license plate recognized from the bright first image with the license plate recognized from the dark second image to determine a final license plate, and read the vehicle number from that final license plate.
  • the image sensing unit 121 may compare the vehicle number read from the license plate recognized in the bright first image with the vehicle number read from the license plate recognized in the dark second image to determine the final vehicle number.
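The comparison of the two readings can be sketched as a small reconciliation rule. The patent only states that the two vehicle numbers are compared to obtain a final one; the agreement-then-fallback rule below is an illustrative assumption:

```python
# Hedged sketch: reconcile the vehicle number read from the bright first
# image with the one read from the dark second image. The voting rule is
# an assumption for illustration only.

def final_plate_number(bright_reading, dark_reading):
    """Return the final vehicle number from the two exposure readings."""
    if bright_reading and bright_reading == dark_reading:
        return bright_reading          # both exposures agree: high confidence
    # otherwise fall back to whichever exposure produced a reading at all
    return bright_reading or dark_reading
```

Running the bright exposure against reflective plates and the dark exposure against non-reflective ones (or vice versa) is what lets at least one branch produce a usable reading in mixed-plate traffic.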
  • with the above-described image brightness control, accurate recognition of the license plate and accurate reading of the vehicle number may be possible for both reflective and non-reflective license plates. For example, even if an attempt is made to obstruct the reading of the license plate, such as attaching a reflective sheet to it, accurate recognition of the license plate and accurate reading of the vehicle number may still be possible through the above-described image brightness control.
  • if one of the first and second cameras 111a and 111b is set to low brightness and the other is set to high brightness, the license plate image can be obtained more accurately regardless of whether the plate is reflective or non-reflective.
  • even where old non-reflective license plates and new reflective license plates are in mixed use, it may be easier to acquire the license plate image.
  • FIG. 12 is a view showing the angle-of-view control of the stereo camera 111 .
  • the first camera 111a may acquire a first image of a first angle of view
  • the second camera 111b may acquire a second image of a second angle of view that is narrower than the first angle of view. That is, the first image is a wide-angle image compared to the second image, and the second image is a narrow-angle image compared to the first image.
  • the first camera 111a may have a wide-angle lens or include a digital angle-of-view control module
  • the second camera 111b may have a narrow-angle lens or include a digital angle-of-view control module.
  • the image sensing unit 121 may acquire a vehicle image for vehicle recognition from the first image, which is a wide-angle image.
  • the image sensing unit 121 may recognize the license plate from the second image, which is a narrow-angle image, and read the license plate number from the license plate.
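The wide/narrow division of labour above can be sketched as a two-stage pipeline: the wide-angle first image is used to detect vehicles, and the narrow-angle second image, which magnifies part of the same scene, is used to read the plates. The `detect_vehicles` and `read_plate` callables below are hypothetical placeholders standing in for the image sensing unit's internal recognisers:

```python
# Hedged sketch of the wide-angle / narrow-angle processing split.
# detect_vehicles and read_plate are hypothetical placeholders, passed in
# so the pipeline itself stays independent of any particular recogniser.

def process_frame(wide_image, narrow_image, detect_vehicles, read_plate):
    """Detect vehicles in the wide view, then read each plate in the narrow view."""
    vehicles = detect_vehicles(wide_image)          # type/shape/colour context
    plates = [read_plate(narrow_image, v) for v in vehicles]
    return list(zip(vehicles, plates))
```

The wide image keeps whole vehicles in frame for classification, while the narrow image devotes its full pixel budget to the plate region, which is why the split helps both tasks at once.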
  • depending on the acquisition target (e.g., vehicle image, vehicle license plate image) and the type of detection information (e.g., vehicle information such as vehicle type, vehicle shape, vehicle color, vehicle number, etc.), the following can be set for each of the first camera 111a and the second camera 111b: whether it operates (whether it shoots), its operation time (daytime, night, first time zone, second time zone, etc.), image color (color image, black-and-white image), image brightness (bright image, dark image), angle of view (wide angle, narrow angle), and the like.
  • FIG. 13 is a view for explaining a malfunction diagnosis function of the stereo camera 111 and the radar 112 included in the vehicle detection system 100 according to embodiments of the present invention.
  • the vehicle detection system 100 may further include a failure diagnosis unit 1300 that diagnoses a failure of the vehicle detection system 100 and reports it to a control center or an administrator if necessary.
  • the failure diagnosis unit 1300 may monitor whether an image is acquired from the stereo camera 111 and whether a signal is detected from the radar 112 , and, based on the monitoring result, determine whether each of the stereo camera 111 and the radar 112 has failed.
  • when the fault diagnosis unit 1300 determines that at least one of the stereo camera 111 and the radar 112 has failed, it can transmit a fault report message (FRM: Fault Report Message) including fault information using preset recipient information.
  • the preset recipient information may be network identification information (eg, a domain address, an IP address, etc.) of the control center device 450 or a mobile communication phone number of an administrator.
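The failure-diagnosis loop described above amounts to watching two liveness timestamps and building an FRM when either sensor goes silent. The following is a minimal sketch under stated assumptions: the FRM field names, the 5-second silence threshold, and the documentation-range recipient address are all illustrative, not taken from the patent:

```python
# Hedged sketch of the failure diagnosis unit 1300: if the stereo camera
# stops delivering images or the radar stops delivering signals for longer
# than a timeout, build a Fault Report Message (FRM) for a preset recipient.
# Field names, timeout, and recipient address are assumptions.

import time

def diagnose(last_frame_ts, last_radar_ts, now=None, timeout_s=5.0,
             recipient='203.0.113.10'):
    """Return an FRM dict if a fault is detected, or None if all is well."""
    now = time.time() if now is None else now
    faults = []
    if now - last_frame_ts > timeout_s:
        faults.append('stereo camera: no image acquired')
    if now - last_radar_ts > timeout_s:
        faults.append('radar: no signal detected')
    if faults:
        return {'type': 'FRM', 'to': recipient, 'faults': faults}
    return None
```

In practice the recipient would come from the preset recipient information (a control center network address or an administrator's mobile number), and the returned message would be handed to the wired or wireless link to the control center device 450.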
  • embodiments of the present invention provide a vehicle detection system 100 and a vehicle detection method capable of accurately detecting a vehicle in a dynamic road environment in which vehicle speeds and distances change constantly and the distance between vehicles also varies in various ways.
  • embodiments of the present invention also provide a vehicle detection system 100 and a vehicle detection method in which the stereo camera 111 can be variously controlled so as to accurately recognize and read a desired target.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

Embodiments of the present invention relate to a vehicle detection system and a vehicle detection method that can detect a vehicle by the combined use of a stereo camera and a radar, in order to accurately detect a vehicle in a dynamic road environment in which vehicle speeds and distances change continuously and the distances between vehicles can also change in various ways.
PCT/KR2021/016266 2021-03-15 2021-11-09 Système de détection de véhicule et procédé de détection de véhicule utilisant une caméra stéréo et un radar WO2022196884A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0033338 2021-03-15
KR1020210033338A KR102484691B1 (ko) 2021-03-15 2021-03-15 스테레오 카메라 및 레이더를 이용한 차량 감지 시스템 및 차량 감지 방법

Publications (1)

Publication Number Publication Date
WO2022196884A1 true WO2022196884A1 (fr) 2022-09-22

Family

ID=83320684

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/016266 WO2022196884A1 (fr) 2021-03-15 2021-11-09 Système de détection de véhicule et procédé de détection de véhicule utilisant une caméra stéréo et un radar

Country Status (2)

Country Link
KR (1) KR102484691B1 (fr)
WO (1) WO2022196884A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102618924B1 (ko) * 2023-10-18 2023-12-28 한화시스템(주) Fmcw 레이다 기반의 객체 탐지 시스템 및 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001126184A (ja) * 1999-10-29 2001-05-11 Matsushita Electric Ind Co Ltd ナンバープレート自動認識装置と車両速度測定方法
KR20140000403A (ko) * 2012-06-22 2014-01-03 경북대학교 산학협력단 스테레오 카메라를 이용한 차량 속도 검출 장치 및 방법
KR101403936B1 (ko) * 2013-02-20 2014-06-27 (주)인펙비전 차량촬영시스템
KR20170080480A (ko) * 2015-12-30 2017-07-10 건아정보기술 주식회사 레이더 및 영상 융합 차량 단속시스템
KR102092936B1 (ko) * 2019-11-22 2020-03-26 (주)알티솔루션 레이더를 이용한 무인 교통단속시스템 및 방법

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5632762B2 (ja) * 2011-01-25 2014-11-26 パナソニック株式会社 測位情報形成装置、検出装置、及び測位情報形成方法
JP6299182B2 (ja) * 2013-11-28 2018-03-28 サクサ株式会社 死活監視システム
JP6825569B2 (ja) * 2015-09-30 2021-02-03 ソニー株式会社 信号処理装置、信号処理方法、およびプログラム
KR101907875B1 (ko) * 2017-03-16 2018-10-15 주식회사 만도 퓨전 탐지 시스템, 탐지 프로세서, 레이더 장치 및 물체 탐지 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001126184A (ja) * 1999-10-29 2001-05-11 Matsushita Electric Ind Co Ltd ナンバープレート自動認識装置と車両速度測定方法
KR20140000403A (ko) * 2012-06-22 2014-01-03 경북대학교 산학협력단 스테레오 카메라를 이용한 차량 속도 검출 장치 및 방법
KR101403936B1 (ko) * 2013-02-20 2014-06-27 (주)인펙비전 차량촬영시스템
KR20170080480A (ko) * 2015-12-30 2017-07-10 건아정보기술 주식회사 레이더 및 영상 융합 차량 단속시스템
KR102092936B1 (ko) * 2019-11-22 2020-03-26 (주)알티솔루션 레이더를 이용한 무인 교통단속시스템 및 방법

Also Published As

Publication number Publication date
KR20220128779A (ko) 2022-09-22
KR102484691B1 (ko) 2023-01-05

Similar Documents

Publication Publication Date Title
WO2018105842A1 (fr) Système de détection d'incident à haute précision basé sur un radar
WO2016186458A1 (fr) Système de collecte d'informations d'images et procédé de collecte d'informations d'images sur des objets mobiles
WO2016153233A1 (fr) Dispositif lidar
WO2019182355A1 (fr) Téléphone intelligent, véhicule et caméra ayant un capteur d'image thermique, et procédé d'affichage et de détection l'utilisant
WO2017116134A1 (fr) Système de radar et de fusion d'images pour l'application des règlements de la circulation routière
WO2022196884A1 (fr) Système de détection de véhicule et procédé de détection de véhicule utilisant une caméra stéréo et un radar
WO2016167499A1 (fr) Appareil de photographie et procédé permettant de commander un appareil de photographie
WO2017195965A1 (fr) Appareil et procédé de traitement d'image en fonction de la vitesse d'un véhicule
WO2016024680A1 (fr) Boîte noire de véhicule permettant de reconnaître en temps réel une plaque d'immatriculation de véhicule en mouvement
WO2020189831A1 (fr) Procédé de surveillance et de commande de véhicule autonome
WO2022039323A1 (fr) Dispositif pour le zoomage et la mise au point à grande vitesse d'une caméra fournissant en continu des images de haute qualité par suivi et prédiction d'un objet mobile à grande vitesse, et procédé pour le zoomage et la mise au point à grande vitesse d'une caméra l'utilisant
KR100467143B1 (ko) 거리측정장치
WO2022196883A1 (fr) Procédé de commande de section et système de commande de section utilisant un appareil de prise de vues et un radar
WO2020130209A1 (fr) Procédé et appareil de mesure de vitesse de véhicule à l'aide d'un traitement d'images
WO2020218716A1 (fr) Dispositif de stationnement automatique et procédé de stationnement automatique
KR20220065384A (ko) 실시간 항구 영상과 선박 식별 장치 신호를 이용한 선박 입출항 감시 시스템
WO2020230921A1 (fr) Procédé d'extraction de caractéristiques d'une image à l'aide d'un motif laser, et dispositif d'identification et robot l'utilisant
WO2023146071A1 (fr) Appareil de contrôle de conditions dans un véhicule extérieur de zone latérale avant
WO2019151704A1 (fr) Procédé et système de mesure de visibilité tridimensionnelle à l'aide d'une caméra de surveillance de la circulation
WO2022231039A1 (fr) Système de feu de signalisation intelligent mobile
WO2018074707A1 (fr) Capteur de pluie pour véhicule et dispositif d'entraînement d'essuie-glace de véhicule équipé de celui-ci
JP2002092751A (ja) 監視システム
WO2012108577A1 (fr) Dispositif et procédé de surveillance
WO2017003257A1 (fr) Dispositif et procédé permettant de reconnaître un état de surface de route
WO2023027538A1 (fr) Véhicule autonome

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21931825

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 13/02/2024)