CN108226917B - High-precision emergency detection system based on radar - Google Patents

Info

Publication number
CN108226917B
CN108226917B
Authority
CN
China
Prior art keywords
radar
emergency
target object
detection
road
Legal status
Active
Application number
CN201710593264.6A
Other languages
Chinese (zh)
Other versions
CN108226917A (en)
Inventor
朴宰亨
李才均
金正大
Current Assignee
Metabuild Co ltd
Original Assignee
Metabuild Co ltd
Application filed by Metabuild Co ltd filed Critical Metabuild Co ltd
Publication of CN108226917A publication Critical patent/CN108226917A/en
Application granted granted Critical
Publication of CN108226917B publication Critical patent/CN108226917B/en

Classifications

    • G01S13/52 Discriminating between fixed and moving objects or between objects moving at different speeds
    • G01S13/56 Discriminating between fixed and moving objects or between objects moving at different speeds, for presence detection
    • G01S13/66 Radar-tracking systems; analogous systems
    • G01S13/867 Combinations of radar systems with non-radar systems: combination of radar systems with cameras
    • G01S13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G08G1/0175 Detecting movement of traffic to be counted or controlled: identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Abstract

The invention assigns an identification ID to each target object on the road and analyzes the pattern of the target object's speed data while tracking it, so that vehicles and pedestrians can be determined more rapidly and accurately. Further, since a plurality of radar pulses are transmitted in one transmission cycle and the pulse widths of the plurality of radar pulses are set according to the road line shape, an emergency can be detected more accurately regardless of the road line shape. In addition, a single target is detected for each target object, so that targets are processed accurately and the signal processing performance for radar signals is improved.

Description

High-precision emergency detection system based on radar
Technical Field
The present invention relates to a radar-based high-precision emergency detection system, and more particularly, to a radar-based high-precision emergency detection system capable of sensing an emergency that may occur on a road using a radar sensor.
Background
Road traffic shared by vehicles and people always carries the potential for accidents, and a large number of traffic accidents and dangerous situations occur nationwide every day. Particularly in a very-high-speed driving environment such as an expressway, or at night, the driver's awareness of the surrounding situation is likely to be significantly reduced, which can lead to serious traffic accidents. Although various technologies for sensing conditions on a road exist, the conventional techniques are limited to sensing an object ahead using an infrared camera installed on the road or the vehicle and notifying the driver, and this method suffers from low sensing efficiency.
Accordingly, attention is increasingly focused on techniques for grasping the road surface state of a road or for sensing an accident, a stationary vehicle, a pedestrian, a wrong-way (reverse-traveling) vehicle, a falling object, or the like on the road.
Disclosure of Invention
Technical problem
It is an object of the present invention to provide a radar-based high-precision emergency detection system that is capable of more accurately detecting an emergency on a road.
Technical scheme
The radar-based high-precision emergency detection system comprises: a radar sensor unit that is installed around a road, transmits a wireless signal onto the road, and receives the wireless signal reflected from a target object on the road; a control unit that receives the wireless signal from the radar sensor unit and generates sensing information about an emergency from the received signal; a communication unit that transmits the sensing information generated by the control unit, through wired or wireless communication, to at least one external terminal among a terminal carried by a driver or a pedestrian, a terminal installed in a vehicle, and a traffic condition control center that manages road conditions; and a tracking unit that tracks the target object based on the sensing information generated by the control unit.
Technical effects
According to the present invention, an identification ID is assigned to a target object on the road and the pattern of the target object's speed data is analyzed while the target object is tracked, so that vehicles and pedestrians can be determined more quickly and accurately.
Further, a plurality of radar pulses are transmitted in one transmission cycle, and pulse widths of the plurality of radar pulses are set according to the road line shape, so that an emergency can be sensed accurately regardless of the road line shape.
Further, the detection regions of two radar pulses that are continuously transmitted are partially overlapped, so that detection errors occurring at the edge portion where the detection regions of the radar pulses end are minimized, and accuracy can be improved.
In addition, an X-Y axis coordinate system is set with the distance values and velocity values of a plurality of signals received from one object as references, the detection target data are placed in the cells of this X-Y coordinate system, and the target information appearing in contiguous distance and velocity cells is then processed into one target through a clustering (target clustering) process, thereby enabling accurate and precise detection of the target object.
In addition, a single target is detected for each target object, so that targets are processed accurately and the signal processing performance for radar signals is improved.
Drawings
FIG. 1 is a block diagram showing the construction of a radar-based high-precision emergency detection system of the present invention;
fig. 2 is a block diagram showing a configuration of the radar sensor section shown in fig. 1;
fig. 3 is a block diagram showing a configuration of the control section shown in fig. 1;
FIG. 4 is a flow chart showing a radar-based high-precision emergency detection method according to an embodiment of the present invention;
FIG. 5 is a flow chart showing an example of an emergency detection method using multiple radar pulses in accordance with an embodiment of the present invention;
FIG. 6 is a diagram showing an example of a transmission state of multiple radar pulses of the present invention;
FIG. 7 is a diagram showing an example of detecting an emergency from a detection area with respect to multiple radar pulses according to an embodiment of the present invention;
FIG. 8 is a diagram showing another example of detecting an emergency from a detection region with respect to multiple radar pulses according to an embodiment of the present invention;
FIG. 9 is a flow chart showing another example of an emergency situation detection method using multiple radar pulses in accordance with an embodiment of the present invention;
FIG. 10 is a diagram showing another example of a transmission state of multiple radar pulses of the present invention;
FIG. 11 is a flow chart illustrating a signal processing method according to an embodiment of the present invention;
fig. 12 is a schematic view of a clustering chart showing a state of performing a target clustering process in the signal processing method shown in fig. 11;
fig. 13 is a diagram showing a cluster map representing a state in which a process of extracting target data is performed in the signal processing method shown in fig. 11;
fig. 14 is a schematic diagram showing a method for detecting a fixed obstacle in a radar-based high-precision emergency detection method according to another embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described below with reference to the drawings.
Fig. 1 is a block diagram showing a control configuration of a radar-based high-precision emergency detection system according to the present invention, fig. 2 is a block diagram showing a configuration of a radar sensor unit shown in fig. 1, and fig. 3 is a block diagram showing a configuration of a control unit shown in fig. 1.
Referring to fig. 1 to 3, the high-precision emergency detection system based on radar of the present invention includes a radar sensor unit 10, a control unit 20, a communication unit 40, a tracking unit 50, and a terminal unit 60.
The radar sensor section 10 is a sensor that is provided around a road and transmits a wireless signal to the road and receives a wireless signal reflected by a target object on the road.
As shown in fig. 2, a plurality of the radar sensor units 10 may be disposed around the road at predetermined intervals from each other. The radar sensor unit 10 includes a transmission antenna 12, a reception antenna 13, a transmission/reception unit 11, a waveform generation unit 14, and a signal processing unit 15. Each of the plurality of radar sensor units 10 receives only the reflections of the signals it transmitted itself, and filters out reflected signals originating from neighboring radar sensor units 10 instead of receiving them.
The control unit 20 receives the wireless signal from the radar sensor unit 10 and generates sensing information about an emergency from the received wireless signal. The control unit 20 can detect the emergency and classify the emergency detection type. The emergency detection types may include stationary vehicles, slow-moving vehicles, speeding vehicles, reverse-traveling vehicles, jammed or stuck vehicles, falling objects, pedestrians, and the like.
As shown in fig. 3, the control unit 20 includes a radar interface 21, a collected information analyzing module 22, an emergency determining module 23, a storage module 70, a tracking unit control module 28, a tracking unit interface 29, an external terminal interface 30, and an external system contact module 31.
The radar interface 21 is a module that interconnects the radar sensor section 10 and the control section 20. The radar interface 21 receives a signal from the radar sensor unit 10 and transmits a control command to the radar sensor unit 10.
The collected information analysis module 22 converts a signal received through the radar interface 21 into data and classifies the converted data to generate sensing information. The collected information analysis module 22 assigns a unique identification ID to the target object from the wireless signal received by the radar sensor section 10, and the sensed information includes data on the position, speed, and size of the identification ID. By giving a unique identification ID to the target object, the target object can be tracked based on the given identification ID. The plurality of radar sensor units 10 can track one target object over a wide area by transmitting and receiving the identification ID to and from each other.
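As a non-limiting sketch, the assignment of a unique identification ID can be illustrated as follows, assuming a simple nearest-neighbour association between the detections of successive frames; the 5 m matching gate and the one-dimensional position model are assumptions for illustration only.

```python
# Sketch of identification-ID assignment: a detection that lies close to a
# track from the previous frame keeps that track's ID; otherwise it receives
# a new unique ID. The 5 m matching gate is an illustrative assumption.

import itertools

_next_id = itertools.count(1)

def assign_ids(tracks, detections, gate=5.0):
    """tracks: {id: position_m}; detections: list of positions (m along road).
    Returns {id: position_m} for the new frame."""
    new_tracks = {}
    unmatched = dict(tracks)
    for pos in detections:
        # Find the closest existing track within the matching gate.
        best = min(unmatched, key=lambda i: abs(unmatched[i] - pos), default=None)
        if best is not None and abs(unmatched[best] - pos) <= gate:
            new_tracks[best] = pos            # same target object, ID persists
            del unmatched[best]
        else:
            new_tracks[next(_next_id)] = pos  # new target object, new ID
    return new_tracks
```

Because the ID persists across frames, a downstream module (or a neighbouring radar sensor unit receiving the ID) can follow one target object over a wide area, as the description states.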
The emergency determination module 23 determines an emergency, tracks the target object, or extracts traffic information according to the sensing information generated by the collected information analysis module 22. The emergency determination module 23 determines whether an emergency has occurred based on the data about position, speed, and size collected by the collected information analysis module 22.
The storage module 70 is a module that stores the sensed information generated by the collected information analysis module 22 or the information processed by the emergency determination module 23.
The storage module 70 includes an emergency storage module 24, an emergency image database 25, an emergency information collection module 26, and an emergency information database 27. In the present embodiment, the case where the emergency video and the emergency information are stored in different storage spaces is described, but the present invention is not limited thereto, and the emergency video and the emergency information may be stored in one storage space.
The tracking unit control module 28 is a module that controls the tracking unit 50 so as to track the target object based on the information determined by the emergency determination module 23.
The tracking unit interface 29 transmits and receives signals to and from the tracking unit 50.
The external terminal interface 30 transmits and receives signals to and from an external terminal 60. The terminal unit 60 includes at least one of a terminal unit of a driver or a pedestrian, a terminal unit provided in a vehicle, a traffic condition control center for managing road conditions, and a display unit installed on the road. The display unit installed on the road includes a variable message sign (electronic display board), a delineator, or the like.
The external system contact module 31 is a module for contacting data with a system of a police station or a road traffic management department other than the terminal.
The communication unit 40 transmits the sensing information generated by the control unit 20 to the terminal 60 by wired or wireless communication. The communication unit 40 and the terminal 60 may be connected via an external communication network.
The tracking unit 50 tracks the target object based on the sensing information generated by the control unit 20. In the present embodiment, the tracking unit 50 is exemplified by a camera (not shown) capable of capturing an image of the emergency situation or the like. A plurality of the cameras (not shown) may be provided around the road so that the condition on the road can be photographed according to the control signal of the tracking part control module 28, and the photographed image is transmitted to the control part 20. That is, when the control unit 20 determines that an emergency has occurred, the tracking unit 50 can image and track the target object to which the identification ID is assigned, based on the control signal of the control unit 20.
A method of sensing a pedestrian by the radar-based emergency detection method according to the embodiment of the present invention is explained with reference to fig. 4 as follows.
First, when the detection area of the radar sensor unit 10 is set, the radar sensor unit 10 transmits a wireless signal to the detection area, and when a target object such as a vehicle or a pedestrian is present on the road, receives a wireless signal reflected from the target object (S1, S2).
The collected information analysis module 22 generates detection target data on a position, a velocity, and a size from the movement information of the target object and analyzes the data pattern (S3).
Here, the detection target data is constituted by a plurality of data for one target object. In the case of directly processing the plurality of detected target data, one target may be displayed as a plurality of targets, and therefore a target clustering process of classifying the same target is required. The method for processing the detection target data is described in detail later with reference to fig. 11 to 13.
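As a non-limiting sketch, the target clustering idea can be illustrated in Python as follows: detections falling in adjacent distance/velocity cells are merged into one target by a connected-component pass over the occupied cells. The cell sizes and the 4-neighbour connectivity rule are assumptions for illustration only; the actual processing is described with reference to figs. 11 to 13.

```python
# Sketch of the target-clustering step: detections that land in adjacent
# distance/velocity cells are merged into a single target. Cell sizes and
# 4-connectivity are illustrative assumptions, not taken from the patent.

DIST_CELL = 1.0   # metres per distance cell (assumed)
VEL_CELL = 0.5    # m/s per velocity cell (assumed)

def cluster_targets(detections):
    """detections: list of (distance_m, velocity_mps) pairs. Returns a list
    of clusters, each a list of the detections merged into one target."""
    # Map each detection to its (distance, velocity) grid cell.
    cells = {}
    for d, v in detections:
        cells.setdefault((int(d // DIST_CELL), int(v // VEL_CELL)), []).append((d, v))

    clusters, seen = [], set()
    for start in cells:
        if start in seen:
            continue
        # Flood-fill over 4-connected occupied cells (same target object).
        stack, members = [start], []
        seen.add(start)
        while stack:
            cx, cy = stack.pop()
            members.extend(cells[(cx, cy)])
            for nb in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                if nb in cells and nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        clusters.append(members)
    return clusters
```

For example, two reflections at (100.2 m, 10.1 m/s) and (101.2 m, 10.4 m/s) land in adjacent cells and are merged into one target, while a reflection at (250 m, -5 m/s) remains a separate target.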
The collected information analysis module 22 processes the detection target data and assigns a unique identification ID to the identified target object (S4).
The emergency determination module 23 analyzes the pattern of the speed data generated above. The pattern of the speed data is analyzed by distinguishing an increase or decrease in the speed and comparing the speed with a preset speed.
The emergency determination module determines whether the speed is within an absolute value range of a preset stationary threshold (S5).
The absolute value range of the stationary threshold value is set in advance in a speed range in which the target object can be determined to be stationary.
If the speed is within the absolute value range of the preset stationary threshold value, the size is compared with a preset vehicle threshold value (S6).
The vehicle threshold value is set in consideration of the size of a typical vehicle.
When the size is greater than the vehicle threshold value, it is determined that the target object is a stationary vehicle (S7).
That is, when the speed is within the absolute value range of the stationary threshold value and the size is greater than the vehicle threshold value, the target object can be determined to be a stationary vehicle.
Further, when the size is equal to or smaller than the vehicle threshold value, the target object is smaller than a vehicle, and it is therefore determined whether it is a pedestrian or a falling object (S8).
In the case where separation of the wireless signal with respect to the target object is sensed while the target object is being tracked, auxiliary identification IDs may be assigned to the separated objects, and the distance, size, and speed variation between the separated objects may be compared to determine whether one of them has dropped from the target object (S9). That is, it can be determined that one of the separated objects is a vehicle and the other is a falling object, so the occurrence of a falling object can be detected.
In addition, a continuously moving target object whose speed exceeds the stationary threshold value and whose size is smaller than the vehicle threshold value can be determined to be a pedestrian (S10).
The emergency determination module determines whether the speed is a positive value or a negative value (S11).
The speed of the target object has a positive value when it travels in the forward direction toward the radar sensor unit, and a negative value when it travels in the reverse direction away from the radar sensor unit.
When the speed is judged to be a negative value, the size is compared with the vehicle threshold value (S12).
When the speed is negative and the size is greater than the vehicle threshold value, the target object can be determined to be a reverse-traveling vehicle (S13).
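The decision logic of steps S5 to S13 can be summarized, as a non-limiting sketch, by the following function; the numeric threshold values are assumptions for illustration only, since the disclosure leaves them to be preset.

```python
# Sketch of the classification logic of steps S5-S13. The threshold values
# are illustrative assumptions; the patent leaves them to prior configuration.

STATIONARY_THRESHOLD = 0.5   # m/s: |speed| at or below this means stationary (assumed)
VEHICLE_THRESHOLD = 2.0      # m: size above this means vehicle-sized (assumed)

def classify_target(speed, size):
    """speed in m/s (positive = forward toward the radar, negative = reverse),
    size in m. Returns one of the emergency detection types."""
    if abs(speed) <= STATIONARY_THRESHOLD:           # S5: object is stationary
        if size > VEHICLE_THRESHOLD:                 # S6/S7: stationary vehicle
            return "stationary vehicle"
        return "pedestrian or falling object"        # S8: needs pattern analysis (S9)
    if speed < 0 and size > VEHICLE_THRESHOLD:       # S11-S13: wrong-way vehicle
        return "reverse-traveling vehicle"
    if size <= VEHICLE_THRESHOLD:                    # S10: moving, smaller than a car
        return "pedestrian"
    return "normally moving vehicle"                 # no emergency
```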
The emergency determination module 23 generates a corresponding alarm and transmits the alarm to the storage module 70 and the external terminal interface 30 when determining that the target object is any one of a stationary vehicle, a pedestrian, a falling object, and a reverse driving vehicle.
The storage module 70 stores the position information of the target object and the alarm.
The external terminal interface 30 sends the location information of the target object and the alarm to the terminal 60.
The present embodiment is explained taking the terminal 60 as the traffic condition control center. The traffic condition control center receives the position information and the alarm from the control unit and can take the corresponding countermeasures.
In addition, the present invention is not limited to the above embodiment; the tracking unit 50 may capture an image of the target object, and the control unit 20 may transmit the captured image of the target object to the terminal 60. The terminal 60 includes a display unit installed on the road, such as a variable message sign (electronic display board) or a delineator.
Fig. 5 is a flowchart illustrating an example of an emergency situation detection method using multiple radar pulses according to an embodiment of the present invention, and fig. 6 is a diagram illustrating a transmission state of multiple radar pulses according to an embodiment of the present invention.
Referring to fig. 5, the radar interface 21 of the control unit 20 may set the radar pulse transmitted from the radar sensor unit 10 to a multi-radar pulse (S21).
Multi-radar pulse means that the radar sensor unit 10 transmits a radar pulse P a plurality of times in one transmission cycle.
Referring to fig. 6, the radar sensor unit 10 transmits n radar pulses in one transmission cycle. The n radar pulses are transmitted at predetermined time intervals within the one transmission period. The number of radar pulses may be set differently according to the length of the road on which an emergency is to be detected and the respective widths W1, W2, W3, ..., Wn of the radar pulses. That is, the number of radar pulses is set in proportion to the length of the road.
The radar interface 21 sets the widths W1, W2, W3, ..., Wn of the n radar pulses P to be different from one another. The widths W1, W2, W3, ..., Wn of the radar pulses are set so that the widths increase with the passage of time. That is, the n radar pulses, whose width increases over time, are set so as to transmit the wireless signal sequentially from the time point S at which one transmission period T1 starts. Since the width of a radar pulse is proportional to the output power and strength of the wireless signal, the greater the width of the radar pulse, the farther the wireless signal can be transmitted.
The radar interface 21 sets the widths W1, W2, W3, ..., Wn of the radar pulses in accordance with the line shape of the road, i.e., the geometric shape of the road. The radar interface 21 sets the widths of the radar pulses according to the proportion of straight sections included in the road. That is, when the length of the straight sections included in the road is equal to or greater than a preset ratio, the road is determined to be a straight road, and the widths W1, W2, W3, ..., Wn of the n radar pulses are set to increase from the first width W1, which is the minimum, to the n-th width Wn, which is the maximum. When the length of the straight sections is less than the preset ratio, the road is determined to be a curved road, and the widths W1, W2, W3, ..., Wn of the n radar pulses still increase from the first width W1, but an upper limit width is set in proportion to the length of the straight sections instead of increasing up to the n-th width Wn. That is, the shorter the straight sections, the smaller the maximum width of the radar pulses.
An example in which the radar sensor unit 10 transmits three pulses, a first radar pulse P1, a second radar pulse P2, and a third radar pulse P3, in one transmission cycle will be described below. When the road is a straight road, the first radar pulse P1 is set to a first width W1 that is the minimum, the second radar pulse P2 is set to a second width W2 greater than the first width W1, and the third radar pulse P3 is set to a third width W3 greater than the second width W2. The larger the pulse width, the greater the strength of the radio signal; the signal can therefore travel a longer distance and cover a wider detection area. The first radar pulse P1 is a short pulse that can reach a range of about 0 to 200 m, the second radar pulse P2 is a medium pulse that can reach a range of about 200 to 600 m, and the third radar pulse P3 is a long pulse that can reach a range of about 600 to 1000 m. That is, when the road is a straight road, everything from the short pulse to the long pulse can be used, because signals transmitted to about 600 m or more can still be received reliably.
The radar sensor unit 10 transmits the three radar pulses P1, P2, and P3 in one transmission cycle; when the road is a curved road, only short pulses may be applied, or short and medium pulses may be used selectively. That is, when the road is a curved road, a signal transmitted from the radar sensor unit over a long distance of about 600 m or more cannot reach the intended position precisely, and a reception error may occur.
The method of dividing the road into the straight road and the curved road may be divided according to a ratio of a length of a straight section included in the road in the total length of the road, and may be previously stored in the database, etc. through a preliminary investigation. The total length of the road corresponds to a length of a detection area that can be detected by the radar sensor section 10.
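As a non-limiting sketch, the selection of pulse widths from the road line shape can be illustrated as follows; the candidate widths and the 50% straight-road cutoff are assumptions for illustration, the disclosure stating only that the upper limit width is proportional to the length of the straight sections.

```python
# Sketch of setting the widths W1..Wn of the radar pulses from the road line
# shape. The candidate widths and the 50% straight-road cutoff are
# illustrative assumptions; the patent states only that the upper limit width
# is proportional to the straight-section length.

WIDTHS = [0.1e-6, 0.4e-6, 1.0e-6]  # short/medium/long pulse widths in s (assumed)

def pulse_widths(straight_ratio, straight_cutoff=0.5):
    """straight_ratio: straight-section length / total road length (0..1).
    Returns the list of pulse widths used in one transmission cycle,
    increasing over time within the cycle."""
    if straight_ratio >= straight_cutoff:
        return WIDTHS                      # straight road: short through long pulse
    # Curved road: cap the maximum width in proportion to the straight sections.
    n = max(1, round(straight_ratio / straight_cutoff * len(WIDTHS)))
    return WIDTHS[:n]
```

A road that is 80% straight thus uses all three pulses, while a sharply curved road with only a short straight section falls back to the short pulse alone.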
As shown in fig. 7, the radar interface 21 is set such that detection areas of two radar pulses sequentially transmitted from among the detection areas of the plurality of radar pulses overlap each other. The radar interface 21 may set the detection regions of the radar pulses to overlap when the width of the radar pulse is set.
When the number and widths of the radar pulses have been set by the above method, the radar sensor unit transmits the wireless signals of the n radar pulses onto the road with the set widths.
The radar sensor unit 10 receives a plurality of reflection signals reflected from the detection areas of the radio signals of the n radar pulses on the road (S22).
The collected information analysis module 22 and the emergency determination module 23 detect the presence or absence of an emergency based on the received plurality of reflected signals (S23).
The emergency determination module 23 can analyze the received multiple reflected signals to determine whether an emergency exists.
When the emergency is detected, the emergency determination module 23 determines whether or not the detection area in which the emergency is detected is the overlapping area (S24).
The overlap region is a region in which detection regions of two radar pulses sequentially transmitted from the plurality of radar pulses overlap.
Fig. 7 is a diagram showing an example of detection of an emergency from a detection area with respect to multiple radar pulses.
Referring to fig. 7, at least a portion of the detection region of the first radar pulse P1 and the detection region of the second radar pulse P2 overlap to form an overlap region O1. Also, the detection region of the second radar pulse P2 and the detection region of the third radar pulse P3 overlap at least partially to form an overlap region O2. In general, detection errors often occur in the most marginal areas in the detection areas of the radar pulses. In this way, when the detection regions of the two radar pulses overlap, it is possible to reduce the occurrence of detection errors in the edge-most region of the detection regions of the radar pulses.
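As a non-limiting sketch, reporting a target only once when it is detected inside an overlap region by two consecutively transmitted pulses can be illustrated as follows; the 3 m merge tolerance is an assumption for illustration.

```python
# Sketch of handling the overlap regions O1, O2: a target detected by two
# consecutively transmitted pulses at nearly the same position is reported
# once rather than twice. The 3 m merge tolerance is an illustrative assumption.

def merge_overlap_detections(detections, tol=3.0):
    """detections: list of (pulse_index, position_m) pairs. Returns the
    de-duplicated positions: detections from consecutive pulses within tol
    metres of each other are treated as one target in the overlap region."""
    merged = []
    for pulse, pos in sorted(detections, key=lambda d: d[1]):
        duplicate = any(
            abs(pos - kept_pos) <= tol and abs(pulse - kept_pulse) == 1
            for kept_pulse, kept_pos in merged
        )
        if not duplicate:
            merged.append((pulse, pos))
    return [pos for _, pos in merged]
```

For example, a target near 200 m lies in the overlap O1 of the first and second pulses and is seen by both, but is reported as a single position.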
When the emergency determination module 23 determines that the detection area in which the emergency was detected is not an overlapping area, it generates emergency information about the emergency (S25).
The emergency determination module 23 can determine which of a reverse-run vehicle, a stationary vehicle, a pedestrian, and a falling object the emergency is based on the data on the position, speed, and size of the target object collected by the collected information analysis module 22.
The emergency determination module 23 analyzes the pattern of the speed data generated above. The pattern of the speed data is analyzed by distinguishing an increase or decrease in the speed and comparing the speed with a preset speed.
The emergency determination module 23 determines whether the speed is within the absolute value range of the preset stationary threshold value. If the speed is within this range, the size is compared with the preset vehicle threshold value. When the size is greater than the vehicle threshold value, the target object is determined to be a stationary vehicle. That is, when the speed is within the absolute value range of the stationary threshold value and the size is greater than the vehicle threshold value, the target object can be determined to be a stationary vehicle.
In addition, when the size is equal to or smaller than the vehicle threshold, the target object can be determined to be smaller than a vehicle, and is therefore either a pedestrian or a falling object. In the case of a pedestrian, the generated speed data shows a pattern of alternately repeating increases and decreases in speed: the speed increases when the pedestrian's arm swings forward and decreases when it swings backward, so as the arms keep swinging during walking, the number of alternating increases and decreases exceeds a critical number. This critical number can be set in advance by experiment or the like, and pedestrians and falling objects can be distinguished by it.
In addition, when the absolute value of the speed exceeds the stationary threshold, the target object can be determined to be traveling. The increase or decrease of the speed is then examined to determine whether the target object is traveling forward or in reverse: the speed of the target object increases and shows a positive value when the target travels forward toward the radar sensor unit, and decreases and shows a negative value when it travels in reverse away from the radar sensor unit. When the speed is determined to be decreasing, the size is compared with the vehicle threshold; when the speed is decreasing and the size is greater than the vehicle threshold, the target object can be determined to be a reverse-running vehicle.
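The classification logic above can be sketched as a small decision function. This is a minimal illustration, not the patented implementation: the threshold values, the function and field names, and the representation of the speed pattern as a list of successive speed changes are all assumptions.

```python
def classify_target(speed, size, speed_deltas,
                    stationary_threshold=1.0, vehicle_threshold=4.0,
                    critical_alternations=3):
    """Classify a tracked target from its speed, size, and speed pattern.

    speed_deltas is a list of successive speed changes; a pedestrian's
    arm swing makes its sign alternate more than critical_alternations
    times. All threshold values here are illustrative placeholders.
    """
    if abs(speed) <= stationary_threshold:
        if size > vehicle_threshold:
            return "stationary_vehicle"
        # Small, near-stationary target: count sign changes in the
        # speed pattern to separate pedestrians from falling objects.
        alternations = sum(
            1 for a, b in zip(speed_deltas, speed_deltas[1:]) if a * b < 0)
        return ("pedestrian" if alternations > critical_alternations
                else "falling_object")
    # Moving target: a negative (decreasing) speed means travel away
    # from the radar sensor unit, i.e. the reverse direction.
    if speed < 0 and size > vehicle_threshold:
        return "reverse_vehicle"
    return "forward_vehicle"
```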
The emergency determination module 23 generates emergency information when determining that the target object is any one of a stationary vehicle, a pedestrian, a falling object, and a reverse-traveling vehicle.
The control unit 20 transmits the generated emergency information through wired or wireless communication to a terminal of a driver or a pedestrian, a communicable electronic device installed in a vehicle, a traffic situation control center for managing road conditions, and the like (S26).
The control unit 20 can capture the emergency with the tracking unit 50 and transmit the captured images together with the emergency information.
In addition, when the emergency determination module 23 determines that the emergency is detected from the overlap region, it determines whether or not the emergency is detected from both the detection regions of the two radar pulses (S27).
Fig. 7 shows a case where the emergency I1 is detected in both detection areas of the first and second radar pulses. As shown in fig. 7, when the control unit determines that the emergency I1 is detected in both detection regions of the first and second radar pulses, it generates emergency information regarding the emergency (S25).
The control unit 20 transmits the generated emergency information by wired or wireless communication to a terminal of a driver or a pedestrian, a communicable electronic device provided in a vehicle, a traffic condition control center for managing road conditions, a display unit such as a variable electro-optic panel or a delineator provided on the road, or the like.
In addition, when the emergency I2 is detected only in the detection region of the second radar pulse P2 and not in the detection region of the first radar pulse P1, the emergency determination module 23 applies predetermined weight values (S28).
Fig. 8 is a diagram showing another example of detecting an emergency from a detection area with respect to multiple radar pulses according to an embodiment of the present invention.
Fig. 8 shows a case where the emergency I2 is not detected in both detection areas of the first and second radar pulses, but only in the detection area of the second radar pulse P2.
The emergency determination module 23 determines the emergency I2 according to whether it is detected in the detection area of the radar pulse having the higher weight value among the first and second radar pulses P1 and P2 (S29).
The weight values are set in advance based on the accuracy of the plurality of radar pulses and stored in a database or the like. Since the signal is weak at the far edge of a radar pulse's range, detection accuracy is higher in the portion where a detection region starts than in the portion where it ends. Therefore, in the present embodiment, of the two detection regions forming an overlap area, the radar pulse whose detection region starts there is given the higher weight value. That is, the overlap area O1 contains the portion where the detection area of the second radar pulse P2 starts, so the weight value of the second radar pulse P2 is set higher than that of the first radar pulse P1, and the emergency is judged according to the detection result from the detection area of the second radar pulse P2. However, the present invention is not limited to this, and the weight values may be set in various other ways determined by experiment.
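As an illustration only (the pulse identifiers and weight values below are assumptions, not values from the patent), resolving a detection in an overlap area by weight might look like:

```python
def resolve_overlap_detection(detected_in, weights):
    """Decide whether an event seen in an overlap area counts as detected.

    detected_in maps each pulse id to whether its detection region saw
    the event; weights maps each pulse id to its preset weight. If both
    pulses saw the event it is confirmed; if only one did, it is trusted
    only when it is the higher-weight pulse (the one whose detection
    region starts inside the overlap area).
    """
    hits = [p for p, seen in detected_in.items() if seen]
    if len(hits) == len(detected_in):      # detected by both pulses
        return True
    if not hits:                           # detected by neither
        return False
    highest = max(weights, key=weights.get)
    return highest in hits
```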
When the emergency determination module 23 determines that the emergency I2 is detected in the detection area of the second radar pulse, it generates emergency information on the emergency (S25).
The emergency detection system of the embodiment of the invention transmits a plurality of radar pulses in one transmission period and sets their pulse widths according to the line shape of the road, and therefore can accurately detect an emergency regardless of the line shape of the road. Further, by overlapping part of the detection regions of two consecutively transmitted radar pulses, the detection error that occurs at the outermost edge where a radar pulse's detection region ends is minimized, so that accuracy can be improved.
Fig. 9 is a flowchart illustrating another example of an emergency situation detection method using multiple radar pulses according to an embodiment of the present invention.
Referring to fig. 9, a camera image is tracked to determine an emergency when the emergency is sensed in an overlap area but only in one of the detection areas of the two radar pulses constituting that overlap area.
First, the radar interface 21 sets a radar pulse transmitted by the radar device as a multi-radar pulse (S31).
The radar interface 21 is set so that the radar apparatus transmits n radar pulses in one transmission cycle, and the widths W1, W2, W3, ..., Wn of the radar pulses are set in accordance with the line shape of the road.
When the number and widths of the radar pulses are set by the above method, the radar apparatus transmits wireless signals for the n radar pulses to the road in accordance with the set widths, and receives a plurality of reflected signals reflected from the detection areas of the wireless signals of the n radar pulses on the road (S32).
The emergency determination module 23 detects the presence or absence of an emergency from the received plurality of reflected signals (S33).
When the emergency is detected, the emergency determination module 23 determines whether or not the detection area in which the emergency is detected is the overlapping area (S34).
When the emergency determination module 23 determines that the detection region in which the emergency is detected is not the overlap region, it generates emergency information on the emergency (S35).
The emergency information includes the distance, speed, and emergency type of the object causing the emergency. The emergency types include a stationary vehicle, a slow-moving vehicle, a speeding vehicle, a reverse-running vehicle, a falling object, a pedestrian, vehicle congestion, stagnation, and the like.
The emergency determination module 23 can determine whether the emergency involves a reverse-running vehicle, a stationary vehicle, a pedestrian, or a falling object based on the data on the position, speed, and size of the target object collected by the collected information analysis module 22.
The emergency determination module 23 analyzes the generated pattern of the speed data. The pattern of the speed data is analyzed by distinguishing an increase or decrease in the speed and comparing the speed with a preset speed.
The emergency determination module 23 determines whether the absolute value of the speed is within a preset stationary threshold. If so, the size of the target object is compared with a preset vehicle threshold, and when the size is greater than the vehicle threshold, the target object is determined to be a stationary vehicle. That is, when the absolute value of the speed is within the stationary threshold and the size is greater than the vehicle threshold, the target object can be determined to be a stationary vehicle.
In addition, when the size is equal to or smaller than the vehicle threshold, the target object is determined to be smaller than a vehicle, and is therefore either a pedestrian or a falling object. When separation of the wireless signal for a target object is sensed while the target is being tracked, the separated objects can each be given an auxiliary identification ID, and whether a falling object has dropped from the target can be determined by comparing the distance, size, and speed variation between the separated objects. That is, one of the separated objects can be determined to be a vehicle and the other a falling object, so the occurrence of a falling object can be determined. In addition, a continuously moving target whose speed is above the stationary threshold and whose size is below the vehicle threshold can be determined to be a pedestrian.
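The separation check described here can be sketched as follows; the fragment fields, the auxiliary-ID format, and the rule that the fragment which comes to rest is the falling object are illustrative assumptions rather than details from the patent:

```python
def detect_separation(fragments, stationary_threshold=1.0):
    """Interpret a tracked target whose radar return has split in two.

    fragments is a list of dicts, each with an auxiliary 'aux_id' and a
    measured 'speed'. The fragment that keeps moving is taken to be the
    vehicle; the one that comes to rest is taken to be the falling
    object. Returns None when the split cannot be interpreted this way.
    """
    moving = [f for f in fragments if abs(f["speed"]) > stationary_threshold]
    stopped = [f for f in fragments if abs(f["speed"]) <= stationary_threshold]
    if moving and stopped:
        return {"vehicle": moving[0]["aux_id"],
                "falling_object": stopped[0]["aux_id"]}
    return None
```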
In addition, when the absolute value of the speed exceeds the stationary threshold, the target object can be determined to be traveling. The increase or decrease of the speed is then examined to determine whether the target object is traveling forward or in reverse: the speed of the target object increases and shows a positive value when the target travels forward toward the radar sensor unit, and decreases and shows a negative value when it travels in reverse away from the radar sensor unit. When the speed is determined to be decreasing, the size is compared with the vehicle threshold; when the speed is decreasing and the size is greater than the vehicle threshold, the target object can be determined to be a reverse-running vehicle.
The emergency determination module 23 generates emergency information when determining that the target object is any one of a stationary vehicle, a pedestrian, a falling object, and a reverse-traveling vehicle.
The control unit 20 transmits the generated emergency information through the external terminal interface 30 to a terminal of a driver or a pedestrian, a communicable electronic device installed in a vehicle, a traffic condition control center for managing road conditions, a display unit such as a variable electro-optic panel or a delineator installed on the road, and the like (S36).
The control unit 20 captures the emergency using a camera as the tracking unit 50 and transmits the captured images together with the emergency information.
In addition, when the emergency is detected in the overlap region, the emergency determination module 23 determines whether an emergency is detected from both the detection regions of the two radar pulses (S37).
When the emergency determination module 23 determines that the emergency is detected in both detection areas of the two radar pulses, it generates emergency information on the emergency (S35).
The control unit 20 transmits the generated emergency information through wired or wireless communication to a terminal of a driver or a pedestrian, a communicable electronic device provided in a vehicle, a traffic situation control center for managing road conditions, and the like (S36).
In addition, when the emergency is not detected in both detection regions of the two radar pulses but only in the detection region of one radar pulse, the emergency determination module 23 tracks the video captured by the camera through the tracking unit control module 28 and the tracking unit interface 29 (S38).
Therefore, the control unit 20 checks the video captured by the camera to finally determine whether there is an emergency (S39).
The emergency determination module 23 can examine the image captured by the camera and judge the movement, size, and the like of the object in the image to detect the emergency or determine its type. However, the present invention is not limited to this; a person such as an administrator may directly review the image captured by the camera and confirm the emergency or its type.
When the emergency determination module 23 determines that the emergency exists, it generates the emergency information about the emergency (S35).
The control unit 20 transmits the generated emergency information through wired or wireless communication to a terminal of a driver or a pedestrian, a communicable electronic device provided in a vehicle, a traffic condition control center for managing road conditions, a display means such as a variable electro-optic panel or a delineator provided on the road, and the like (S36).
As described above, when the emergency is not detected in both detection regions of the two radar pulses but only in the detection region of one radar pulse, the tracking unit control module 28 and the tracking unit interface 29 track the image captured by the camera, thereby minimizing the error of the single-region detection and enabling more accurate detection.
Fig. 10 is a diagram showing another example of the transmission state of multiple radar pulses according to the embodiment of the present invention.
Referring to fig. 10, the radar sensor unit 10 is set to transmit a plurality of radar pulses in one transmission cycle, with the outputs of the radar pulses differing from one another.
The radar sensor unit 10 transmits n radar pulses P in one transmission cycle, at predetermined time intervals within that cycle. The number of radar pulses is set according to the length of the road on which an emergency is to be detected and the outputs O1, O2, O3, ..., On of the radar pulses.
The radar interface 21 sets the outputs O1, O2, O3, ..., On of the n radar pulses P differently from one another. The outputs are set so that the output of the transmitted radar pulses gradually increases with the passage of time within one transmission period. That is, the n radar pulses sequentially transmit wireless signals from the time point S at which the transmission period T1 starts, with the output gradually increasing over time. However, the outputs may instead gradually increase and then decrease over the transmission period T1, and this rise-and-fall pattern may be repeated several times. Here, the case where the widths W1, W2, W3, ..., Wn of the radar pulses are at a predetermined level is described as an example, but the widths may also be set differently.
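A minimal sketch of such an increasing output schedule (the linear spacing and the parameter names are assumptions; the text only requires the outputs to increase over the period):

```python
def pulse_output_schedule(n, o_min, o_max):
    """Return n monotonically increasing output levels for one
    transmission period, linearly spaced from o_min to o_max.
    Requires n >= 2."""
    step = (o_max - o_min) / (n - 1)
    return [o_min + i * step for i in range(n)]
```

A rise-and-fall schedule repeated several times, which the text also allows, could be built by concatenating such ramps with their reverses.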
Here, the output of a radar pulse is proportional to the strength of its radio signal, so the larger the output, the farther the radio signal can be transmitted. At the time point when the next transmission period T2 starts after transmission period T1 ends, the output of the radar pulse returns to the initial output of the starting time point S of period T1.
Generally, in the case where radar pulses are transmitted with the same output, the intensity of a reflected signal reflected by the target object is inversely proportional to the distance to the target object. Therefore, the intensity of the reflected signal is weak when the distance is long, and accurate detection is difficult. And, the smaller the size of the target object, the smaller the intensity of the reflected signal reflected by the target object. In contrast, according to the present embodiment, since the radar pulses having different outputs are transmitted in one transmission cycle, the target object can be detected more accurately without being affected by the distance between the radar sensor section 10 and the target object and the size of the target object.
For example, when the distance between the radar sensor unit 10 and the target object is too long and the size of the target object is very small when the output of the radar pulse is at a predetermined level, the intensity of the reflected signal reflected from the target object is weaker than a predetermined minimum threshold value, and therefore, the target object may not be sensed because the reflected signal is not received.
On the other hand, when the output of the radar pulse is set differently, if the distance between the radar sensor unit 10 and the target object is too long and the size of the target object is very small, the intensity of the reflected signal with respect to the relatively large output of the radar pulse is greater than the minimum threshold value, and therefore, the reflected signal can be received, and the target object can be sensed. That is, even when the distance between the radar sensor section 10 and the target object is too long and the size of the target object is very small, the target object can be detected from a relatively large output among the outputs of the radar pulses.
When the output of the radar pulse is at a predetermined level, if the distance between the radar sensor unit 10 and the target object is too short and the size of the target object is very large, the intensity of the reflected signal reflected from the target object is stronger than a predetermined maximum threshold. The reflected signal may then fail to be recognized, or the target object may fail to be sensed due to erroneous recognition, which can cause the system to malfunction.
On the other hand, when the output of the radar pulse is set differently, if the distance between the radar sensor unit 10 and the target object is too close and the size of the target object is very large, the intensity of the reflected signal of the output of the radar pulse, which is relatively small, is smaller than the maximum threshold value, and therefore the reflected signal can be received, and the target object can be sensed. That is, even when the distance between the radar sensor section 10 and the target object is too close and the size of the target object is very large, the target object can be detected from a relatively small output among the outputs of the radar pulses.
Therefore, when radar pulses of different outputs are transmitted in one transmission cycle, the target object can be sensed more accurately without being affected by the distance between the radar sensor unit 10 and the target object or by the size of the target object.
Fig. 11 is a flowchart illustrating a processing method of detecting target data according to an embodiment of the present invention.
Referring to fig. 11, the radar sensor unit 10 transmits a signal toward the target object and receives the wireless signal (hereinafter, 'signal') reflected from the vehicle (S41).
The collected information analysis module 22 calculates detection target data by Constant false alarm rate detection (CFAR detection) based on the signal received from the radar sensor unit 10.
A plurality of detection target data are calculated for one target object, each including a velocity index (Doppler index), a distance index (range index), a velocity value, a distance value, and the like. If the plurality of detection target data were processed directly, one target object could be displayed as several target objects, so a target clustering process that groups data belonging to the same target is required.
For the target clustering, the plurality of detected target data are first displayed in a grid in a cluster map (S42).
The cluster map is a map set to represent the speed index and the distance index by X-Y axis coordinates and displayed in a grid. According to this embodiment, the description will be given taking as an example a case where the horizontal axis (X axis) in the cluster map is a distance grid representing a distance index and the vertical axis (Y axis) is a velocity grid representing a velocity index.
Fig. 12 shows a state in which the cluster map shows the plurality of detected target data, and target clustering is performed.
A bin corresponding to a speed-index and distance-index value is marked "1", and a bin with no corresponding value is marked "0".
After the plurality of detection target data are displayed in the cluster map, a target clustering process is performed in which the same target number is assigned to cells whose values are continuous in the vertical and horizontal directions, classifying the cells by target object (S43).
For example, the same target number is assigned to a cell having a cell value difference of 1 or less among the plurality of cells.
Referring to fig. 12, the present embodiment shows an example of clustering five target objects (1 to 5) in total.
By the method, the target clustering process is performed on all the detected target data, and the detected target data can be grouped according to grids endowed with the same target number and classified according to the same target object.
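The clustering step (S42-S43) amounts to labeling connected occupied cells in the grid. The sketch below assumes 4-connected (vertical and horizontal) continuity over a 0/1 occupancy grid, as in fig. 12; it is an illustrative flood fill, not the patented implementation:

```python
def cluster_targets(grid):
    """Assign the same target number to 4-connected occupied cells of a
    2-D 0/1 cluster map. Returns a map of the same shape holding 0 for
    empty cells and target numbers 1..k for occupied cells."""
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    target = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not labels[r][c]:
                target += 1                      # new target number
                stack = [(r, c)]
                while stack:                     # flood fill
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and grid[y][x] and not labels[y][x]):
                        labels[y][x] = target
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return labels
```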
In addition, since target clustering as described above leaves a plurality of detection target data for one target, a process of extracting one representative target data from the plurality of detection target data for one target object is carried out (S44).
FIG. 13 illustrates a state of extracting representative target data from a cluster map according to an embodiment of the present invention.
Referring to fig. 13, the representative target data extraction process extracts, as representative target data B, the value of the bin having the greatest signal intensity among the bins having the same target number (1-5).
The representative target data B is extracted as follows.
Since there are a plurality of cells having the same target number (1-5), the cell located at the center of those cells is selected as the comparison center. For example, there are eight cells with the first target number 1, seven cells with the second target number 2, and eight cells with the third target number 3.
The comparison center is selected by finding the center position of the range of distance indexes occupied by the cells having the same target number, and likewise the center position of the range of speed indexes.
A method of selecting the comparison center from the lattice having the first object number 1, for example, is explained below.
The distance indexes occupied by the eight bins having the first target number 1 range from r3 to r6, so the center of the r3-r6 range corresponds to the line between r4 and r5. Likewise, the speed indexes occupied by those bins range from d2 to d4, so the center of the d2-d4 range corresponds to d3. Therefore, the line between r4 and r5, together with d3, is selected as the comparison center for the bins having the first target number 1.
After the comparison center is selected, the cells adjacent to the comparison center are selected as comparison object cells a for marking signal intensity.
The cells immediately before and after those adjacent to the comparison center are also selected as comparison target cells A. That is, the cell r4 before and the cell r5 after the line between r4 and r5 are included in the comparison target cells A, and the cells from d2 (the cell before d3) to d4 (the cell after d3) are also included.
Accordingly, five cells corresponding to (r4, d2), (r5, d2), (r4, d3), (r5, d3), and (r5, d4) among the eight cells having the first object number 1 can be selected as the comparison target cell a. That is, by selecting the comparison target cell for comparing the signal strengths, the number of cells for which the signal strengths need to be compared is reduced, and thus the data processing time can be reduced.
After the comparison target cells A are selected, the value of the cell with the maximum signal intensity among the comparison target cells A is extracted as the representative target data B.
S1 to S5 in fig. 13 indicate the intensity of the signal. The value of the maximum intensity of the signal in the comparison object cell a given the first target number 1 is S4, and thus a cell having a signal intensity of S4 is extracted as the representative target data B.
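Following the example of fig. 13, the extraction can be sketched as below. The representation of bins as (range index, speed index) keys with numeric signal strengths, and all data values, are illustrative assumptions:

```python
def representative_target(cells):
    """Pick representative target data for one clustered target.

    cells maps (range_index, speed_index) -> signal strength for every
    bin sharing one target number. The comparison centre is the middle
    of the occupied range/speed index spans (it may fall between two
    grid lines); only bins within one index of that centre are compared,
    and the strongest one is returned as the representative bin.
    """
    rs = [r for r, _ in cells]
    ds = [d for _, d in cells]
    rc = (min(rs) + max(rs)) / 2.0   # centre of the range-index span
    dc = (min(ds) + max(ds)) / 2.0   # centre of the speed-index span
    candidates = {k: v for k, v in cells.items()
                  if abs(k[0] - rc) <= 1 and abs(k[1] - dc) <= 1}
    return max(candidates, key=candidates.get)
```

Comparing only the cells near the centre, rather than every bin of the target, is what reduces the data-processing time described in the text.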
Therefore, the emergency determination module can determine an emergency using the representative target data B.
In the present invention, the signal intensities of all the detection target data are not compared to find the maximum value, but the comparison target cell is found from the detection target data, and the signal intensities are compared only for the comparison target cell, so that the data processing time can be reduced.
In addition, fig. 14 is a schematic diagram of a method for detecting a fixed obstacle in a radar-based high-precision emergency detection method according to another embodiment of the present invention.
Fig. 14 shows a method of detecting a fixed obstacle. Here, a fixed obstacle includes a stationary vehicle and a falling object. Since a fixed obstacle shows almost no speed change and no speed value can be obtained from the wireless signal, it is determined by comparing a reference signal, described below, with the received signal.
First, the radar sensor section 10 transmits a wireless signal in real time and receives a wireless signal reflected from a target object on a road (S41).
The emergency determination module 23 compares the received signal with a preset reference signal (S42).
The reference signal is set as the average value of the signals received by the radar sensor unit 10 over a predetermined time, while the received signal is the signal received in real time.
The emergency determination module 23 determines that a fixed obstacle exists on the road when the difference between the received signal and the reference signal exceeds a preset threshold value (S43).
For example, when the level of the reference signal is 3 and the level of the received signal is 5, the difference between the levels of the reference signal and the received signal is 2, and it can be determined that the threshold value is exceeded. The presence of the fixed obstacle can be determined when the difference between the level of the reference signal and the level of the received signal exceeds the threshold value.
If the fixed obstacle is identified, the size of the fixed obstacle is compared with a preset vehicle threshold value (S44).
And determining that the fixed obstacle is a stationary vehicle when the size of the fixed obstacle exceeds the vehicle threshold value (S45).
When the size of the fixed obstacle is equal to or smaller than the vehicle threshold value, it is determined that the fixed obstacle is a falling object (S46).
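The fixed-obstacle flow above can be sketched as follows; the level and size threshold values are illustrative placeholders (the text only states they are preset), and the obstacle size is assumed to be supplied by an upstream estimator:

```python
def detect_fixed_obstacle(received_level, reference_level, obstacle_size,
                          level_threshold=1.5, vehicle_threshold=4.0):
    """Detect and classify a fixed obstacle that yields no speed value.

    Compares the real-time received signal level with the reference
    level (an average of past received signals). Returns None when no
    fixed obstacle is present, otherwise 'stationary_vehicle' or
    'falling_object' depending on the obstacle size.
    """
    if abs(received_level - reference_level) <= level_threshold:
        return None                      # no fixed obstacle detected
    return ("stationary_vehicle" if obstacle_size > vehicle_threshold
            else "falling_object")
```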
By this method, even a fixed obstacle for which no speed value can be sensed can be identified by comparing the preset reference signal with the received signal.
The invention has been described above with reference to the embodiments shown in the drawings, which are intended to be exemplary only, and it will be understood by those skilled in the art that various changes may be made therein and equivalents may be substituted therefor. Therefore, the true protection scope of the present invention is subject to the content of the technical solution.

Claims (14)

1. A radar-based high-precision emergency detection system, comprising:
a radar sensor unit that is provided around a road, transmits a wireless signal to the road, and receives a wireless signal reflected from a target object on the road;
a control section that receives a wireless signal from the radar sensor section, and generates sensing information about an emergency from the received wireless signal;
a communication unit for transmitting the sensing information generated by the control unit through wired or wireless communication to at least one external terminal among a terminal of a driver or a pedestrian, a terminal installed in a vehicle, and a traffic condition control center for managing road conditions; and
a tracking unit that tracks the target object based on the sensing information generated by the control unit,
the control section includes:
a radar interface that receives a signal from the radar sensor unit and transmits a control command to the radar sensor unit;
a collected information analysis module that converts a signal received from the radar sensor section into data or receives data converted at the radar sensor section, classifies the data, and generates sensed information;
an emergency determination module that determines an emergency or tracking of the target object or extracts traffic information according to the sensing information generated by the collected information analysis module;
an external terminal interface for transmitting and receiving signals to and from the external terminal;
a storage module that stores the sensed information generated by the collected information analysis module or the information processed by the emergency determination module; and
a tracking section control module that controls so as to track the target object based on the information judged by the emergency judgment module,
the radar interface is set so that the radar sensor section transmits a plurality of radar pulses of a radio signal in one transmission cycle,
the difference between the widths of the plurality of radar pulses is set so as to be proportional to the length of a straight section included in the road.
2. A radar-based high precision emergency detection system according to claim 1, wherein:
the collected information analysis module assigns a unique identification ID to the target object based on the wireless signal received by the radar sensor section, and generates data on the position, speed, and magnitude of the identification ID.
3. The radar-based high-precision emergency detection system according to claim 2, wherein:
the emergency determination module analyzes the position, speed, and size data generated by the collected information analysis module,
determines that the target object is a stationary vehicle when the speed is within the absolute-value range of a preset stationary threshold and the size is greater than a preset vehicle threshold,
and determines that the target object is a pedestrian when the speed is within the absolute-value range of the preset stationary threshold and the size is less than or equal to the vehicle threshold.
4. The radar-based high-precision emergency detection system according to claim 3, wherein:
the emergency determination module determines that the target object is a wrong-way vehicle when the speed exceeds the absolute-value range of the stationary threshold, the speed is negative, and the size is greater than the preset vehicle threshold.
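The classification rules of claims 3 and 4 can be sketched as a single decision function. This is an illustrative reading of the claims; the concrete threshold values, units, and names are assumptions, not from the patent.

```python
STATIONARY_THRESHOLD = 0.5    # m/s; |speed| at or below this counts as stationary (assumed)
VEHICLE_SIZE_THRESHOLD = 2.0  # m; sizes above this count as vehicle-sized (assumed)

def classify(speed: float, size: float) -> str:
    """Classify a tracked target from its signed speed (m/s) and size (m)."""
    if abs(speed) <= STATIONARY_THRESHOLD:
        # Stationary target: vehicle-sized -> stopped vehicle, else pedestrian (claim 3)
        return "stationary_vehicle" if size > VEHICLE_SIZE_THRESHOLD else "pedestrian"
    if speed < 0 and size > VEHICLE_SIZE_THRESHOLD:
        # Moving against the traffic direction and vehicle-sized (claim 4)
        return "wrong_way_vehicle"
    return "normal"
```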
5. The radar-based high-precision emergency detection system according to claim 1, wherein:
the radar interface is configured so that the widths of the radar pulses transmitted in the transmission cycle increase with the passage of time.
6. The radar-based high-precision emergency detection system according to claim 1, wherein:
the radar interface sets the widths of the radar pulses such that the detection areas of the radio signals of two of the plurality of radar pulses at least partially overlap to form an overlap area.
7. The radar-based high-precision emergency detection system according to claim 6, wherein:
when an emergency is detected, the emergency determination module determines whether the detection area in which the emergency was detected is the overlap area,
when it is determined to be the overlap area, determines whether the emergency is detected in both detection areas of the radio signals of the two radar pulses forming the overlap area,
generates emergency information about the emergency when the emergency is detected in both of those detection areas,
and, when the emergency is detected in only one of the two detection areas, determines whether the emergency was detected in the detection area of the radar pulse that has the higher preset weighting value of the two.
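The overlap-area decision rule of claim 7 can be sketched as follows. The two booleans indicate whether each pulse's detection area reported the emergency, and the weights stand in for the preset per-pulse weighting values; all names are illustrative assumptions.

```python
def confirm_in_overlap(detected_a: bool, detected_b: bool,
                       weight_a: float, weight_b: float) -> bool:
    """Decide whether an emergency seen in an overlap area is confirmed."""
    if detected_a and detected_b:
        # Both pulses agree: confirm and generate emergency information.
        return True
    if detected_a != detected_b:
        # Only one pulse reports it: trust the pulse with the higher
        # preset weighting value.
        return detected_a if weight_a > weight_b else detected_b
    # Neither pulse reports the emergency.
    return False
```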
8. The radar-based high-precision emergency detection system according to claim 7, wherein:
the tracking unit includes a camera that is installed on the road and captures real-time images of the road from above,
and when the emergency determination module determines that the emergency is detected in only one of the detection areas of the radio signals of the two radar pulses forming the overlap area, it checks the camera image and determines from that image whether the emergency exists.
9. The radar-based high-precision emergency detection system according to claim 1, wherein:
the radar interface is configured so that the radar sensor unit transmits a plurality of radar pulses of a radio signal in one transmission cycle,
and the output power of the radar pulses transmitted in the one transmission cycle varies with the passage of time.
10. The radar-based high-precision emergency detection system according to claim 1, wherein:
the collected information analysis module computes, from the radio signal received from the radar sensor unit, detection target data including a speed index and a distance index using constant false-alarm rate (CFAR) detection,
displays the plurality of detection target data as cells in a grid on a cluster map whose X-Y axes are the distance index and the speed index,
assigns the same target number to cells whose values are contiguous in the vertical and horizontal directions,
and thereby displays the plurality of detection target data in the cluster map clustered by target object.
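The clustering step of claim 10 amounts to 4-connected component labeling on the distance-index by speed-index grid: occupied cells that touch vertically or horizontally receive the same target number. The CFAR front end is omitted here; the input is assumed to already be the set of occupied `(distance_idx, speed_idx)` cells, and all names are illustrative.

```python
def label_clusters(cells: set[tuple[int, int]]) -> dict[tuple[int, int], int]:
    """Assign the same target number to 4-connected occupied cells."""
    labels: dict[tuple[int, int], int] = {}
    next_id = 0
    for cell in sorted(cells):
        if cell in labels:
            continue
        # Flood-fill one connected component with the next target number.
        stack = [cell]
        while stack:
            r, c = stack.pop()
            if (r, c) in labels or (r, c) not in cells:
                continue
            labels[(r, c)] = next_id
            stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        next_id += 1
    return labels
```

Diagonal contact does not join clusters, matching the claim's "vertical and horizontal" contiguity.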
11. The radar-based high-precision emergency detection system according to claim 10, wherein:
the collected information analysis module extracts, as representative target data, the data of the cell having the greatest signal strength among the cells having the same target number,
and the emergency determination module determines the emergency from the representative target data.
12. The radar-based high-precision emergency detection system according to claim 11, wherein:
the collected information analysis module selects, as a comparison center, the cell corresponding to the center of the plurality of cells having the same target number,
selects, as comparison target cells for comparing signal strengths, at least some of the cells located above, below, to the left of, and to the right of the comparison center,
and extracts, as the representative target data, the data of the cell having the greatest signal strength among the comparison target cells.
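One possible reading of the representative-data selection in claims 11 and 12: take the cluster cell nearest the cluster's geometric center, compare it with its up/down/left/right neighbors that belong to the same cluster, and keep the cell with the greatest signal strength. Including the center among the candidates is an interpretation; `strength` (a map from cell to signal strength) and all names are assumptions for illustration.

```python
def representative_cell(cluster: set[tuple[int, int]],
                        strength: dict[tuple[int, int], float]) -> tuple[int, int]:
    """Pick the representative cell of one cluster by peak signal strength."""
    # Comparison center: the cell nearest the cluster's geometric center.
    cr = sum(r for r, _ in cluster) / len(cluster)
    cc = sum(c for _, c in cluster) / len(cluster)
    center = min(cluster, key=lambda rc: (rc[0] - cr) ** 2 + (rc[1] - cc) ** 2)
    r, c = center
    # Comparison targets: the center plus its 4-neighbors inside the cluster.
    candidates = [center] + [
        n for n in [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        if n in cluster
    ]
    return max(candidates, key=lambda rc: strength[rc])
```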
13. The radar-based high-precision emergency detection system according to claim 2, wherein:
the emergency determination module compares the received signal obtained in real time by the radar sensor unit with a preset reference signal,
and determines that a fixed obstacle exists when the difference between the received signal and the reference signal exceeds a preset threshold.
14. The radar-based high-precision emergency detection system according to claim 13, wherein:
when it determines that a fixed obstacle exists, the emergency determination module compares the size of the fixed obstacle with a preset vehicle threshold,
determines that the fixed obstacle is a stationary vehicle when its size exceeds the vehicle threshold,
and determines that the fixed obstacle is a fallen object when its size is less than or equal to the vehicle threshold.
CN201710593264.6A 2016-12-09 2017-07-19 High-precision emergency detection system based on radar Active CN108226917B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160167709A KR101752858B1 (en) 2016-12-09 2016-12-09 Radar-based high-precision unexpected incident detecting system
KR10-2016-0167709 2016-12-09

Publications (2)

Publication Number Publication Date
CN108226917A CN108226917A (en) 2018-06-29
CN108226917B true CN108226917B (en) 2021-09-28

Family

ID=59427484

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710593264.6A Active CN108226917B (en) 2016-12-09 2017-07-19 High-precision emergency detection system based on radar

Country Status (3)

Country Link
KR (1) KR101752858B1 (en)
CN (1) CN108226917B (en)
WO (1) WO2018105842A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102438461B1 (en) * 2018-02-22 2022-09-01 렉스젠(주) Vehicle information display apparatus and traffic surveilliance system using the same
CN109884601B (en) * 2018-12-28 2023-05-05 中国航天科工集团八五一一研究所 Radar pulse rapid searching method based on equal-order jump technology
KR102151195B1 (en) * 2018-12-28 2020-10-26 주식회사 유라코퍼레이션 System and method for detecting passenger in vehicle using UWB
CN110045345B (en) * 2019-03-12 2023-01-20 广东工业大学 Adaptive analysis method for millimeter wave radar
CN110146865B (en) * 2019-05-31 2023-07-14 创新先进技术有限公司 Target identification method and device for radar image
CN110501684B (en) * 2019-08-23 2022-12-23 北京航天朗智科技有限公司 Radar data processing device and radar data processing method
CN111273268B (en) * 2020-01-19 2022-07-19 北京百度网讯科技有限公司 Automatic driving obstacle type identification method and device and electronic equipment
CN112346046B (en) * 2020-10-30 2022-09-06 合肥中科智驰科技有限公司 Single-target tracking method and system based on vehicle-mounted millimeter wave radar
KR102271138B1 (en) * 2021-04-22 2021-06-30 메타빌드(주) Traffic information collection accuracy improvement device using radar sensor and method of the same
KR102275280B1 (en) * 2021-04-22 2021-07-09 메타빌드(주) Multiple object tracking-based traffic information collection device using radar sensor and method of the same
KR102598250B1 (en) * 2023-07-11 2023-11-06 메타빌드주식회사 Target detection method using radar signal, target detection system and computer program for the same

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5115243A (en) * 1991-04-16 1992-05-19 General Electric Co. Radar system with active array antenna, beam multiplex control and pulse integration control responsive to azimuth angle
US7474257B2 (en) * 2004-11-08 2009-01-06 The United States Of America As Represented By The Secretary Of The Navy Multistatic adaptive pulse compression method and system
KR100902560B1 (en) * 2008-11-04 2009-06-11 국방과학연구소 Apparatus and method for generating warning alarm in a tracking-while-scanning radar
CN103593980A (en) * 2012-08-14 2014-02-19 业纳遥控设备有限公司 Method for classifying vehicles in motion
CN103852757A (en) * 2012-12-03 2014-06-11 株式会社电装 Target detecting device for avoiding collision between vehicle and target captured by sensor mounted to the vehicle
CN104054005A (en) * 2012-01-16 2014-09-17 丰田自动车株式会社 Object detection device
KR20140123435A (en) * 2013-04-12 2014-10-22 메타빌드주식회사 Apparatus, method and system for detecting objects using hand-over between antennas of radar device
CN104865568A (en) * 2015-06-02 2015-08-26 西安电子科技大学 Sparse reconstruction-based broadband radar high-speed group-target resolving method
CN105372659A (en) * 2015-11-20 2016-03-02 上海无线电设备研究所 Road traffic monitoring multi-target detection tracking method and tracking system
CN105893931A (en) * 2015-02-16 2016-08-24 松下知识产权经营株式会社 Object detection apparatus and method
KR101652798B1 (en) * 2015-04-27 2016-09-01 메타빌드주식회사 Unexpected incident detecting method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101104560B1 (en) * 2009-07-03 2012-01-11 주식회사 엠티아이 Apparatus and Method for detecting overspeed car
KR101343975B1 (en) * 2012-06-11 2013-12-20 휴앤에스(주) System for detecting unexpected accident


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Obstacle detection radar system for highway safety; Jung S. Jung et al.; 2011 3rd International Asia-Pacific Conference on Synthetic Aperture Radar; 2011-11-28; pp. 1-4 *
Research on moving-vehicle target classification based on LPC technique; Ding Shuaishuai et al.; Modern Defence Technology; April 2016; Vol. 44, No. 2; pp. 177-184 *

Also Published As

Publication number Publication date
CN108226917A (en) 2018-06-29
KR101752858B1 (en) 2017-07-19
WO2018105842A1 (en) 2018-06-14

Similar Documents

Publication Publication Date Title
CN108226917B (en) High-precision emergency detection system based on radar
US10081308B2 (en) Image-based vehicle detection and distance measuring method and apparatus
EP2118818B1 (en) Video-based road departure warning
JP5939357B2 (en) Moving track prediction apparatus and moving track prediction method
CN102765365B (en) Pedestrian detection method based on machine vision and pedestrian anti-collision warning system based on machine vision
CN105825185B (en) Vehicle collision avoidance method for early warning and device
EP3576008B1 (en) Image based lane marking classification
EP1817761B1 (en) Apparatus and method for automatically detecting objects
EP2803944A2 (en) Image Processing Apparatus, Distance Measurement Apparatus, Vehicle-Device Control System, Vehicle, and Image Processing Program
EP3224819B1 (en) Method of controlling a traffic surveillance system
KR101446546B1 (en) Display system of vehicle information based on the position
CN105620489A (en) Driving assistance system and real-time warning and prompting method for vehicle
KR102001002B1 (en) Method and system for recognzing license plate based on deep learning
CN102997900A (en) Vehicle systems, devices, and methods for recognizing external worlds
KR102031503B1 (en) Method and system for detecting multi-object
CN106327880B (en) A kind of speed recognition methods and its system based on monitor video
KR20170080480A (en) The vehicle detecting system by converging radar and image
JP2007257536A (en) Road traffic monitoring device by means of millimeter-wave radar
CN107273816A (en) Traffic speed limit label detection recognition methods based on vehicle-mounted forward sight monocular camera
TWI335886B (en) Methods and systems for identifying events for a vehicle
WO2014181386A1 (en) Vehicle assessment device
CN104123532A (en) Target object detection and target object quantity confirming method and device
Chachich et al. Traffic sensor using a color vision method
CN113111682A (en) Target object sensing method and device, sensing base station and sensing system
CN114648748A (en) Motor vehicle illegal parking intelligent identification method and system based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant