WO2022134387A1 - Vehicle retrograde detection method, apparatus, device, computer-readable storage medium and computer program product - Google Patents

Vehicle retrograde detection method, apparatus, device, computer-readable storage medium and computer program product

Info

Publication number: WO2022134387A1
Authority: WO (WIPO (PCT))
Prior art keywords: lane, trajectory, vehicle, information, target vehicle
Application number: PCT/CN2021/086694
Other languages: English (en), Chinese (zh)
Inventors: 谈正, 朱铖恺, 薛志强, 路少卿, 武伟
Original Assignee: 深圳市商汤科技有限公司
Application filed by 深圳市商汤科技有限公司
Publication of WO2022134387A1

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/056 - Detecting movement of traffic to be counted or controlled with provision for distinguishing direction of travel
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Definitions

  • the present disclosure relates to the technical field of vehicles, and in particular, to a vehicle retrograde detection method, apparatus, device, computer-readable storage medium, and computer program product.
  • Wrong-way driving is a very serious traffic violation.
  • Even a small number of wrong-way vehicles can cause serious traffic accidents, seriously endangering road safety and road traffic efficiency.
  • To achieve accurate vehicle retrograde detection, it is often necessary to find out by manual viewing whether retrograde behavior occurs in surveillance videos.
  • Manual real-time review of surveillance video is not only time-consuming and labor-intensive, but also has low detection efficiency and is prone to delays and omissions.
  • Embodiments of the present disclosure provide a vehicle retrograde detection method, apparatus, device, computer-readable storage medium, and computer program product.
  • An embodiment of the present disclosure provides a method for detecting wrong-way driving of a vehicle, including: determining trajectory information of a target vehicle in a video stream; acquiring lane information in the video stream through a preset lane segmentation model, where the lane information includes the driving area and driving direction corresponding to each lane in at least one lane; and determining whether the target vehicle has retrograde behavior according to the trajectory information and the driving area and driving direction of each lane.
  • The determining whether the target vehicle has retrograde behavior according to the trajectory information and the driving area and driving direction of each lane includes: determining the displacement direction of the target vehicle according to the position information of the target vehicle at multiple times in the trajectory information; determining the target lane where the target vehicle is located according to the position information of the target vehicle at multiple times and the driving area of each lane; and determining whether the target vehicle has retrograde behavior based on whether there is a deviation between the driving direction of the target lane and the displacement direction of the target vehicle.
  • The obtaining of the lane information in the video stream through the preset lane segmentation model includes: in response to a lane information acquisition condition, obtaining the lane information in the video stream through the lane segmentation model; the lane information acquisition condition includes at least one of the following: the image capture device rotating; the image capture device being initialized; wherein the image capture device is used to acquire image information in the current real scene and output the video stream.
  • A method for detecting rotation of the image capture device includes: extracting multiple image frames from the video stream, where the multiple image frames include at least a current image frame and a historical image frame adjacent to the current image frame; and determining whether the image acquisition device has rotated through the current image frame and the historical image frame.
  • The obtaining of lane information in the video stream by using the preset lane segmentation model includes: inputting an image frame in the video stream into the lane segmentation model to obtain a segmentation result of the image frame, where the segmentation result includes the driving area corresponding to each lane in the at least one lane; acquiring, according to a sample trajectory acquisition rule, the sample trajectories corresponding to multiple sample vehicles in the driving area corresponding to each lane; and determining the driving direction corresponding to each lane according to the driving area corresponding to each lane and the sample trajectories corresponding to the multiple sample vehicles in that driving area.
  • The sample trajectory includes a start position and an end position of the sample vehicle; determining the driving direction corresponding to each lane according to the sample trajectories includes: determining the trajectory directions corresponding to the multiple sample vehicles in the driving area corresponding to each lane according to their start positions and end positions; and determining the driving direction corresponding to each lane in the at least one lane according to the trajectory directions corresponding to the multiple sample vehicles in the driving area corresponding to that lane.
  • The sample trajectory acquisition rule includes at least one of the following: acquiring sample trajectories within a preset time interval; counting the number of acquired sample trajectories and, when the number of sample trajectories reaches a preset number, stopping the acquisition of sample trajectories.
  • The determining of the driving direction corresponding to each lane in the at least one lane according to the trajectory directions corresponding to the multiple sample vehicles in the driving area corresponding to each lane includes: normalizing each of the multiple trajectory directions to obtain a trajectory vector corresponding to each trajectory direction; and determining the vector sum of the multiple trajectory vectors corresponding to each lane as the driving direction corresponding to that lane.
  • The method further includes: determining a trajectory offset quantization value corresponding to each lane according to the driving direction corresponding to each lane and the multiple trajectory vectors corresponding to each lane, where the trajectory offset quantization value is used to characterize the degree of dispersion between the multiple trajectory vectors corresponding to the lane; and, when the trajectory offset quantization value exceeds a preset precision threshold, re-determining the lane information according to the video stream.
  • The determining of whether the target vehicle has retrograde behavior based on whether there is a deviation between the driving direction of the target lane and the displacement direction of the target vehicle includes: obtaining at least one included angle according to the included angle between at least one sub-displacement direction corresponding to the displacement direction of the target vehicle and the driving direction of the target lane, where the displacement direction includes at least one sub-displacement direction and each sub-displacement direction is determined according to the position information at two adjacent moments; determining that the target vehicle has retrograde behavior when each included angle is greater than a preset included angle threshold; and determining that the target vehicle does not have retrograde behavior when at least one included angle is less than or equal to the included angle threshold.
  • An embodiment of the present disclosure provides a vehicle reverse-travel detection apparatus, including: a first determining part, configured to determine the trajectory information of a target vehicle in a video stream; an acquiring part, configured to acquire lane information in the video stream through a preset lane segmentation model, where the lane information includes the driving area and driving direction corresponding to each lane in at least one lane; and a second determining part, configured to determine whether the target vehicle has retrograde behavior according to the trajectory information and the driving area and driving direction of each lane.
  • An embodiment of the present disclosure provides a vehicle reverse-travel detection device, comprising: a memory configured to store an executable computer program; and a processor configured to implement the above-mentioned vehicle reverse-travel detection method when executing the executable computer program stored in the memory.
  • An embodiment of the present disclosure provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the above-mentioned vehicle retrograde detection method is implemented.
  • An embodiment of the present disclosure provides a computer program, including computer-readable code; when the computer-readable code is run in an electronic device, a processor in the electronic device implements the above-mentioned vehicle retrograde detection method.
  • The vehicle reverse-travel detection method determines the trajectory information of the target vehicle in the video stream; obtains the lane information in the video stream through a preset lane segmentation model, where the lane information includes the driving area and driving direction corresponding to each lane in at least one lane; and determines, according to the trajectory information and the driving area and driving direction of each lane, whether the target vehicle has wrong-way behavior. According to the vehicle retrograde detection method provided by the embodiments of the present disclosure, since the trajectory information of the target vehicle is obtained directly from the video stream, the trajectory information is more accurate than in the traditional solution of manually judging the driving direction of the vehicle, and real-time performance is improved; after the trajectory information is obtained, it is combined with the lane information of the current lanes in the video stream to directly obtain the detection result of whether the target vehicle has retrograde behavior, which improves the detection efficiency of vehicle retrograde detection.
  • FIG. 1 is an optional schematic flowchart of a vehicle retrograde detection method provided by an embodiment of the present disclosure;
  • FIG. 2 is an optional schematic flowchart of a vehicle retrograde detection method provided by an embodiment of the present disclosure;
  • FIG. 3 is an optional schematic flowchart of a vehicle retrograde detection method provided by an embodiment of the present disclosure;
  • FIG. 4 is an optional schematic flowchart of a vehicle retrograde detection method provided by an embodiment of the present disclosure;
  • FIG. 5 is an optional schematic flowchart of a vehicle retrograde detection method provided by an embodiment of the present disclosure;
  • FIG. 6 is an optional schematic flowchart of a vehicle retrograde detection method provided by an embodiment of the present disclosure;
  • FIG. 7 is an optional schematic flowchart of a vehicle retrograde detection method provided by an embodiment of the present disclosure;
  • FIG. 8 is an optional schematic diagram of the vehicle retrograde detection system provided by an embodiment of the present disclosure;
  • FIG. 9 is a schematic diagram of optional vehicle detection and tracking provided by an embodiment of the present disclosure;
  • FIG. 10A is a schematic diagram of a lane before segmentation in a lane segmentation process according to an embodiment of the present disclosure;
  • FIG. 10B is a schematic diagram of a lane after segmentation in a lane segmentation process according to an embodiment of the present disclosure;
  • FIG. 11 is a schematic diagram of optional lane direction estimation provided by an embodiment of the present disclosure;
  • FIG. 12A is a schematic diagram of a retrograde-start image frame of a vehicle retrograde behavior according to an embodiment of the present disclosure;
  • FIG. 12B is a schematic diagram of an image frame during retrograde motion of a vehicle retrograde behavior according to an embodiment of the present disclosure;
  • FIG. 12C is a schematic diagram of a retrograde-end image frame of a vehicle retrograde behavior according to an embodiment of the present disclosure;
  • FIG. 12D is a schematic diagram of a vehicle detail image frame of a vehicle retrograde behavior according to an embodiment of the present disclosure;
  • FIG. 13 is a schematic diagram of the composition and structure of a vehicle retrograde detection device provided by an embodiment of the present disclosure;
  • FIG. 14 is a schematic diagram of a hardware entity of a device according to an embodiment of the present disclosure.
  • FIG. 1 is an optional schematic flow chart of a vehicle retrograde detection method provided by an embodiment of the present disclosure, which will be described in conjunction with the steps shown in FIG. 1 .
  • the embodiments of the present disclosure may acquire a real-time video stream of the current road through an image acquisition device, and identify a target vehicle in the video stream in real time through a target recognition technology. In the embodiment of the present disclosure, there may be one or more target vehicles identified in the real-time video stream.
  • the video stream is a large number of image frames arranged in time series.
  • Image frames of the current road can be obtained in real time at a frame rate corresponding to the hardware and configuration information of the image acquisition device.
  • The corresponding image frames to be detected can be extracted from the large number of image frames in the video stream according to a preset image extraction frequency, and the trajectory information of the target vehicle can be obtained from the image frames to be detected.
  • S101 (determining the trajectory information of the target vehicle in the video stream) may include: performing vehicle identification on the image frames (or the image frames to be detected) in the video stream and, when a vehicle is identified, taking that vehicle as the target vehicle.
  • The image frame in which the vehicle is first identified is used as the initial image frame corresponding to the target vehicle, and the relative position of the target vehicle in the initial image frame is obtained.
  • The relative position of the target vehicle in each image frame is then continuously acquired in chronological order until the target vehicle disappears from the image frames in the video stream, and the last image frame in which the target vehicle appears is taken as the end image frame.
  • By sequentially connecting the relative positions of the target vehicle in these image frames, the trajectory information of the target vehicle can be obtained, and the driving direction of the target vehicle relative to the image frame can be obtained from the trajectory information.
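As an illustration of the step above, the following minimal sketch (in Python; the per-frame detection source and the track identifier are hypothetical placeholders, not part of the disclosure) shows how per-frame relative positions could be connected into per-vehicle trajectories:

```python
# Minimal sketch of assembling trajectory information by connecting a vehicle's
# relative position across consecutive frames (detection step and track id assumed).
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def build_trajectories(frame_detections: List[Dict[int, Point]]) -> Dict[int, List[Point]]:
    """frame_detections[i] maps a (hypothetical) track id to the vehicle's
    relative position in frame i; returns per-vehicle trajectories in time order."""
    trajectories: Dict[int, List[Point]] = {}
    for detections in frame_detections:          # frames in chronological order
        for track_id, position in detections.items():
            trajectories.setdefault(track_id, []).append(position)
    return trajectories

# Example: one vehicle (id 7) moving down-right across three frames.
frames = [{7: (100.0, 50.0)}, {7: (110.0, 60.0)}, {7: (120.0, 70.0)}]
print(build_trajectories(frames))   # {7: [(100.0, 50.0), (110.0, 60.0), (120.0, 70.0)]}
```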
  • the trajectory information of the target vehicle in the video stream may also be acquired by an optical flow method.
  • The trajectory information can be extracted by the following steps: acquiring a video stream containing the target vehicle, extracting multiple video screenshots arranged in time series from the video stream, extracting the optical flow features of the target vehicle in each video screenshot through a preset optical flow model, and determining the trajectory information of the target vehicle according to the obtained optical flow features.
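The disclosure only names a "preset optical flow model"; as one possible realization, the sketch below uses sparse Lucas-Kanade optical flow from OpenCV as a stand-in, tracking feature points across time-ordered grayscale screenshots and averaging them into a rough trajectory. The frame source, point selection, and single-vehicle assumption are simplifications:

```python
# Hedged sketch: sparse Lucas-Kanade optical flow as a stand-in for the preset
# optical flow model; not the specific model used by the disclosure.
import cv2
import numpy as np

def optical_flow_trajectory(gray_frames):
    """gray_frames: list of uint8 grayscale frames (time-ordered video screenshots).
    Returns the mean tracked point position per frame as a rough vehicle trajectory."""
    if len(gray_frames) < 2:
        return []
    p0 = cv2.goodFeaturesToTrack(gray_frames[0], maxCorners=50,
                                 qualityLevel=0.3, minDistance=7)
    if p0 is None:
        return []
    trajectory = [p0.reshape(-1, 2).mean(axis=0)]          # mean of tracked points
    prev = gray_frames[0]
    for frame in gray_frames[1:]:
        p1, status, _ = cv2.calcOpticalFlowPyrLK(prev, frame, p0, None,
                                                 winSize=(21, 21), maxLevel=3)
        good = p1[status.flatten() == 1]
        if len(good) == 0:
            break                                           # track lost
        trajectory.append(good.reshape(-1, 2).mean(axis=0))
        p0, prev = good.reshape(-1, 1, 2), frame
    return [tuple(map(float, p)) for p in trajectory]
```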
  • the lane information includes a driving area and a driving direction corresponding to each of the at least one lane.
  • the lane information includes a driving area and a driving direction of at least one lane included in the current road in the video stream.
  • The video stream may include multiple lanes; the lane information includes the driving area and the driving direction corresponding to each lane, where the area corresponding to each lane is a relative area with respect to the image frames in the video stream, and the driving direction corresponding to each lane is likewise relative to the image frames in the video stream. Acquiring the lane information in the video stream can be achieved in the following ways:
  • The lane information corresponding to each image acquisition device is pre-stored in a database, and the corresponding lane information can be looked up in the database by acquiring the device identifier of the image acquisition device.
  • Alternatively, the relative position of each image capture device is pre-stored in the database; by obtaining the device identifier of the image capture device, the corresponding relative position can be looked up in the database, and the lane information around the image acquisition device in the real scene can be obtained from a map according to that relative position.
  • The driving direction of the target vehicle relative to the image frame is obtained through the trajectory information and compared with the driving direction of the target lane relative to the image frame. When the angle difference between the two driving directions is greater than a preset angle threshold, it is determined that the target vehicle has wrong-way behavior; when the angle difference between the driving direction of the target vehicle and the driving direction of the target lane is less than or equal to the preset angle threshold, it is determined that the target vehicle does not have wrong-way behavior.
  • The method further includes: when it is determined that the target vehicle has retrograde behavior, intercepting, according to the start image frame and the end image frame corresponding to the trajectory information of the target vehicle, the corresponding part of the video from the video stream and storing it as an archived video of the retrograde behavior corresponding to the target vehicle.
  • the license plate number of the target vehicle can also be obtained, and in the process of storing the archived video, the license plate number can be used as an index value of the archived video.
  • In this way, the embodiment of the present disclosure determines the trajectory information of the target vehicle in the video stream, obtains the lane information in the video stream through a preset lane segmentation model, and determines from this information whether the target vehicle has wrong-way behavior.
  • In the vehicle retrograde detection method, since the trajectory information of the target vehicle is obtained directly from the video stream, the trajectory information is more accurate than in the traditional approach of manually judging the driving direction of the vehicle, and real-time performance is improved; after the trajectory information is obtained, it is combined with the lane information of the current lanes in the video stream to directly obtain the detection result of whether the target vehicle has retrograde behavior, which improves the detection efficiency of vehicle retrograde detection.
  • FIG. 2 is an optional schematic flowchart of the vehicle retrograde detection method provided by an embodiment of the present disclosure.
  • S103 shown in FIG. 1 may include S201 to S204 , which will be described in conjunction with the steps shown in FIG. 2 .
  • S201 Determine the displacement direction of the target vehicle according to the position information of the target vehicle at multiple times in the trajectory information.
  • The image frame in which the target vehicle first appears in the video stream is taken as the start image frame, and the image frame in which the target vehicle last appears in the video stream is taken as the end image frame.
  • The trajectory information can be obtained from the multiple image frames between the start image frame and the end image frame (including the start image frame and the end image frame).
  • Each of the multiple image frames can be input into the vehicle identification model to obtain the position information of the target vehicle in that image frame, where the position information is used to represent the relative position of the target vehicle in the image frame.
  • The position information used to represent the relative position of the target vehicle in the image frame may include at least one of the following: position information of the detection frame corresponding to the target vehicle, and wheel key point information corresponding to the target vehicle.
  • the detection frame position information can be at least one of the following: the relative position of each vertex of the detection frame, the relative position of the center point of the detection frame, the relative position of the middle point of the bottom edge of the detection frame, and the relative position of the center of the circumcircle of the detection frame.
  • The displacement direction of the target vehicle can be determined according to the position information of the target vehicle at multiple times in the trajectory information, for example by determining sub-displacement directions from the position information at adjacent moments (see S203 below).
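A minimal sketch (function names are illustrative, not from the disclosure) of computing sub-displacement directions between adjacent moments and an overall displacement direction:

```python
# Sub-displacement vectors between adjacent moments, plus an overall direction angle.
import math
from typing import List, Tuple

def sub_displacements(positions: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Vectors between positions at adjacent moments."""
    return [(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(positions, positions[1:])]

def overall_direction(positions):
    """Overall displacement direction as an angle in degrees (image coordinates)."""
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    return math.degrees(math.atan2(dy, dx))

positions = [(100.0, 50.0), (110.0, 60.0), (120.0, 70.0)]
print(sub_displacements(positions))   # [(10.0, 10.0), (10.0, 10.0)]
print(overall_direction(positions))   # 45.0
```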
  • S202 Determine the target lane where the target vehicle is located according to the position information of the target vehicle at multiple times and the driving area of each lane.
  • Image segmentation is performed on any image frame in the currently captured video stream, so the lane area of each lane can be obtained, that is, the relative area of each lane in the image frame.
  • The target lane to which the target vehicle belongs can then be determined from the at least one lane in the lane information.
  • The driving direction of a lane is the standard direction prescribed by traffic law. According to the target lane where the target vehicle is located, obtained in S202, the driving direction corresponding to the target lane can be obtained; in S203, it is then determined whether the target vehicle has retrograde behavior based on whether there is a deviation between the driving direction of the target lane and the displacement direction of the target vehicle.
  • The above S203 can be implemented by the following method: when the angle difference between the displacement direction and the driving direction of the target lane is greater than a preset angle threshold, it is determined that the target vehicle has wrong-way behavior; when the angle difference between the displacement direction and the driving direction of the target lane is less than or equal to the preset angle threshold, it is determined that the target vehicle does not have wrong-way behavior.
  • The above S203 can also be implemented by the following method: obtaining at least one included angle according to the included angle between at least one sub-displacement direction corresponding to the displacement direction of the target vehicle and the driving direction of the target lane, where the displacement direction includes at least one sub-displacement direction and each sub-displacement direction is determined according to the position information at two adjacent moments; when each included angle is greater than the preset included angle threshold, it is determined that the target vehicle has retrograde behavior; when at least one included angle is less than or equal to the included angle threshold, it is determined that the target vehicle does not have retrograde behavior.
  • In the embodiment of the present disclosure, the relative position of the target vehicle in the image frames of the video stream can be accurately obtained according to the position information corresponding to multiple times in the trajectory information, so the accuracy of the displacement direction can be improved. Since the target lane where the vehicle is located is obtained and the driving direction corresponding to that target lane is determined, lanes with different driving directions can coexist in the current scene, so the application range of the embodiments of the present disclosure is broadened while the detection accuracy for wrong-way vehicles is improved.
  • FIG. 3 is an optional schematic flowchart of the vehicle retrograde detection method provided by an embodiment of the present disclosure. Based on FIG. 1 or FIG. 2, and taking FIG. 1 as an example, S102 shown in FIG. 1 can be updated to S301, which will be described in conjunction with the steps shown in FIG. 3.
  • In S301, in response to a lane information acquisition condition, the lane information in the video stream is acquired through the lane segmentation model. The lane information acquisition conditions include at least one of the following: (1) the image acquisition device rotates; (2) the image acquisition device is initialized; wherein the image acquisition device is used to obtain the image information in the current real scene and output the video stream. It should be noted that when any one of the above lane information acquisition conditions is triggered, the lane information in the video stream is acquired through the lane segmentation model.
  • The rotation of the image acquisition device can be detected by the following method: extracting multiple image frames from the video stream, where the multiple image frames include at least the current image frame and a historical image frame adjacent to the current image frame, and determining from the current image frame and the historical image frame whether the image capture device has rotated. Further, in the embodiment of the present disclosure, a current static sub-image and a historical static sub-image can be cropped from the current image frame and the historical image frame, respectively, according to a preset static area, and the current static sub-image and the historical static sub-image can be compared; when the degree of difference between them is greater than a preset threshold, it is determined that the image capture device has rotated.
  • the preset static area may be determined according to a plurality of image frames acquired during a period of time when the image acquisition device does not rotate. For a road scene, the static area may be the area where environmental objects such as street lamps, trees, and buildings on both sides of the road are located, and the non-static area opposite to the static area is the area on the inner side of the road where vehicles travel.
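A hedged sketch of the rotation check described above: crop the preset static region from the current and the adjacent historical image frame and compare them. The similarity measure (mean absolute difference) and the threshold value are assumptions, not specified by the disclosure:

```python
# Hedged sketch: compare a region that should be static; a large change suggests
# the capture device has rotated (threshold and metric are illustrative).
import numpy as np

def camera_rotated(current_frame: np.ndarray,
                   historical_frame: np.ndarray,
                   static_region: tuple,          # (x, y, width, height) in pixels
                   diff_threshold: float = 25.0) -> bool:
    x, y, w, h = static_region
    current_patch = current_frame[y:y + h, x:x + w].astype(np.float32)
    historical_patch = historical_frame[y:y + h, x:x + w].astype(np.float32)
    mean_abs_diff = float(np.mean(np.abs(current_patch - historical_patch)))
    return mean_abs_diff > diff_threshold

# Example with synthetic frames: a shifted frame trips the check.
rng = np.random.default_rng(0)
frame_a = rng.integers(0, 255, (480, 640), dtype=np.uint8)
frame_b = np.roll(frame_a, 40, axis=1)            # simulate a horizontal shift/rotation
print(camera_rotated(frame_b, frame_a, (0, 0, 200, 200)))   # True
print(camera_rotated(frame_a, frame_a, (0, 0, 200, 200)))   # False
```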
  • When the image acquisition device rotates, the capture angle of the image capture device for the current scene changes, so the trajectory information of the target vehicle collected at the current capture angle no longer matches the pre-stored lane information; therefore, the lane information needs to be re-determined according to the video stream.
  • When the image acquisition device does not rotate, the lane information is fixed; that is, in each retrograde detection of a target vehicle, the trajectory information of the target vehicle collected at the current capture angle matches the pre-stored lane information, and the lane information can be looked up directly in the database without re-determining it from the video stream.
  • FIG. 4 is an optional schematic flowchart of the vehicle retrograde detection method provided by the embodiment of the present disclosure. Based on FIG. 3 , S301 shown in FIG. 3 can be updated to S401 to S403 , which will be described in conjunction with the steps shown in FIG. 4 .
  • In S401, an image frame is acquired from the video stream and input into the preset lane segmentation model, and the segmentation result corresponding to the image frame is acquired; the segmentation result includes the driving area corresponding to each lane in the image frame, that is, the relative area of each lane in the image frame.
  • In S402, sample trajectories are acquired according to the sample trajectory acquisition rule, which includes at least one of the following: acquiring sample trajectories within a preset time interval after the time corresponding to the image frame; counting the number of acquired sample trajectories and, when the number of sample trajectories reaches the preset number, stopping the acquisition of sample trajectories.
  • each lane may include a sample track corresponding to at least one sample vehicle, wherein one sample vehicle corresponds to one sample track.
  • S402 may include: during the acquisition of sample trajectories from the video stream, sample trajectories in the current video stream are continuously acquired and the number of acquired sample trajectories is counted in real time; when the number of sample trajectories reaches the preset number, the acquisition of sample trajectories is stopped.
  • S402 may further include: during the acquisition of sample trajectories from the video stream, sample trajectories in the current video stream are continuously acquired, and when the preset time interval after the moment corresponding to the image frame elapses, the acquisition of sample trajectories is stopped.
  • S402 may further include: during the acquisition of sample trajectories from the video stream, if the counted number of sample trajectories has not reached the preset number when the preset time interval after the moment corresponding to the image frame elapses, it is determined that the sample trajectory acquisition has failed, and S402 is executed again.
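The following sketch illustrates the acquisition rules above (stop at a preset number of trajectories, stop when the preset time interval elapses, and treat the step as failed if the count is not reached in time). The trajectory source is a hypothetical callable and the default values are illustrative only:

```python
# Hedged sketch of the sample-trajectory acquisition rules (names and defaults assumed).
import time
from typing import Callable, List, Optional, Sequence, Tuple

Trajectory = Sequence[Tuple[float, float]]

def collect_sample_trajectories(next_trajectory: Callable[[], Optional[Trajectory]],
                                preset_number: int = 5,
                                time_interval_s: float = 300.0) -> List[Trajectory]:
    """Collect sample trajectories until the preset number is reached or the preset
    time interval elapses; return [] on failure so the caller can re-run the step."""
    samples: List[Trajectory] = []
    deadline = time.monotonic() + time_interval_s
    while time.monotonic() < deadline and len(samples) < preset_number:
        trajectory = next_trajectory()      # e.g. the next finished vehicle track, or None
        if trajectory:
            samples.append(trajectory)
        else:
            time.sleep(0.1)                 # nothing new yet; wait briefly
    return samples if len(samples) >= preset_number else []
```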
  • S403 Determine the driving direction corresponding to each lane according to the driving area corresponding to each lane and the sample trajectories corresponding to the plurality of sample vehicles in the driving area corresponding to each lane.
  • According to the sample trajectory of each sample vehicle, the driving direction corresponding to that sample vehicle may be determined.
  • the driving directions corresponding to multiple sample vehicles in each lane can be obtained, and the average value of the driving directions of multiple sample vehicles in any lane can be used as the driving direction corresponding to the lane.
  • For example, in the driving area corresponding to the first lane, vehicle A, vehicle B and vehicle C can be found, where the driving direction D1 of vehicle A can be obtained from the sample trajectory of vehicle A,
  • the driving direction D2 of vehicle B can be obtained from the sample trajectory of vehicle B, and the driving direction D3 of vehicle C can be obtained from the sample trajectory of vehicle C.
  • The driving direction corresponding to the first lane is then the average of D1, D2, and D3.
  • For example, if the directions are expressed as angles and D1, D2, and D3 average to 60 degrees, the driving direction corresponding to the first lane is 60 degrees; if the directions are expressed as vectors, and D1 is (0.7, 0.7), D2 is (0.5, 0.87), and D3 is (0.87, 0.5), the driving direction corresponding to the first lane is approximately (0.7, 0.7).
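A worked check of the vector example above, averaging the three sample directions of the first lane:

```python
# Average of the direction vectors D1, D2, D3 of the three sample vehicles.
def average_direction(directions):
    n = len(directions)
    return (sum(d[0] for d in directions) / n, sum(d[1] for d in directions) / n)

d1, d2, d3 = (0.7, 0.7), (0.5, 0.87), (0.87, 0.5)
avg = average_direction([d1, d2, d3])
print(tuple(round(c, 2) for c in avg))   # (0.69, 0.69), i.e. approximately (0.7, 0.7)
```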
  • In this way, the embodiment of the present disclosure determines the driving direction corresponding to each lane according to the area corresponding to each lane and the sample trajectories corresponding to the multiple sample vehicles in that lane. Analyzing multiple sample trajectories in each lane in the current scene, and determining the driving direction of each lane according to the driving directions of the multiple sample vehicles in that lane, makes it possible to accurately estimate the lane driving directions in any scene, which indirectly improves the accuracy of vehicle retrograde detection.
  • FIG. 5 is an optional schematic flowchart of the vehicle retrograde detection method provided by the embodiment of the present disclosure. Based on FIG. 4 , S403 shown in FIG. 4 can be updated to S501 to S502 , which will be described in conjunction with the steps shown in FIG. 5 .
  • In S501, for each sample trajectory, the start image frame and the end image frame corresponding to the sample trajectory are obtained, the start position is obtained in the start image frame, and the end position is obtained in the end image frame; the trajectory direction corresponding to the sample trajectory can then be determined from the start position and the end position.
  • For example, in the driving area corresponding to a lane, vehicle A and vehicle B can be found, where the start position and end position of vehicle A can be obtained from the sample trajectory of vehicle A, and the start position and end position of vehicle B can be obtained from the sample trajectory of vehicle B.
  • The trajectory direction of vehicle A is then the vector from its start position to its end position, and the trajectory direction of vehicle B is likewise the vector from its start position to its end position.
  • S502 Determine a driving direction corresponding to each lane in at least one lane according to the trajectory directions corresponding to the plurality of sample vehicles in the driving area corresponding to each lane.
  • the average value of the trajectory directions corresponding to a plurality of sample vehicles in each lane is used as the driving direction corresponding to the lane.
  • In this way, the embodiment of the present disclosure determines the trajectory directions corresponding to the multiple sample vehicles in each lane from the start position and end position of each sample trajectory, and then determines the driving direction of the lane from those trajectory directions.
  • This improves the prediction accuracy of the trajectory direction corresponding to each sample vehicle while reducing the amount of calculation; and, since the driving direction of the target lane is determined from the trajectory directions corresponding to multiple sample vehicles, errors in the estimated driving direction of the target lane caused by the abnormal driving of individual vehicles are eliminated, thereby improving the accuracy of the lane direction estimation.
  • FIG. 6 is an optional schematic flowchart of the vehicle retrograde detection method provided by an embodiment of the present disclosure. Based on FIG. 5 and the other above-mentioned embodiments, S502 shown in FIG. 5 may be updated to S601 to S602, which will be described in conjunction with the steps shown in FIG. 6.
  • S601 includes: determining the trajectory modulo length corresponding to each trajectory direction, and normalizing each trajectory direction according to its trajectory modulo length, so as to obtain the trajectory vector corresponding to each trajectory direction.
  • The trajectory direction is determined by the starting trajectory coordinate point and the ending trajectory coordinate point. The sum of the squares of the components of the trajectory direction between the starting trajectory coordinate point and the ending trajectory coordinate point is obtained first, and the square root of this sum is then taken to obtain the trajectory modulo length.
  • The ratio of the trajectory direction (from the starting trajectory coordinate point to the ending trajectory coordinate point) to the trajectory modulo length is taken as the trajectory vector corresponding to the trajectory direction.
  • For example, for a given trajectory direction, the trajectory modulo length corresponding to that direction is determined first, and the trajectory direction is normalized by the trajectory modulo length to obtain the corresponding trajectory vector.
  • S602. Determine the vector sum of multiple trajectory vectors corresponding to each lane as the driving direction corresponding to each lane.
  • That is, the driving direction corresponding to a lane is determined as the vector sum of the multiple trajectory vectors corresponding to that lane.
  • the vector sum may also be normalized, and the normalized vector sum may be used as the driving direction corresponding to the lane.
  • For example, the second lane includes several trajectory vectors, and the driving direction corresponding to the lane is their vector sum; in this example, the driving direction corresponding to the second lane is (1.7, 1.7).
  • The obtained driving direction of the second lane may also be normalized; the normalized driving direction is then approximately (0.7, 0.7).
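A minimal sketch of S601/S602 under the definitions above: normalize each trajectory direction by its modulo length, sum the resulting trajectory vectors, and optionally normalize the sum to obtain the lane's driving direction. The sample values below are illustrative and not the elided values from the example:

```python
# Normalize each trajectory direction, sum the trajectory vectors, optionally
# normalize the sum to get the lane's driving direction (illustrative values).
import math

def normalize(vec):
    length = math.hypot(vec[0], vec[1])          # trajectory modulo length
    return (vec[0] / length, vec[1] / length)

def lane_driving_direction(trajectory_directions, normalize_result=True):
    vectors = [normalize(d) for d in trajectory_directions]
    sx, sy = sum(v[0] for v in vectors), sum(v[1] for v in vectors)
    return normalize((sx, sy)) if normalize_result else (sx, sy)

# Two trajectory directions along the image diagonal: their unit vectors sum to
# about (1.41, 1.41), and the normalized driving direction is about (0.71, 0.71).
dirs = [(10.0, 10.0), (20.0, 20.0)]
print(lane_driving_direction(dirs, normalize_result=False))  # ~(1.414, 1.414)
print(lane_driving_direction(dirs))                          # ~(0.707, 0.707)
```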
  • In this way, the embodiment of the present disclosure normalizes each trajectory direction and obtains the driving direction corresponding to each lane from the sum of the normalized trajectory vectors, which further reduces the amount of data calculation when predicting the driving direction of the target lane. While the accuracy of the calculation is ensured, the estimation efficiency of the driving direction of the target lane is improved, thereby improving the detection efficiency for the vehicle's wrong-way behavior.
  • FIG. 7 is an optional schematic flowchart of the vehicle retrograde detection method provided by the embodiment of the present disclosure. Based on FIG. 6 and other above-mentioned embodiments, the method further includes S701 to S702 , which will be described in conjunction with the steps shown in FIG. 7 .
  • the trajectory offset quantization value is used to represent the degree of dispersion among the plurality of trajectory vectors corresponding to the lanes.
  • The trajectory offset quantization value may be, for example, the range, the sum of squared deviations from the mean, the variance, the standard deviation, or the coefficient of variation of the driving direction corresponding to the lane and the multiple trajectory vectors.
  • For the range, the driving direction and the trajectory vectors corresponding to the lane, which are in vector form, can first be converted into angles, and the maximum angle and the minimum angle among them are determined; the difference between the maximum angle and the minimum angle is then used as the trajectory offset quantization value. For example, the driving direction corresponding to the lane, trajectory vector 1, and trajectory vector 2 can each be converted into an angle in degrees, and the difference between the largest and smallest of those angles is the trajectory offset quantization value.
  • Alternatively, for the standard deviation, the driving direction and the trajectory vectors corresponding to the lane, which are in vector form, can first be converted into angles, and the standard deviation of the angles obtained after the conversion can be calculated as the trajectory offset quantization value. For example, if the angles obtained from the driving direction corresponding to the lane, trajectory vector 1, and trajectory vector 2 are 45 degrees, 60 degrees and 30 degrees, the standard deviation is approximately 12.25.
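A worked check of the standard-deviation example above (population standard deviation of 45, 60 and 30 degrees):

```python
# Trajectory offset quantization as the population standard deviation of the angles.
import math

def trajectory_offset_std(angles_deg):
    mean = sum(angles_deg) / len(angles_deg)
    variance = sum((a - mean) ** 2 for a in angles_deg) / len(angles_deg)
    return math.sqrt(variance)

print(round(trajectory_offset_std([45.0, 60.0, 30.0]), 2))   # 12.25
```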
  • When the trajectory offset quantization value exceeds the preset accuracy threshold, it means that the trajectory vectors of the sample vehicles in the lane diverge widely from one another, that is, the collected trajectory vectors may include several trajectories with retrograde behavior.
  • In that case, the accuracy of the generated driving direction for the lane is low, and the lane information needs to be re-determined according to the video stream. In one embodiment, the process may jump back to the step of "determining lane information according to the video stream" in the above embodiment.
  • In this way, when the degree of dispersion of the multiple trajectory vectors is relatively high, the embodiment of the present disclosure treats the accuracy of the determined driving direction of the lane as low and regenerates the lane information, which avoids misjudgments of wrong-way behavior caused by sample trajectories that cannot accurately reflect the lane information in extreme scenarios.
  • Since the determination of retrograde behavior is performed only when the degree of dispersion of the multiple trajectory vectors is low, the detection accuracy for vehicle retrograde behavior can be improved.
  • Wrong-way driving is a very serious traffic violation.
  • Even a small number of wrong-way vehicles can cause serious traffic accidents, seriously endangering road safety and road traffic efficiency.
  • Automatic retrograde detection plays a very important role in traffic control in traffic-police applications.
  • The detection of wrong-way vehicles mainly relies on the large number of cameras set up on the road.
  • A large number of surveillance videos are obtained, methods such as optical flow and background modeling are then used to detect and track vehicle targets, and the obtained vehicle trajectory information is used to judge whether a vehicle is going in the wrong direction, supplemented by manual review to judge the authenticity of the retrograde event.
  • Many wrong-way vehicles can be detected by the above method, but it is still limited in many scenarios, for example when the lane has no marked range or direction, or when the surveillance camera has been deflected so that the angle of the video picture differs from the original. In these scenarios, the above method cannot continue to work.
  • In view of this, an embodiment of the present disclosure proposes a vehicle reverse-travel detection method that monitors the rotation state of the camera, automatically divides the lanes, and estimates the correct driving direction of each lane, so as to complete the vehicle retrograde detection task without manual marking of driving directions and despite possible rotation of the camera.
  • With the proposed vehicle retrograde detection algorithm based on panoramic segmentation, lane areas with different driving directions are recognized, and the driving direction of each area is estimated to replace manual labeling, which is convenient for large-scale deployment. In addition, because camera rotation judgment is introduced, the system can restart quickly after the surveillance camera has been rotated, avoiding false alarms due to changes in the shooting angle; and because certain filtering logic and filtering thresholds are set, false alarms caused by a small number of detection errors can be avoided.
  • The vehicle retrograde detection system is a video abnormal-event detection system.
  • the input is the video captured by the surveillance camera, and the output is the retrograde vehicle trajectory.
  • The system includes the following parts: a video structuring part, configured to structure the input video stream; it can detect and track vehicles and output, for each vehicle, its driving trajectory, including the detection frame and wheel key points of the vehicle in each frame.
  • The scene understanding part is configured to take a screenshot of the surveillance video as input at regular intervals and determine whether the camera has rotated compared with the previous input. If this is the first input or rotation has occurred, the panoramic segmentation results of the different lanes are output and the lane direction estimation part is called.
  • The lane direction estimation part is configured to take as input the vehicle trajectories in each lane over the following period of time (usually 5 minutes), and to take the average of their moving directions as the lane direction estimation result and output it.
  • the vehicle reverse-travel detection part is configured to determine whether there is reverse-travel based on the trajectory of each vehicle and the driving direction of the lane to which it belongs. If there is retrograde, output the information of the trajectory.
  • the lane direction is the driving direction of the target lane in the above embodiment.
  • FIG. 8 is an optional system schematic diagram of the vehicle reverse-travel detection system provided by the embodiment of the present disclosure, including: a video structuring part 801 , a scene understanding part 802 , a lane direction estimation part 803 , and a vehicle reverse-travel detection part 804 .
  • the input of the video structuring part 801 is the real-time video stream of the surveillance camera, and the output is the driving track of each vehicle, including the detection frame and wheel key points of the vehicle in each frame; wherein,
  • the detection and tracking tool can be used to obtain the vehicle trajectory appearing in the video, extract its detection frame and key point information, and use it as the input of the lane direction estimation part and the vehicle retrograde detection part.
  • FIG. 9 is a schematic diagram of an optional vehicle detection and tracking provided by an embodiment of the present disclosure.
  • a vehicle in each frame of an image in a current video stream can be detected, and a label frame corresponding to the vehicle can be obtained.
  • The input of the scene understanding part 802 is a screenshot of a certain frame of the video stream, and the output is the division of the picture into lanes, that is, the panoramic segmentation results of the different lanes.
  • A lane segmentation model and a camera rotation model are used: the lane segmentation serves as the basis for judging the lane to which each vehicle belongs, and the camera rotation judgment decides whether the lane direction estimation part needs to be started again.
  • FIG. 10A is a schematic diagram of a lane before segmentation in a lane segmentation process provided by an embodiment of the present disclosure, and the lane segmentation model can be used to segment the lane of each frame of image in the current video stream to obtain the area corresponding to the lane in each frame of image .
  • the lane image obtained after segmentation can be as shown in Figure 10B.
  • the input of the lane direction estimation part 803 is the vehicle trajectory information obtained by the video structuring part, and the segmentation map output by the scene understanding part; the output is the average driving direction of each lane.
  • Based on the vehicle trajectory information obtained from the video structuring part, the lane direction estimation algorithm reads, from the segmentation map output by the scene understanding part, the label of the lane to which the center point of the bottom edge of each vehicle detection frame belongs in each frame; within the same lane, the start-to-end directions of all trajectories are averaged and output as the result.
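A minimal sketch of this lookup: read the lane label at the center of the bottom edge of a detection frame from a 2-D segmentation map of per-pixel lane labels. The box format (x1, y1, x2, y2) and the label values are assumptions:

```python
# Lane label at the bottom-edge center of a detection frame, read from a label map.
import numpy as np

def lane_label_of_box(segmentation_map: np.ndarray, box) -> int:
    x1, y1, x2, y2 = box
    cx, cy = int(round((x1 + x2) / 2.0)), int(round(y2))   # bottom-edge center
    cy = min(max(cy, 0), segmentation_map.shape[0] - 1)    # clamp to image bounds
    cx = min(max(cx, 0), segmentation_map.shape[1] - 1)
    return int(segmentation_map[cy, cx])                   # row = y, column = x

seg = np.zeros((480, 640), dtype=np.int32)
seg[:, 320:] = 2                                           # right half labelled lane 2
print(lane_label_of_box(seg, (400, 100, 500, 200)))        # 2
```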
  • FIG. 11 is a schematic diagram of an optional lane direction estimation provided by an embodiment of the present disclosure, and the lane direction estimation part can estimate the lane direction of each lane in the current video stream.
  • A vector is defined to represent the direction of a certain vehicle trajectory. For a trajectory Tx, let the center point of the bottom edge of the detection frame in its start frame be (x_s, y_s) and the center point of the bottom edge of the detection frame in its end frame be (x_e, y_e); then the direction of Tx is the vector (x_e - x_s, y_e - y_s). The average direction of the lane is then defined, as shown in Equation 1 and Equation 2, by normalizing the direction vector of each trajectory in the lane and summing the normalized vectors.
  • If the lane direction estimation part cannot collect enough trajectories (no fewer than 5) within 5 minutes of running, or the standard deviation of the trajectory directions is too large (greater than 20), the direction estimation of that lane is considered to have failed.
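A hedged sketch of the lane direction estimation with the failure conditions stated above (at least 5 trajectories and an angle standard deviation no greater than 20 degrees); the vector averaging follows the earlier normalize-and-sum description, and all names are illustrative:

```python
# Lane direction estimation with sufficiency checks; returns None on failure.
import math

def estimate_lane_direction(trajectory_directions,
                            min_trajectories=5, max_std_deg=20.0):
    if len(trajectory_directions) < min_trajectories:
        return None                                        # not enough trajectories
    units = []
    for dx, dy in trajectory_directions:
        length = math.hypot(dx, dy)
        units.append((dx / length, dy / length))
    angles = [math.degrees(math.atan2(uy, ux)) for ux, uy in units]
    mean_angle = sum(angles) / len(angles)
    std = math.sqrt(sum((a - mean_angle) ** 2 for a in angles) / len(angles))
    if std > max_std_deg:
        return None                                        # trajectories too scattered
    sx, sy = sum(u[0] for u in units), sum(u[1] for u in units)
    norm = math.hypot(sx, sy)
    return (sx / norm, sy / norm)                          # unit lane direction

dirs = [(10, 9), (12, 11), (9, 10), (11, 12), (10, 10)]
print(estimate_lane_direction(dirs))                       # roughly (0.71, 0.71)
```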
  • The input of the retrograde detection part 804 is the vehicle trajectory information obtained by the video structuring part, the segmentation map output by the scene understanding part, and the lane direction output by the direction estimation part; the output is all retrograde trajectories.
  • the wrong-way detection section 804 is activated after the lane direction estimation section 803 succeeds.
  • Suppose the center points of the bottom edge of the vehicle detection frame at three consecutive moments are (x1, y1), (x2, y2) and (x3, y3), respectively. The displacement directions between them are then (x2 - x1, y2 - y1) and (x3 - x2, y3 - y2). If these displacement directions exist (are non-zero) and the included angle between each of them and the estimated lane direction is greater than 120°, the trajectory is flagged as retrograde, subject to the additional filtering conditions mentioned above.
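A hedged sketch of this wrong-way check: each displacement direction between consecutive bottom-edge center points is compared with the estimated lane direction, and an included angle above 120 degrees counts as travelling against the lane. The exact additional filtering conditions are not specified in the text and are omitted here:

```python
# Included-angle check between displacement directions and the lane direction.
import math

def included_angle_deg(v1, v2):
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def is_retrograde(points, lane_direction, angle_threshold_deg=120.0):
    """points: bottom-edge centers (x, y) at consecutive moments."""
    displacements = [(x2 - x1, y2 - y1)
                     for (x1, y1), (x2, y2) in zip(points, points[1:])
                     if (x2 - x1, y2 - y1) != (0.0, 0.0)]
    if not displacements:
        return False
    return all(included_angle_deg(d, lane_direction) > angle_threshold_deg
               for d in displacements)

lane_direction = (1.0, 0.0)                       # lane runs toward +x
print(is_retrograde([(300, 50), (280, 52), (260, 49)], lane_direction))  # True
print(is_retrograde([(100, 50), (120, 52), (140, 49)], lane_direction))  # False
```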
  • the vehicle retrograde detection part can output the trajectory information of the target vehicle with retrograde behavior, wherein, multiple image frames corresponding to the retrograde behavior can be output.
  • the retrograde behavior of the target vehicle may be as shown in FIGS. 12A to 12D , wherein, FIG. 12A may be the image frame of the retrograde beginning in the retrograde behavior; FIG. 12B may be the image frame of the retrograde in the retrograde behavior; FIG. 12C may be The image frame of the end of the retrograde movement in the retrograde behavior; FIG. 12D can be a detailed image frame of the vehicle during the retrograde movement in the retrograde behavior.
  • The following technical effects can be achieved: (1) Real-time early warning: the traffic police can use this system to detect dangerous retrograde behavior in time and dispatch police forces to stop the behavior, which reduces the risk of accidents. (2) After-the-fact accountability: the traffic police can use this system to discover retrograde behavior that was missed at the time, as the basis for a fine.
  • FIG. 13 is a schematic structural diagram of a vehicle reverse-travel detection device provided by an embodiment of the present disclosure. As shown in FIG. 13 , the vehicle reverse-travel detection device 1300 includes a first determination part 1301 , an acquisition part 1302 and a second determination part 1303 , in:
  • the first determining part 1301 is configured to determine the track information of the target vehicle in the video stream
  • the obtaining part 1302 is configured to obtain lane information in the video stream through a preset lane segmentation model, where the lane information includes a driving area and a driving direction corresponding to each of the at least one lane;
  • the second determining part 1303 is configured to determine whether the target vehicle has a wrong-way behavior according to the track information and the driving area and driving direction of each lane.
  • The second determining part 1303 is further configured to: determine the displacement direction of the target vehicle according to the position information of the target vehicle at multiple times in the trajectory information; determine the target lane where the target vehicle is located according to the position information of the target vehicle at multiple times and the driving area of each lane; and determine whether the target vehicle has retrograde behavior based on whether there is a deviation between the driving direction of the target lane and the displacement direction of the target vehicle.
  • The acquiring part 1302 is further configured to acquire the lane information in the video stream through the lane segmentation model in response to a lane information acquisition condition; the lane information acquisition condition includes at least one of the following: the image acquisition device rotating; the image acquisition device being initialized; wherein the image acquisition device is used to acquire the image information in the current real scene and output the video stream.
  • The acquiring part 1302 is further configured to extract multiple image frames from the video stream, where the multiple image frames include at least the current image frame and a historical image frame adjacent to the current image frame, and to determine from the current image frame and the historical image frame whether the image capture device has rotated.
  • The acquiring part 1302 is further configured to: input an image frame in the video stream into the lane segmentation model to obtain a segmentation result of the image frame, where the segmentation result includes the driving area corresponding to each lane in the at least one lane; acquire, according to the sample trajectory acquisition rule, the sample trajectories corresponding to multiple sample vehicles in the driving area corresponding to each lane; and determine the driving direction corresponding to each lane according to the driving area corresponding to each lane and the sample trajectories corresponding to the multiple sample vehicles in that driving area.
  • The sample trajectory includes the start position and the end position of the sample vehicle; the acquiring part 1302 is further configured to determine, according to the start positions and end positions corresponding to the multiple sample vehicles in the driving area corresponding to each lane, the trajectory directions corresponding to the multiple sample vehicles in that driving area, and to determine the driving direction corresponding to each lane in the at least one lane according to the trajectory directions corresponding to the multiple sample vehicles in the driving area corresponding to that lane.
  • the sample trajectory acquisition rule includes at least one of the following: acquiring sample trajectories at preset time intervals; counting the number of acquired sample trajectories, and stopping when the number of sample trajectories reaches a preset number Acquisition of sample trajectories.
  • The acquiring part 1302 is further configured to normalize each of the multiple trajectory directions to obtain a trajectory vector corresponding to each trajectory direction, and to determine the vector sum of the multiple trajectory vectors corresponding to each lane as the driving direction corresponding to that lane.
  • the second determining part 1303 is further configured to determine the quantized value of the trajectory offset corresponding to each lane according to the driving direction corresponding to each lane and the plurality of trajectory vectors corresponding to each lane ; wherein, the track offset quantization value is used to represent the degree of dispersion between the multiple track vectors corresponding to the lane; when the track offset quantization value exceeds the preset accuracy threshold, the lane information is re-determined according to the video stream.
  • the second determining part 1303 is further configured to obtain at least one included angle according to the included angle between at least one sub-displacement direction corresponding to the displacement direction of the target vehicle and the driving direction of the target lane; wherein, The displacement direction includes at least one sub-displacement direction, and the sub-displacement direction is determined according to the position information at two adjacent moments; in the case that each included angle is greater than the preset included angle threshold, it is determined that the target vehicle has retrograde behavior; When at least one included angle is less than or equal to the included angle threshold, it is determined that the target vehicle does not have a retrograde behavior.
  • If implemented in the form of a software product, the above vehicle retrograde detection method may also be stored in a computer-readable storage medium.
  • The computer software product is stored in a storage medium and includes several instructions that cause a terminal to execute all or part of the method described in the embodiments of the present disclosure.
  • The terminal may be a smartphone with a camera, a tablet computer, or the like.
  • The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (Read Only Memory, ROM), a magnetic disk, an optical disc, or other media that can store program code.
  • embodiments of the present disclosure are not limited to any particular combination of hardware and software.
  • a "part" may be a part of a circuit, a part of a processor, a part of a program or software, etc., of course, a unit, a module or a non-modularity.
  • an embodiment of the present disclosure provides a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, implements the steps in any of the vehicle retrograde detection methods in the foregoing embodiments.
  • a chip is also provided, the chip includes a programmable logic circuit and/or program instructions, and when the chip is running, it is configured to implement any one of the above embodiments. Describe the steps in the vehicle retrograde detection method.
  • a computer program product is also provided.
  • when the computer program product is executed by a processor of a terminal, it is configured to implement the steps of any one of the vehicle retrograde detection methods in the foregoing embodiments.
  • FIG. 14 is a schematic diagram of a hardware entity of a device provided by an embodiment of the present disclosure.
  • the device 1400 includes a memory 1410 and a processor 1420; the memory 1410 stores a computer program that can run on the processor 1420, and when the processor 1420 executes the program, the steps in any one of the vehicle retrograde detection methods in the embodiments of the present disclosure are implemented.
  • the memory 1410 is configured to store instructions and applications executable by the processor 1420, and may also cache data to be processed or already processed by the processor 1420 and by various parts of the terminal (for example, image data, audio data, voice communication data and video communication data); this may be implemented through a flash memory (FLASH) or a random access memory (Random Access Memory, RAM).
  • when the processor 1420 executes the program, the steps of any one of the above-mentioned methods for detecting the wrong direction of a vehicle are implemented.
  • the processor 1420 generally controls the overall operation of the device 1400.
  • the above-mentioned processor may be at least one of: an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a digital signal processor (Digital Signal Processor, DSP), a digital signal processing device (Digital Signal Processing Device, DSPD), a programmable logic device (Programmable Logic Device, PLD), a field programmable gate array (Field Programmable Gate Array, FPGA), a central processing unit (Central Processing Unit, CPU), a controller, a microcontroller, or a microprocessor.
  • the electronic device implementing the functions of the above processor may also be another device, which is not limited in the embodiments of the present disclosure.
  • the above-mentioned computer-readable storage medium/memory may be a read-only memory (Read Only Memory, ROM), a programmable read-only memory (Programmable Read-Only Memory, PROM), an erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM), an electrically erasable programmable read-only memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), a magnetic random access memory (FRAM), a flash memory (Flash Memory), a magnetic surface memory, an optical disk, or a compact disc read-only memory (Compact Disc Read-Only Memory, CD-ROM); it may also be any of various terminals including one or any combination of the above memories, such as a mobile phone, a computer, a tablet device, or a personal digital assistant.
  • the disclosed terminal and method may be implemented in other manners.
  • the terminal embodiments described above are illustrative.
  • the division of the units is a logical function division.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling, direct coupling, or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
  • the units described above as separate components may or may not be physically separated, and a component displayed as a unit may or may not be a physical unit; it may be located in one place or distributed over multiple network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments of the present disclosure.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, or each unit may be used separately as one unit, or two or more units may be integrated into one unit; the above integrated unit may be implemented either in the form of hardware or in the form of hardware plus software functional units.
  • if the above-mentioned integrated units in the embodiments of the present disclosure are implemented in the form of software functional parts and sold or used as independent products, they may also be stored in a computer-readable storage medium.
  • the technical solutions of the embodiments of the present disclosure, in essence or in the part contributing to the related technologies, may be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions to cause a device (for example, an automated test line) to perform all or part of the methods described in the various embodiments of the present disclosure.
  • the aforementioned storage medium includes various media that can store program codes, such as a removable storage device, a ROM, a magnetic disk, or an optical disk.
  • Embodiments of the present disclosure provide a vehicle retrograde detection method, apparatus, device, computer-readable storage medium, and computer program product; the vehicle retrograde detection method includes: determining trajectory information of a target vehicle in a video stream; obtaining lane information in the video stream through a preset lane segmentation model, the lane information including the driving area and driving direction corresponding to each lane in at least one lane; and determining, according to the trajectory information and the driving area and driving direction of each lane, whether the target vehicle has wrong-way behavior.
  • In this way, the accuracy and efficiency of vehicle retrograde detection can be improved.
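
The following Python sketch, referenced in the items above, illustrates one possible realization of the lane-direction estimation by normalized trajectory vectors, the trajectory offset quantized value, and the included-angle wrong-way test. It is an illustrative assumption rather than the claimed implementation: the function names, the use of mean angular deviation as the offset measure, and the 90-degree default angle threshold are choices made only for this example.

import math
from typing import List, Sequence, Tuple

Point = Tuple[float, float]  # (x, y) position in image coordinates


def _normalize(vx: float, vy: float) -> Tuple[float, float]:
    # Scale a displacement to a unit-length trajectory vector.
    norm = math.hypot(vx, vy)
    if norm == 0.0:
        return 0.0, 0.0
    return vx / norm, vy / norm


def estimate_lane_direction(samples: Sequence[Tuple[Point, Point]]) -> Tuple[float, float]:
    # Each sample is a (start_position, end_position) pair of a sample vehicle
    # observed inside the lane's driving area.  Every start-to-end displacement
    # is normalized to a unit trajectory vector, and the vector sum of all unit
    # vectors is taken as the lane's driving direction.
    sx, sy = 0.0, 0.0
    for (x0, y0), (x1, y1) in samples:
        ux, uy = _normalize(x1 - x0, y1 - y0)
        sx += ux
        sy += uy
    return _normalize(sx, sy)


def trajectory_offset(samples: Sequence[Tuple[Point, Point]],
                      lane_direction: Tuple[float, float]) -> float:
    # One possible trajectory offset quantized value: the mean angular deviation
    # (in degrees) between each unit trajectory vector and the estimated lane
    # direction.  A large value means the samples are too scattered and the lane
    # information should be re-determined from the video stream.
    dx, dy = lane_direction
    deviations = []
    for (x0, y0), (x1, y1) in samples:
        ux, uy = _normalize(x1 - x0, y1 - y0)
        cos_a = max(-1.0, min(1.0, ux * dx + uy * dy))
        deviations.append(math.degrees(math.acos(cos_a)))
    return sum(deviations) / len(deviations) if deviations else 0.0


def is_wrong_way(track: List[Point],
                 lane_direction: Tuple[float, float],
                 angle_threshold_deg: float = 90.0) -> bool:
    # The target vehicle's trajectory is split into sub-displacements between
    # adjacent moments; the vehicle is flagged as wrong-way only if every
    # sub-displacement makes an angle with the lane's driving direction that is
    # greater than the threshold.
    dx, dy = lane_direction
    angles = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        ux, uy = _normalize(x1 - x0, y1 - y0)
        cos_a = max(-1.0, min(1.0, ux * dx + uy * dy))
        angles.append(math.degrees(math.acos(cos_a)))
    return bool(angles) and all(a > angle_threshold_deg for a in angles)


if __name__ == "__main__":
    # Hypothetical samples: three vehicles driving roughly left to right.
    samples = [((0, 0), (100, 5)), ((10, 2), (120, 0)), ((5, -1), (90, 3))]
    lane_dir = estimate_lane_direction(samples)
    print("lane direction:", lane_dir)
    print("trajectory offset (deg):", trajectory_offset(samples, lane_dir))
    # A target vehicle moving right to left should be flagged as wrong-way.
    print("wrong-way:", is_wrong_way([(100, 0), (60, 1), (20, 2)], lane_dir))

Any measure of dispersion could play the role of the trajectory offset quantized value; mean angular deviation is used in this sketch only because it reads directly against an angle threshold.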

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a vehicle wrong-way driving detection method, an apparatus, a device, a computer-readable storage medium, and a computer program product; the method comprises the steps of: determining trajectory information of a target vehicle in a video stream (S101); obtaining lane information in the video stream by means of a preset lane segmentation model, the lane information comprising a vehicle driving area and a vehicle driving direction corresponding to each lane of at least one lane (S102); and determining, according to the trajectory information and the vehicle driving area and vehicle driving direction of each lane, whether the target vehicle exhibits wrong-way driving behavior (S103).
PCT/CN2021/086694 2020-12-21 2021-04-12 Procède de détection de déplacement à contresens de véhicule, appareil, dispositif, support d'enregistrement lisible par ordinateur et produit programme d'ordinateur WO2022134387A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011520348.5A CN112750317A (zh) 2020-12-21 2020-12-21 车辆逆行检测方法、装置、设备及计算机可读存储介质
CN202011520348.5 2020-12-21

Publications (1)

Publication Number Publication Date
WO2022134387A1 true WO2022134387A1 (fr) 2022-06-30

Family

ID=75648102

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/086694 WO2022134387A1 (fr) 2020-12-21 2021-04-12 Procède de détection de déplacement à contresens de véhicule, appareil, dispositif, support d'enregistrement lisible par ordinateur et produit programme d'ordinateur

Country Status (2)

Country Link
CN (1) CN112750317A (fr)
WO (1) WO2022134387A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113602264B (zh) * 2021-08-17 2023-02-28 北京市商汤科技开发有限公司 一种车辆行驶行为检测方法、装置、设备和存储介质
US20230111391A1 (en) * 2021-10-07 2023-04-13 Here Global B.V. Method, apparatus, and computer program product for identifying wrong-way driven vehicles
CN114332708A (zh) * 2021-12-29 2022-04-12 深圳市商汤科技有限公司 交通行为检测方法、装置、电子设备及存储介质
CN114360261B (zh) * 2021-12-30 2023-05-19 北京软通智慧科技有限公司 车辆逆行的识别方法、装置、大数据分析平台和介质
CN114913695B (zh) * 2022-06-21 2023-10-31 上海西井科技股份有限公司 基于ai视觉的车辆逆行检测方法、系统、设备及存储介质
CN116434560B (zh) * 2023-06-15 2023-08-25 跨越速运集团有限公司 一种违规行驶的识别方法、装置、设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004234486A (ja) * 2003-01-31 2004-08-19 Matsushita Electric Ind Co Ltd 車両逆走検知装置
CN101937614A (zh) * 2010-06-12 2011-01-05 北京中科卓视科技有限责任公司 一种即插即用的交通综合检测系统
KR101416457B1 (ko) * 2013-11-22 2014-08-06 주식회사 넥스파시스템 역주행 차량 및 보행자 위험상황 인지가 적용된 도로방범시스템
CN104464305A (zh) * 2014-12-11 2015-03-25 天津艾思科尔科技有限公司 车辆逆行智能检测装置与方法
CN111611901A (zh) * 2020-05-15 2020-09-01 北京百度网讯科技有限公司 车辆逆行检测方法、装置、设备以及存储介质

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106571039A (zh) * 2016-08-22 2017-04-19 中海网络科技股份有限公司 一种高速公路违章行为自动抓拍系统
KR101964063B1 (ko) * 2017-06-12 2019-08-13 윤현민 역주행 방지 시스템
CN111898491A (zh) * 2020-07-15 2020-11-06 上海高德威智能交通系统有限公司 一种车辆逆向行驶的识别方法、装置及电子设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004234486A (ja) * 2003-01-31 2004-08-19 Matsushita Electric Ind Co Ltd 車両逆走検知装置
CN101937614A (zh) * 2010-06-12 2011-01-05 北京中科卓视科技有限责任公司 一种即插即用的交通综合检测系统
KR101416457B1 (ko) * 2013-11-22 2014-08-06 주식회사 넥스파시스템 역주행 차량 및 보행자 위험상황 인지가 적용된 도로방범시스템
CN104464305A (zh) * 2014-12-11 2015-03-25 天津艾思科尔科技有限公司 车辆逆行智能检测装置与方法
CN111611901A (zh) * 2020-05-15 2020-09-01 北京百度网讯科技有限公司 车辆逆行检测方法、装置、设备以及存储介质

Also Published As

Publication number Publication date
CN112750317A (zh) 2021-05-04

Similar Documents

Publication Publication Date Title
WO2022134387A1 (fr) Procède de détection de déplacement à contresens de véhicule, appareil, dispositif, support d'enregistrement lisible par ordinateur et produit programme d'ordinateur
KR101758576B1 (ko) 물체 탐지를 위한 레이더 카메라 복합 검지 장치 및 방법
Grassi et al. Parkmaster: An in-vehicle, edge-based video analytics service for detecting open parking spaces in urban environments
US10212397B2 (en) Abandoned object detection apparatus and method and system
CN112085952B (zh) 监控车辆数据方法、装置、计算机设备及存储介质
WO2020094088A1 (fr) Procédé de capture d'image, caméra de surveillance et système de surveillance
CN111860318A (zh) 一种建筑工地行人徘徊检测方法、装置、设备及存储介质
CN111127508B (zh) 一种基于视频的目标跟踪方法及装置
AU2014240669B2 (en) Object monitoring system, object monitoring method, and monitoring target extraction project
CN111105437B (zh) 车辆轨迹异常判断方法及装置
CN108647587B (zh) 人数统计方法、装置、终端及存储介质
CN111738240A (zh) 区域监测方法、装置、设备及存储介质
CN109446946B (zh) 一种基于多线程的多摄像头实时检测方法
WO2018058530A1 (fr) Procédé et dispositif de détection de cible, et appareil de traitement d'images
WO2023124133A1 (fr) Procédé et appareil de détection de comportement de trafic, dispositif électronique, support de stockage, et produit de programme informatique
WO2014193220A2 (fr) Système et procédé pour identification de plaques d'immatriculation multiples
CN112183367A (zh) 车辆数据检错方法、装置、服务器及存储介质
CN111898491A (zh) 一种车辆逆向行驶的识别方法、装置及电子设备
CN110969864A (zh) 一种车速检测方法、车辆行驶事件检测方法及电子设备
CN111275957A (zh) 一种交通事故信息采集方法、系统及摄像机
CN115761655A (zh) 一种目标跟踪方法及装置
CN111294552A (zh) 图像采集设备确定方法及装置
CN111353342A (zh) 肩头识别模型训练方法、装置、人数统计方法、装置
CN114913470B (zh) 一种事件检测方法及装置
WO2023184833A1 (fr) Procédé et appareil de traitement de résultat de détection, dispositif, support et produit-programme d'ordinateur

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21908399

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 28/09/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21908399

Country of ref document: EP

Kind code of ref document: A1