CN112668466A - Lane line identification method for address event data stream

Lane line identification method for address event data stream

Info

Publication number
CN112668466A
Authority
CN
China
Prior art keywords
event
lane line
value
lane
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011572704.8A
Other languages
Chinese (zh)
Inventor
张远辉
许璐钧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Jiliang University
Original Assignee
China Jiliang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Jiliang University
Priority to CN202011572704.8A
Publication of CN112668466A
Legal status: Pending

Landscapes

  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a lane line identification method for AER (Address Event Representation) data streams, which mainly solves the problem of lane line identification under motion blur and in dimly lit environments. The implementation scheme is as follows: (1) convert conventional lane line image frames into event stream data using an event simulator; (2) denoise the event data by filtering, exploiting the asynchronous output characteristic of the event stream; (3) obtain event frames by LIF (Leaky Integrate-and-Fire) coding; (4) apply a perspective transformation to the event frames to obtain a bird's-eye view of the lane lines; (5) locate the lanes quickly from histogram peaks and fit them with B-spline curves; (6) use a lane tracking strategy to improve the recognition efficiency and robustness of the algorithm, achieving fast lane line identification and tracking. The method reduces the time consumed per image frame in identifying lane lines while maintaining a high detection rate.

Description

Lane line identification method for address event data stream
Technical Field
The invention relates to the technical field of lane line identification, in particular to a lane line identification method for address event data streams.
Background
Lane line identification is one of the key technologies of current advanced driver assistance systems. On-board sensors help the driver perceive potential safety hazards in the driving environment: by identifying the lane lines, the relative position of the vehicle within its lane is analyzed to judge whether the vehicle is at risk of lane departure, and early warning is given for possible dangerous situations. Most existing lane line identification methods rely on a conventional visible-light camera and can be grouped into feature-based methods, model-based methods, and machine-learned image segmentation methods. When the vehicle travels at high speed, motion blur arises easily, and the consecutive image frames, which contain a large amount of redundant information, are extremely wasteful of computing power, memory space, and time.
The event camera is a neuromorphic sensor that has attracted wide attention from researchers in recent years. It continuously monitors the light intensity at each pixel of its photosensitive array over time; when the relative change exceeds a threshold, the pixel asynchronously and independently outputs its position and the change attribute, and this datum containing the position information and change attribute is called an event. Conventional cameras capture and store video in "frames", taking a complete picture at a fixed period regardless of brightness changes, so every frame carries visual information from all pixels. This imaging mechanism introduces a great deal of information redundancy and places heavy demands on bandwidth and storage, whereas the event-driven mode fully meets the requirements of small data volume and high temporal resolution.
Although much research and development has been carried out in the field of computer vision, in real-world applications lane line identification remains difficult under complex conditions such as diverse environments and varying illumination. Maqueda et al. demonstrated experimentally that event data are robust for predicting the steering angle of a vehicle under challenging lighting conditions and fast motion; however, the model used in that method is a network trained on conventional images, and the asynchronous nature of the event stream data is not preserved. Wang proposed a localization detector based on an improved Gaussian blob tracking algorithm for tracking and localizing event targets, which can identify a reference forward direction, but recognition errors occur easily as the number of classified directions increases. Ramesh et al. proposed the DART descriptor of event information, but its design principle does not account for invariance to rotation, scale, and viewing angle. Li tracked event stream objects with a correlation filter and a convolutional neural network, but noise events have a large influence on the result.
Disclosure of Invention
The invention aims to provide, against the shortcomings of existing methods, a lane line identification method for address event data streams. It achieves lane line identification under the influence of high-speed driving, bright or dark environments, and similar conditions; at the same time, it effectively exploits the high temporal resolution, wide dynamic range, and other characteristics of event stream data, reduces the computation and storage space required for lane line data, and improves computational efficiency and the detection rate.
The method comprises the following specific steps:
(1) simulating and generating an address event data stream:
(1a) converting the lane line video shot by a traditional camera into event stream data by using an event simulator ESIM;
(1b) forming a lane line database of address events from the lane line event stream data files of four different scenes;
(2) filtering the event stream data:
(2a) defining an event as e_i = (x_i, y_i, t_i), and assuming that N events occur in total within the time period T, recorded as

$$E = \{\, e_i \mid i = 1, 2, \ldots, N \,\}$$

The N events comprise noise-induced events (invalid events, N_noise) and actually occurring events (valid events, N_valid), written as the mathematical expression

$$N = N_{valid} + N_{noise}$$
(2b) within a fixed time interval T of 20 milliseconds, calculating the density value ρ of each event in the current spatio-temporal domain using the event stream density formula:

$$\rho = \iiint_{U(x,y,t)} \sum_{i=1}^{n} \varepsilon(x - x_i,\; y - y_i,\; t - t_i)\, dx\, dy\, dt$$

where ρ represents the density value of the event; U(x, y, t) is the spatio-temporal neighborhood of the event (x, y, t); i indexes the events in that neighborhood, i = 1, 2, …, n; x_i is the abscissa of the i-th event of the address event stream in the three-dimensional coordinate system, y_i is the corresponding ordinate, and t_i is the corresponding timestamp; ∫ denotes the integration operation and ε(x, y, t) denotes the step function.
(2c) in the spatio-temporal neighborhood U(x, y, t), if the density ρ of the current event is greater than 50, retaining the event data; otherwise discarding it as an invalid event;
(3) obtaining an event frame by LIF (Leaky Integrate-and-Fire) coding:
(3a) treating each image pixel (x, y) as a neuron having a membrane potential and a trigger counter n;
(3b) each input event causes a step increase in the membrane potential (MP) at the pixel (x, y), while the MP value also decays at a fixed rate; once the MP value of a pixel exceeds the set threshold, it is reset to 0;
(3c) counting the number of times n that the membrane potential MP of each pixel exceeds the threshold within 20 milliseconds, and mapping it to a pixel value in 0-255 with the following normalization formula:

$$\sigma(n) = 255 \times \frac{n - n_{\min}}{n_{\max} - n_{\min}}$$

where σ(n) is the pixel value of the event frame, n is the total number of events that occurred at pixel (x, y) within 20 milliseconds, and n_min and n_max are the smallest and largest counts over all pixels of the frame.
(3d) binarizing the pixel value σ(n) by threshold segmentation: if the pixel value is greater than the set threshold, the pixel at the current position is set to 1; otherwise it is set to 0;
(4) performing a perspective transformation on the preprocessed lane lines to obtain a bird's-eye view:
the camera's shooting perspective is converted into a top-down view of the lane lines through the inverse perspective transformation matrix, thereby recovering the parallel relation of the two lane lines. The transformation from the camera coordinate system (x, y, z) to the two-dimensional image coordinate system (u, v) is:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \,[\, R \mid T \,] \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}$$

where Z_c is a scale factor; since the lanes are usually assumed to be parallel, Z_c is a fixed value;

$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

is the camera intrinsic parameter matrix; R is the rotation matrix; T is the translation vector.
(5) locating the starting search positions using the peaks of the statistical histogram:
(5a) splitting the perspective-transformed top-down image vertically into two halves, a left search area and a right search area;
(5b) computing, for the left and right areas of the bird's-eye view, the histogram of non-zero pixel counts along the vertical direction;
(5c) taking the peaks of the left and right areas as the search starting points of the left and right lane lines, respectively;
(6) locating feature points with a sliding window and fitting the lane lines by the random sample consensus (RANSAC) method:
(6a) taking the current search base point as the starting point and performing a gridded search centered on it; meanwhile, counting the non-zero pixels in each search box and discarding boxes whose non-zero pixel count is below a set threshold;
(6b) taking the mean of the non-zero pixel coordinates in each search box as a feature point;
(6c) randomly selecting several points and generating an initial curve according to the following cubic B-spline model:
$$P(t) = \frac{1}{6} \begin{bmatrix} t^3 & t^2 & t & 1 \end{bmatrix} \begin{bmatrix} -1 & 3 & -3 & 1 \\ 3 & -6 & 3 & 0 \\ -3 & 0 & 3 & 0 \\ 1 & 4 & 1 & 0 \end{bmatrix} \begin{bmatrix} q_{i-1} \\ q_i \\ q_{i+1} \\ q_{i+2} \end{bmatrix}$$

where q_i are the feature points controlling the curve, i = 1, 2, …, and the parameter t ranges over [0, 1].
(6d) calculating the relative distance between the remaining feature points and the initial curve; executing (6e) when the relative distance is smaller than the set threshold, otherwise returning to step (6c), until the maximum number of iterations is reached;
(6e) refining the cubic curve with the feature points admitted in the current iteration, and keeping the best-fitting curve found so far until the iteration limit;
(7) tracking the lane line and outputting:
(7a) judging whether the standard deviation between the two fitted cubic curves and those of the previous frame exceeds a set threshold; if it does not, outputting the average fitting data of the previous five frames as the fitting result of the current frame; if the standard deviation of the two fitted lane lines exceeds the threshold, executing step (4);
(7b) during normal driving, to account for possible lane changes, executing step (4) when the offset d is larger than the set threshold, otherwise continuing the tracking of step (7); the offset d during driving is calculated as:
$$d = \frac{a - b}{2} \times \frac{L_w}{w}$$

where a and b are the pixel distances of the left and right lane lines from the image center line, w is the pixel distance between the left and right lane lines, and L_w is the actual road width, 3.7 meters.
(7c) outputting the lane line identification map.
Compared with the prior art, the invention has the following advantages:
(1) To counter the random noise generated by the hardware circuit and the environment in the AER sensing mode, the event stream sequence is preprocessed directly by filtering and LIF coding, which filters out a large number of invalid events and improves the accuracy of lane line localization.
(2) The method adopts asynchronous event streams with high temporal resolution, avoiding the storage of the large amount of redundant information required by the prior art; this reduces memory consumption during lane line identification, improves recognition efficiency, and avoids information loss between adjacent frames.
(3) The lane line tracking strategy reduces the interference of abnormal frames and improves the real-time performance of the identification.
Drawings
FIG. 1 is a diagram comparing a transmission method of a conventional camera and an event camera;
FIG. 2 is an overall process flow diagram of the present invention;
FIG. 3 is an output diagram of lane line recognition in four scenes: a general scene, and scenes with text markings on the road surface, tree shadow interference, and insufficient illumination at night.
Detailed Description
The technical solutions in the embodiments of the present invention will be described in further detail below with reference to the drawings in the embodiments of the present invention, but the scope of the present invention is not limited to the embodiments described below.
The specific steps implemented by the present invention are further described with reference to fig. 2.
Step 1, simulating and generating the address event data stream: the lane line video shot by a conventional camera is converted into event stream data using the event simulator ESIM.
The lane line event stream data files under four different scenes form the lane line database of address events.
Step 2, an event is defined as e_i = (x_i, y_i, t_i), and N events are assumed to occur in total within the time period T, recorded as

$$E = \{\, e_i \mid i = 1, 2, \ldots, N \,\}$$

The N events comprise noise-induced events (invalid events, N_noise) and actually occurring events (valid events, N_valid), written as the mathematical expression

$$N = N_{valid} + N_{noise}$$
Within the fixed time interval T of 20 milliseconds, the density value ρ of each event in the current spatio-temporal domain is calculated using the event stream density formula:

$$\rho = \iiint_{U(x,y,t)} \sum_{i=1}^{n} \varepsilon(x - x_i,\; y - y_i,\; t - t_i)\, dx\, dy\, dt$$

where ρ represents the density value of the event; U(x, y, t) is the spatio-temporal neighborhood of the event (x, y, t); i indexes the events in that neighborhood, i = 1, 2, …, n; x_i is the abscissa of the i-th event of the address event stream in the three-dimensional coordinate system, y_i is the corresponding ordinate, and t_i is the corresponding timestamp; ∫ denotes the integration operation and ε(x, y, t) denotes the step function.
Within the spatio-temporal neighborhood U(x, y, t), if the density ρ of the current event is greater than 50, the event data are retained; otherwise the event is invalid and is discarded.
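The density filtering of step 2 can be sketched in a few lines of Python. This is a minimal illustration rather than the patent's implementation: events are assumed to arrive as (x, y, t) rows of a NumPy array with timestamps in microseconds, the neighborhood U is assumed to be a box whose half-extents (3 pixels, 10 ms) are illustrative choices, and the brute-force O(N²) neighbor count is kept for clarity.

```python
import numpy as np

def density_filter(events, dxy=3, dt=10_000, rho_min=50):
    """Keep events whose spatio-temporal neighborhood U(x, y, t)
    contains more than rho_min events; discard the rest as noise."""
    events = np.asarray(events, dtype=np.int64)  # rows of (x, y, t)
    keep = np.zeros(len(events), dtype=bool)
    for i, (x, y, t) in enumerate(events):
        near = ((np.abs(events[:, 0] - x) <= dxy)
                & (np.abs(events[:, 1] - y) <= dxy)
                & (np.abs(events[:, 2] - t) <= dt))
        keep[i] = near.sum() > rho_min           # density rho > 50
    return events[keep]
```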
Step 3, consider each image pixel (x, y) as a neuron with a membrane potential and a trigger counter n.
Each input event causes a step increase in the membrane potential MP at pixel (x, y), while the MP value also decays at a fixed rate; once the MP value of a pixel exceeds the set threshold, it is reset to 0.
The number of times n that the membrane potential MP of each pixel exceeds the threshold within 20 milliseconds is counted and mapped to a pixel value in 0-255 with the following normalization formula:

$$\sigma(n) = 255 \times \frac{n - n_{\min}}{n_{\max} - n_{\min}}$$

where σ(n) is the pixel value of the event frame, n is the total number of events that occurred at pixel (x, y) within 20 milliseconds, and n_min and n_max are the smallest and largest counts over all pixels of the frame.
The pixel value σ(n) is binarized by threshold segmentation: if it is greater than the set threshold, the pixel at the current position is set to 1; otherwise it is set to 0.
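A minimal Python sketch of the LIF coding of step 3, under stated assumptions: each event raises the membrane potential by one unit; the potential decays linearly between events (claim 3 suggests a decay rate of -0.8 and a firing threshold in (35, 50)); trigger counts are normalized following the formula above; and the sensor resolution and binarization threshold are illustrative values.

```python
import numpy as np

def lif_event_frame(events, shape=(180, 240), step=1.0,
                    decay=0.8, threshold=40, bin_thresh=128):
    """LIF-code one 20 ms window of (x, y, t) events into a binary frame."""
    mp = np.zeros(shape)                 # membrane potential per pixel
    n = np.zeros(shape, dtype=np.int64)  # trigger counter per pixel
    last = np.zeros(shape)               # timestamp of last update (us)
    for x, y, t in events:
        x, y = int(x), int(y)
        # leak: the potential decays at a fixed rate per millisecond
        mp[y, x] = max(0.0, mp[y, x] - decay * (t - last[y, x]) * 1e-3)
        last[y, x] = t
        mp[y, x] += step                 # step increase per input event
        if mp[y, x] > threshold:         # fire: count the trigger, reset MP
            n[y, x] += 1
            mp[y, x] = 0.0
    span = n.max() - n.min()
    if span == 0:
        sigma = np.zeros(shape, dtype=np.uint8)
    else:                                # map trigger counts to 0-255
        sigma = (255 * (n - n.min()) / span).astype(np.uint8)
    return (sigma > bin_thresh).astype(np.uint8)  # threshold segmentation
```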
Step 4, the perspective of the preprocessed lane lines is transformed to obtain the bird's-eye view. The camera's shooting perspective is converted into a top-down view of the lane lines through the inverse perspective transformation matrix, thereby recovering the parallel relation of the two lane lines. The transformation from the camera coordinate system (x, y, z) to the two-dimensional image coordinate system (u, v) is:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \,[\, R \mid T \,] \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}$$

where Z_c is a scale factor; since the lanes are usually assumed to be parallel, Z_c is a fixed value;

$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

is the camera intrinsic parameter matrix; R is the rotation matrix; T is the translation vector.
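In an implementation, the bird's-eye view of step 4 can be obtained with OpenCV's perspective warp rather than by assembling K, R, and T explicitly: four points on the lane trapezoid are mapped to a rectangle. The pixel coordinates below are illustrative assumptions for a 240x180 event frame, not values from the patent.

```python
import cv2
import numpy as np

event_frame = np.zeros((180, 240), dtype=np.uint8)  # binarized event frame

# Assumed corners of the lane trapezoid in the camera view (src) and
# their rectangular targets in the top-down view (dst); illustrative only.
src = np.float32([[80, 110], [160, 110], [230, 175], [10, 175]])
dst = np.float32([[60, 0], [180, 0], [180, 180], [60, 180]])

M = cv2.getPerspectiveTransform(src, dst)          # perspective matrix
birdseye = cv2.warpPerspective(event_frame, M, (240, 180))
```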
Step 5, the starting search positions are located using the peaks of the statistical histogram. The perspective-transformed top-down image is split vertically into left and right search areas.
For each of the two areas of the bird's-eye view, the histogram of non-zero pixel counts along the vertical direction is computed; the peaks of the left and right areas serve as the search starting points of the left and right lane lines, respectively.
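Step 5 reduces to a column histogram and two argmax operations; a minimal sketch, assuming the binarized bird's-eye view produced in step 4:

```python
import numpy as np

def search_start_points(birdseye):
    """Return the left/right lane search start columns: the peak of the
    non-zero-pixel histogram in each vertical half of the view."""
    hist = np.count_nonzero(birdseye, axis=0)  # non-zero pixels per column
    mid = hist.shape[0] // 2
    left_base = int(np.argmax(hist[:mid]))
    right_base = mid + int(np.argmax(hist[mid:]))
    return left_base, right_base
```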
Step 6, feature points are located with a sliding window, and the lane lines are fitted using the random sample consensus method.
The current search base point is taken as the starting point and a gridded search is performed centered on it; meanwhile, the non-zero pixels in each search box are counted, and boxes whose non-zero pixel count is below a set threshold are discarded.
The mean of the non-zero pixel coordinates in each search box is taken as a feature point. Several points are then selected randomly and an initial curve is generated according to the following cubic B-spline model.
$$P(t) = \frac{1}{6} \begin{bmatrix} t^3 & t^2 & t & 1 \end{bmatrix} \begin{bmatrix} -1 & 3 & -3 & 1 \\ 3 & -6 & 3 & 0 \\ -3 & 0 & 3 & 0 \\ 1 & 4 & 1 & 0 \end{bmatrix} \begin{bmatrix} q_{i-1} \\ q_i \\ q_{i+1} \\ q_{i+2} \end{bmatrix}$$

where q_i are the feature points controlling the curve, i = 1, 2, …, and the parameter t ranges over [0, 1].
The relative distance between the remaining feature points and the initial curve is calculated; feature points whose relative distance is smaller than the set threshold are used to refine the cubic curve, otherwise a new initial curve is generated, until the maximum number of iterations is reached. The best-fitting curve found so far is kept until the iteration limit.
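A sketch of step 6 under stated assumptions: the window count, margin, and pixel thresholds are illustrative values, and a cubic polynomial x = f(y) stands in for the cubic B-spline so that the RANSAC loop stays compact.

```python
import numpy as np

def window_feature_points(birdseye, base, n_windows=9, margin=25, min_px=30):
    """Slide a window up from the base column; the mean of the non-zero
    pixel coordinates in each sufficiently dense window is a feature point."""
    h, w = birdseye.shape
    win_h = h // n_windows
    x_c, pts = base, []
    for k in range(n_windows):
        y0, y1 = h - (k + 1) * win_h, h - k * win_h
        x0, x1 = max(0, x_c - margin), min(w, x_c + margin)
        ys, xs = np.nonzero(birdseye[y0:y1, x0:x1])
        if len(xs) < min_px:               # reject sparse search boxes
            continue
        pts.append((xs.mean() + x0, ys.mean() + y0))
        x_c = int(xs.mean() + x0)          # recenter the next window
    return np.asarray(pts)

def ransac_cubic(pts, n_iter=50, tol=8.0, seed=0):
    """RANSAC fit of x = f(y); a cubic polynomial stands in for the B-spline."""
    rng = np.random.default_rng(seed)
    best, best_count = None, 0
    for _ in range(n_iter):
        idx = rng.choice(len(pts), 4, replace=False)   # minimal sample
        coef = np.polyfit(pts[idx, 1], pts[idx, 0], 3)
        dist = np.abs(np.polyval(coef, pts[:, 1]) - pts[:, 0])
        inliers = dist < tol
        if inliers.sum() > best_count:                 # refit on the inliers
            best_count = inliers.sum()
            best = np.polyfit(pts[inliers, 1], pts[inliers, 0], 3)
    return best
```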
Step 7, whether the standard deviation between the two fitted cubic curves and those of the previous frame exceeds a set threshold is judged; if it does not, the average fitting data of the previous five frames are output as the fitting result of the current frame; if the standard deviation of the two fitted lane lines exceeds the threshold, step 4 is executed.
During normal driving, to account for possible lane changes, step 4 is executed when the offset d is larger than the set threshold; otherwise the tracking of step 7 continues. The offset d during driving is calculated as follows.
$$d = \frac{a - b}{2} \times \frac{L_w}{w}$$

where a and b are the pixel distances of the left and right lane lines from the image center line, w is the pixel distance between the left and right lane lines, and L_w is the actual road width, 3.7 meters.
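A minimal sketch of the offset computation of step 7, using the formula above and assuming a and b are measured in pixels from the image center line:

```python
def lane_offset(a, b, w, lane_width_m=3.7):
    """Vehicle offset d from the lane center in meters: the pixel offset
    (a - b) / 2 scaled by the meters-per-pixel ratio L_w / w."""
    return (a - b) / 2.0 * (lane_width_m / w)

# e.g. left lane 150 px and right lane 130 px from the center line,
# lanes 280 px apart: offset of roughly 0.13 m
print(lane_offset(150, 130, 280))
```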
The tested lane line event frames are input into the program to obtain the lane line identification picture.
The effect of the present invention is further described below with reference to the lane test charts.
The address event stream data of continuous lane lines are divided into several event stream sequences, and for each sequence the invalid events are removed using the event density values. The denoised event stream sequences are LIF-coded into event frames, whose perspective is then transformed to obtain the bird's-eye view of the lane lines. Next, the peaks of the statistical histogram locate the starting search positions of the feature points, and the lane lines are fitted by the random sample consensus method. Finally, the lane lines are tracked and output using the tracking strategy. Figs. 3(a) to 3(d) show the lane line identification output in four different common scenarios.

Claims (4)

1. A lane line identification method for address event data streams is characterized by specifically comprising the following steps:
(1) simulating and generating an address event data stream:
converting the lane line video shot by a traditional camera into event stream data by using an event simulator ESIM;
(2) filtering the event stream data:
within a fixed time interval T of 20 milliseconds, calculating the density value ρ of each event in the current spatio-temporal domain using the event stream density formula; if the density ρ of the current event is greater than 50, retaining the event data, otherwise discarding it as an invalid event;
(3) obtaining an event frame by LIF (Leaky Integrate-and-Fire) coding:
(3a) considering each image pixel (x, y) as a neuron with a membrane potential and a trigger counter n; each input event causes a step increase in the membrane potential MP at the pixel (x, y), while the MP value also decays at a fixed rate; once the MP value of a pixel exceeds the set threshold, it is reset to 0;
(3b) counting the number of times n that the membrane potential MP of each pixel exceeds the threshold within 20 milliseconds, and mapping it to a pixel value in 0-255 with a normalization formula;
(4) performing a perspective transformation on the preprocessed lane lines to obtain a bird's-eye view:
converting the camera's shooting perspective into a top-down view of the lane lines through the inverse perspective transformation matrix, thereby recovering the parallel relation of the two lane lines;
(5) locating the starting search positions using the peaks of the statistical histogram;
(6) locating feature points with a sliding window and fitting the lane lines by the random sample consensus method;
(7) tracking the lane line and outputting:
judging the standard deviation and the offset of the two fitted curves; if both are within the set thresholds, outputting the average fitting data of the previous five frames, otherwise executing step (4); and outputting the lane line identification map.
2. The lane line identification method for an address event data stream according to claim 1, wherein the filtering in step (2) removes invalid events using the event density value, the event density value being calculated as follows:

$$\rho = \iiint_{U(x,y,t)} \sum_{i=1}^{n} \varepsilon(x - x_i,\; y - y_i,\; t - t_i)\, dx\, dy\, dt$$

where ρ represents the density value of the event; U(x, y, t) is the spatio-temporal neighborhood of the event (x, y, t); i indexes the events in that neighborhood, i = 1, 2, …, n; x_i is the abscissa of the i-th event of the address event stream in the three-dimensional coordinate system, y_i is the corresponding ordinate, and t_i is the corresponding timestamp; ∫ denotes the integration operation and ε(x, y, t) denotes the step function.
3. The method according to claim 1, wherein the fixed decay rate of the MP value in step (3a) is -0.8, and the set threshold is an integer arbitrarily selected from the range (35, 50).
4. The lane line identification method for an address event data stream according to claim 1, wherein the normalization formula in step (3b) is as follows:

$$\sigma(n) = 255 \times \frac{n - n_{\min}}{n_{\max} - n_{\min}}$$

where σ(n) is the pixel value of the event frame, n is the total number of events that occurred at pixel (x, y) within 20 milliseconds, and n_min and n_max are the smallest and largest counts over all pixels of the frame.
CN202011572704.8A 2020-12-24 2020-12-24 Lane line identification method for address event data stream Pending CN112668466A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011572704.8A CN112668466A (en) 2020-12-24 2020-12-24 Lane line identification method for address event data stream

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011572704.8A CN112668466A (en) 2020-12-24 2020-12-24 Lane line identification method for address event data stream

Publications (1)

Publication Number Publication Date
CN112668466A (en) 2021-04-16

Family

ID=75410138

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011572704.8A Pending CN112668466A (en) 2020-12-24 2020-12-24 Lane line identification method for address event data stream

Country Status (1)

Country Link
CN (1) CN112668466A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114078100A (en) * 2021-11-25 2022-02-22 成都时识科技有限公司 Clustering noise reduction device, method, chip, event imaging device and electronic equipment
CN114550288A (en) * 2022-01-29 2022-05-27 清华大学 Event data based action identification method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102253013A (en) * 2011-04-29 2011-11-23 陈伟 Transmission method visibility detection device and method applied in field of transportation
US20160364621A1 (en) * 2015-06-11 2016-12-15 Garmin Switzerland Gmbh Navigation device with integrated camera
CN107481526A (en) * 2017-09-07 2017-12-15 公安部第三研究所 System and method for drive a vehicle lane change detection record and lane change violating the regulations report control
CN110443225A (en) * 2019-08-15 2019-11-12 安徽半问科技有限公司 Virtual and real lane line identification method and device based on feature pixel statistics
WO2020048027A1 (en) * 2018-09-06 2020-03-12 惠州市德赛西威汽车电子股份有限公司 Robust lane line detection method based on dynamic region of interest

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102253013A (en) * 2011-04-29 2011-11-23 陈伟 Transmission method visibility detection device and method applied in field of transportation
US20160364621A1 (en) * 2015-06-11 2016-12-15 Garmin Switzerland Gmbh Navigation device with integrated camera
CN107481526A (en) * 2017-09-07 2017-12-15 公安部第三研究所 System and method for drive a vehicle lane change detection record and lane change violating the regulations report control
WO2020048027A1 (en) * 2018-09-06 2020-03-12 惠州市德赛西威汽车电子股份有限公司 Robust lane line detection method based on dynamic region of interest
CN110443225A (en) * 2019-08-15 2019-11-12 安徽半问科技有限公司 Virtual and real lane line identification method and device based on feature pixel statistics

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
彭红; 肖进胜; 沈三明; 李必军; 程显: "A fast lane line recognition algorithm based on random sample consensus" (一种基于随机抽样一致性的车道线快速识别算法), Journal of Shanghai Jiao Tong University (上海交通大学学报), vol. 48, no. 12

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114078100A (en) * 2021-11-25 2022-02-22 成都时识科技有限公司 Clustering noise reduction device, method, chip, event imaging device and electronic equipment
CN114550288A (en) * 2022-01-29 2022-05-27 清华大学 Event data based action identification method and device

Similar Documents

Publication Publication Date Title
CN108182670B (en) Resolution enhancement method and system for event image
CN112800860B (en) High-speed object scattering detection method and system with coordination of event camera and visual camera
CN110244322A (en) Pavement construction robot environment sensory perceptual system and method based on Multiple Source Sensor
CN108416780B (en) Object detection and matching method based on twin-region-of-interest pooling model
CN113724297A (en) Event camera-based tracking method
CN112668466A (en) Lane line identification method for address event data stream
WO2019228450A1 (en) Image processing method, device, and equipment, and readable medium
CN112766046B (en) Target detection method and related device
CN111832461A (en) Non-motor vehicle riding personnel helmet wearing detection method based on video stream
CN112232356A (en) Event camera denoising method based on cluster degree and boundary characteristics
CN113034378B (en) Method for distinguishing electric automobile from fuel automobile
CN111160100A (en) Lightweight depth model aerial photography vehicle detection method based on sample generation
CN111985314B (en) Smoke detection method based on ViBe and improved LBP
CN112308087A (en) Integrated imaging identification system and method based on dynamic vision sensor
CN112907972B (en) Road vehicle flow detection method and system based on unmanned aerial vehicle and computer readable storage medium
CN110751667A (en) Method for detecting infrared dim small target under complex background based on human visual system
CN108932465B (en) Method and device for reducing false detection rate of face detection and electronic equipment
Zhang et al. Capitalizing on RGB-FIR hybrid imaging for road detection
CN117037085A (en) Vehicle identification and quantity statistics monitoring method based on improved YOLOv5
CN107122722A (en) A kind of self-adapting compressing track algorithm based on multiple features
CN116486352A (en) Lane line robust detection and extraction method based on road constraint
CN111127355A (en) Method for finely complementing defective light flow graph and application thereof
CN116403200A (en) License plate real-time identification system based on hardware acceleration
CN113936030B (en) Moving object detection method and system based on convolution coding
CN114612999A (en) Target behavior classification method, storage medium and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20240920