CN113160218A - Method for detecting object motion intensity based on event camera - Google Patents
- Publication number
- CN113160218A (application CN202110516952.9A)
- Authority
- CN
- China
- Prior art keywords
- optical flow
- motion
- speed
- event camera
- pixel
- Prior art date
- Legal status
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/20—Analysis of motion
- G06T7/207—Analysis of motion for motion estimation over a hierarchy of resolutions
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10052—Images from lightfield camera
Abstract
The invention provides a method for detecting object motion intensity based on an event camera, comprising the following steps: S1, collecting a noise baseline; S2, detecting object motion; S3, compiling optical-flow statistics; and S4, calculating motion intensity. The method detects the motion intensity of an object from the optical-flow data returned by a dynamic vision sensor and counts trigger points by probability-distribution statistics, without performing target detection or noise filtering, thereby addressing the difficulties of existing methods: laborious data collection, complex algorithm training, and impractical real-time detection. Because probability-distribution statistics are applied directly to the event camera's optical-flow data, no large-scale data collection, model training, or heavy computation is needed, so the algorithm is simple and well suited to embedded devices.
Description
Technical Field
The invention belongs to the field of computer vision, and particularly relates to a method for detecting the motion intensity of an object based on an event camera.
Background
With the development of artificial intelligence, image recognition is now widely applied across industries: detecting the liveliness of poultry in animal husbandry, monitoring abnormal object movement in security, and measuring how much cargo is jolted in transportation. To gauge an object's motion intensity, the object is typically first recognized in the image and then tracked while statistics are accumulated. The mainstream approach is deep learning based on convolutional neural networks (CNNs), popular for its adaptability, accuracy, and robustness to interference. It is, however, difficult to deploy: deep CNNs and real-time inference require high-performance CPUs and GPUs, so image data must often be sent to a server for processing. Moreover, millions of images must be collected and annotated, and the collection, labeling, and training are time- and labor-intensive. In short, CNN-based image recognition requires large-scale data collection, long model training, and heavy computation.
Disclosure of Invention
The invention provides a method for detecting object motion intensity based on an event camera, addressing the technical problems of difficult data collection, complex algorithm training, and impractical real-time detection.
The technical scheme of the invention is as follows:
A method for detecting object motion intensity based on an event camera, comprising the following steps. S1, collecting a noise baseline: aim the event camera at the target spatial region to be monitored and ensure that no moving object is present within it; then, under different illumination intensities, record the optical-flow data returned by the camera and compute a noise baseline for each illumination level. S2, detecting object motion: keep the region monitored by the event camera identical to the region used for the noise baseline in step S1, and collect the optical-flow data returned for the area where motion occurs. S3, compiling optical-flow statistics: perform probability-distribution statistics on the optical-flow data from step S2 over a chosen time window, obtaining the number of triggered optical-flow pixels and their corresponding velocities within that window. S4, calculating motion intensity: subtract the noise baseline for the corresponding illumination intensity from the accumulated optical-flow statistics to obtain the target's total optical flow, then divide by the number of moving objects to obtain the average motion intensity per object.
Preferably, in step S1 the event camera is first aimed at the target spatial region and it is verified that no moving object is present; the focal length is adjusted so that the number of pixels a target object occupies when imaged falls within a specified range. The camera is then started and data are collected continuously within the target region, recording how many times each pixel is triggered during the time window. Each trigger is accompanied by a velocity value generated by the event camera, namely the speed of motion at the location corresponding to that pixel.
Preferably, in step S2, when an object moves within the detection region, the event camera captures the corresponding motion area, and the pixels in that area are triggered. The more frequently an object moves, the more often its pixels are triggered; the more objects that move, the greater the number of triggered pixels. Each trigger likewise carries a velocity returned by the event camera, which yields the optical-flow data of the moving object.
Preferably, in the method for detecting object motion intensity, in step S3, the optical-flow data has three attributes: count, direction, and velocity.
Preferably, in step S3, the raw optical-flow data form a visual image whose underlying data structure is the triple [n, m, v], where n is the pixel index (the nth pixel), m is the number of times the nth pixel was triggered, and v is the velocity of motion in the region corresponding to that pixel at trigger time; v is a relative value in the range 1-256. The optical-flow data with velocity v = 1 are plotted as a probability histogram A, with the pixel index n on the horizontal axis and the trigger count m over the time window on the vertical axis; the noise baseline collected in step S1 for the same duration and velocity v = 1 is plotted the same way as histogram B. Subtracting the vertical-axis values of histogram B from those of histogram A gives the noise-subtracted trigger count of each pixel at velocity 1; proceeding likewise, the probability distribution for v = 2 through 256 is computed by the following formula:
This yields the probability distribution of each pixel's trigger count at every velocity v; summing over all velocities gives the velocity sum S:
Preferably, in the method for detecting object motion intensity, in step S4, the motion intensity J of the objects is calculated by the formula:
where S is the sum of all velocities while the objects move and N is the number of objects within the region.
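The three formula images referenced in the steps above (the noise subtraction, the velocity sum S, and the intensity J) are not reproduced in this text. A reconstruction consistent with the surrounding definitions, with $A_v(n)$ and $B_v(n)$ as assumed symbols for the trigger counts of histogram A and noise histogram B at pixel $n$ and velocity $v$, would be:

```latex
m_v(n) = A_v(n) - B_v(n), \qquad
S = \sum_{v=1}^{256} \sum_{n} v \, m_v(n), \qquad
J = \frac{S}{N}
```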
The beneficial effects of the technical scheme of the invention are as follows:
according to the method for detecting the object motion intensity based on the event camera, the object motion intensity is detected by using the optical flow data returned by the dynamic visual sensor, the number of trigger points is calculated by using a method of trigger point probability distribution statistics without performing target detection and noise filtering processing, and therefore the problems of difficult data acquisition, complex algorithm training and difficult real-time detection existing in the conventional method are solved; the method of the invention detects the motion intensity of the object by utilizing probability distribution statistics aiming at the optical flow data of the event camera, and compared with the existing algorithm, the method does not need large-scale data acquisition, model training and large amount of calculation, so that the algorithm calculation is simpler and is convenient to use on embedded equipment.
For a better understanding of the concepts, working principles, and effects of the invention, reference is made to the following embodiments, taken in conjunction with the accompanying drawings, in which:
drawings
In order to more clearly illustrate the embodiments of the invention and the technical solutions in the prior art, the drawings needed for their description are briefly introduced below.
FIG. 1 is a flow chart of a method of object motion intensity detection based on an event camera according to the present invention;
FIG. 2 is a schematic diagram of collecting the noise baseline;
FIG. 3 is a schematic illustration of object motion detection; and
FIG. 4 is a schematic illustration of optical flow statistical calculations.
Detailed Description
The method for detecting object motion intensity based on an event camera performs probability-distribution statistics on the optical-flow data returned by the camera to obtain, for a region, the triggered optical-flow pixels: their count and their velocities. From the count and velocity of the triggered flows, the degree of motion activity in the region is computed; accumulating statistics over a period of time yields the motion information for that period and hence the motion intensity of the objects within the region. The algorithm for processing the event camera's optical-flow data is simple: it runs on embedded devices with limited computing resources, achieves real-time detection without high-performance GPUs or CPUs, and thus overcomes the existing difficulties of data collection, algorithm training, and real-time detection.
The basic principle of the method is as follows: statistics based on probability distributions are computed directly over the event camera's optical-flow data, with no calculus involved, so no large-scale data collection or model training is needed and detection can run in real time.
The method for detecting the motion intensity of the object based on the event camera comprises the following steps (as shown in figure 1):
S1, collecting a noise baseline: aim the event camera at the target spatial region to be monitored and ensure that no moving object is present within it; then, under different illumination intensities, record the optical-flow data returned by the camera and compute a noise baseline for each illumination level.
Fig. 2 illustrates collection of the noise baseline. Even when nothing moves, noise causes the event camera to trigger some pixels, so noise must be collected to obtain an average noise level, the noise baseline. First, the event camera is aimed at the target spatial region (the detection region), ensuring that no moving object is present, and the focal length is adjusted so that a target object occupies a number of pixels within the specified range. The camera is then started, data are collected continuously, and the number of times each pixel is triggered during the time window is recorded. Each trigger carries a velocity value generated by the camera, namely the speed of motion at the location corresponding to that pixel. That is, the event camera returns (x, y, v), where (x, y) are coordinates and v is the velocity.
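The baseline collection described above can be sketched in code. This is an illustrative sketch, not the patent's implementation: the (x, y, v) event tuples follow the text, while the sensor resolution, the illumination labels, and the dictionary layout are assumptions.

```python
import numpy as np

# Hypothetical sketch of step S1 (noise-baseline collection).
WIDTH, HEIGHT = 8, 8  # sensor resolution (illustrative, not from the patent)

def noise_baseline(recordings):
    """Accumulate per-pixel trigger counts from no-motion recordings.

    `recordings` maps an illumination label to a list of (x, y, v) events
    captured with no moving object in view. Returns, per label, an
    (HEIGHT, WIDTH) array of trigger counts: the noise baseline for
    that illumination level.
    """
    baselines = {}
    for label, events in recordings.items():
        counts = np.zeros((HEIGHT, WIDTH), dtype=np.int64)
        for x, y, _v in events:
            counts[y, x] += 1  # one trigger of pixel (x, y)
        baselines[label] = counts
    return baselines

# Example: two spurious triggers of pixel (x=2, y=1) under bright light.
base = noise_baseline({"bright": [(2, 1, 3), (2, 1, 7)], "dim": []})
print(base["bright"][1, 2])  # -> 2
```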
S2, detecting object motion: keep the region monitored by the event camera identical to the region used when collecting the noise baseline in step S1, and collect the optical-flow data returned for the area where motion occurs. The optical-flow data are transferred from the event camera to a computer, or to other embedded equipment with a CPU, over a USB cable or another connection.
Fig. 3 illustrates object motion detection. The region monitored by the event camera (the detection region) must match the region used in S1. When an object moves within it, the camera captures the corresponding motion area; the pixels where motion occurs are triggered and appear white in the figure. The more frequently an object moves, the more often its pixels are triggered; the more objects that move, the greater the number of triggered pixels. As before, the camera returns a velocity v, yielding the optical-flow data of the moving object.
S3, optical-flow statistics: perform probability-distribution statistics on the optical-flow data from step S2 (each datum has three attributes: count, direction, and velocity) over a chosen time window, obtaining the number of triggered optical-flow pixels and their corresponding velocities within that window.
Fig. 4 illustrates the optical-flow statistics. Panel 1 (left) shows the raw optical-flow data as a visual image; the underlying data structure, shown in panel 2 (middle), is the triple [n, m, v]: n is the pixel index (the nth pixel), m is the number of times that pixel was triggered, and v is the velocity of motion in the region corresponding to the pixel at trigger time. The velocity v is a relative value in the range 1-256, interpreted against the chip clock: a different clock gives a different unit. If one clock tick is set to 1 ns, the range corresponds to 1-256 ns. The optical-flow data with velocity v = 1 are plotted as a probability histogram similar to panel 3 (right) of Fig. 4, called histogram A; the horizontal axis is the pixel index n and the vertical axis is the trigger count m within the time window (e.g. 10 ms, 50 ms, 100 ms, or 1 s). The noise baseline collected in step S1 for the same duration and velocity v = 1 is plotted the same way, giving histogram B. Subtracting the vertical-axis values of histogram B from those of histogram A gives the noise-subtracted trigger count of each pixel at velocity 1. Proceeding likewise, the probability distribution for v = 2 through 256 is computed by the following formula:
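The formula image is not reproduced in this text. A plausible reconstruction from the surrounding description, with $A_v(n)$ and $B_v(n)$ as assumed symbols for the vertical-axis values of histograms A and B at pixel $n$ and velocity $v$:

```latex
m_v(n) = A_v(n) - B_v(n), \qquad v = 1, 2, \ldots, 256
```

where $m_v(n)$ is the noise-subtracted number of times pixel $n$ is triggered at velocity $v$.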
in this way, a probability distribution of the number of times each pixel is triggered at a certain velocity v is obtained. Add all velocities to get the velocity sum S:
S4, calculating motion intensity: subtract the noise baseline for the corresponding illumination intensity from the accumulated optical-flow statistics to obtain the target's total optical flow, then divide by the number of moving objects to obtain the average motion intensity per object.
The motion intensity J of the objects is calculated by the formula:
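The formula image for J is not reproduced in this text; from the sentence that follows, it is evidently the velocity sum divided by the object count:

```latex
J = \frac{S}{N}
```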
where S is the sum of all velocities while the objects move and N is the number of objects within the region. This formula involves only pixel-wise accumulation, so the computational load is far smaller than that of a deep convolutional neural network and the method can run on embedded devices with limited resources. No dataset collection or model training is required, and noise reduction is a direct subtraction, simpler than noise-reduction algorithms such as Gaussian filtering that require gradient computation. The method therefore resolves the existing difficulties of data collection, algorithm training, and real-time detection.
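The statistics of steps S3 and S4 can be sketched in code. This is an illustrative reconstruction, not the patent's implementation: the grid sizes, the representation of events as (pixel-index, velocity) pairs, and the clipping of negative residuals to zero are assumptions.

```python
import numpy as np

# Illustrative sizes; the description uses relative velocities 1..256.
N_PIXELS = 16
N_VELOCITIES = 256

def trigger_histogram(events, n_pixels=N_PIXELS, n_velocities=N_VELOCITIES):
    """Count how many times each pixel fires at each velocity v.

    `events` is an iterable of (n, v) tuples: pixel index n (0-based)
    and relative velocity v in 1..256. Returns an (n_velocities, n_pixels)
    array whose row v-1 holds the per-pixel trigger counts at velocity v
    (histogram A in the text; run on a no-motion recording for histogram B).
    """
    hist = np.zeros((n_velocities, n_pixels), dtype=np.int64)
    for n, v in events:
        hist[v - 1, n] += 1
    return hist

def motion_intensity(motion_events, noise_events, n_objects):
    """Noise-subtract the histograms, sum the flow, average per object (S4)."""
    a = trigger_histogram(motion_events)
    b = trigger_histogram(noise_events)
    # Per-pixel, per-velocity counts with the noise baseline removed;
    # clipped at zero since trigger counts cannot be negative.
    m = np.clip(a - b, 0, None)
    # Velocity sum S: weight each residual trigger by its velocity v.
    v_values = np.arange(1, N_VELOCITIES + 1).reshape(-1, 1)
    s = int((m * v_values).sum())
    return s / n_objects  # motion intensity J = S / N

# Toy data: pixel 3 fires twice at v=5 during motion, once in the noise pass;
# pixel 7 fires once at v=2. Residual flow = 5 + 2 = 7, one object -> J = 7.0.
print(motion_intensity([(3, 5), (3, 5), (7, 2)], [(3, 5)], n_objects=1))
```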
The foregoing describes preferred embodiments of the concepts and working principles of the invention. The embodiments above should not be construed as limiting the scope of the claims; other embodiments and combinations of implementations within the inventive concept fall within the scope of the invention.
Claims (6)
1. A method for detecting object motion intensity based on an event camera is characterized by comprising the following steps:
S1, collecting a noise baseline: aiming the event camera at a target spatial region to be monitored, ensuring that no moving object is present within the region, then recording the optical-flow data returned by the event camera under different illumination intensities, and computing a noise baseline for each illumination level;
S2, detecting object motion: keeping the region monitored by the event camera identical to the region used for the noise baseline in step S1, and collecting the optical-flow data returned for the area where motion occurs;
S3, optical-flow statistics: performing probability-distribution statistics on the optical-flow data from step S2 over a chosen time window to obtain the number of triggered optical-flow pixels and their corresponding velocities within that window; and
S4, calculating motion intensity: subtracting the noise baseline for the corresponding illumination intensity from the accumulated optical-flow statistics to obtain the target's total optical flow, and dividing by the number of moving objects to obtain the average motion intensity per object.
2. The method for detecting object motion intensity according to claim 1, wherein
in step S1, the event camera is first aimed at the target spatial region to be monitored, ensuring that no moving object is present; the focal length is adjusted so that the number of pixels a target object occupies when imaged falls within a specified range; the event camera is then started, data are collected continuously within the target spatial region, and the number of times each pixel is triggered during the time window is recorded; each trigger is accompanied by a velocity value generated by the event camera, namely the speed of motion at the location corresponding to that pixel.
3. The method for detecting object motion intensity according to claim 1, wherein
in step S2, when an object moves within the detection region, the event camera captures the corresponding motion area and the pixels in that area are triggered; the more frequently an object moves, the more often its pixels are triggered, and the more objects that move, the greater the number of triggered pixels; each trigger likewise carries a velocity returned by the event camera, which yields the optical-flow data of the moving object.
4. The method for detecting object motion intensity according to claim 1, wherein in step S3, the optical-flow data has three attributes: count, direction, and velocity.
5. The method for detecting object motion intensity according to claim 1, wherein
in step S3, the raw optical-flow data form a visual image whose underlying data structure is the triple [n, m, v], where n is the pixel index, m is the number of times the nth pixel was triggered, and v is the velocity of motion in the region corresponding to the pixel at trigger time, a relative value in the range 1-256; the optical-flow data with velocity v = 1 are plotted as a probability histogram A, with the pixel index n on the horizontal axis and the trigger count m over the time window on the vertical axis, and the noise baseline collected in step S1 for the same duration and velocity v = 1 is plotted the same way as histogram B; the vertical-axis values of histogram B are subtracted from those of histogram A to obtain the noise-subtracted trigger count of each pixel at velocity 1, and likewise the probability distribution for v = 2 through 256 is computed by the following formula:
obtaining the probability distribution of each pixel's trigger count at every velocity v, and summing over all velocities to obtain the velocity sum S:
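The two formula images referenced in this claim are not reproduced in this text. A reconstruction consistent with the claim language, with $A_v(n)$ and $B_v(n)$ as assumed symbols for the trigger counts of histograms A and B at pixel $n$ and velocity $v$:

```latex
m_v(n) = A_v(n) - B_v(n), \qquad
S = \sum_{v=1}^{256} \sum_{n} v \, m_v(n)
```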
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110516952.9A CN113160218B (en) | 2021-05-12 | 2021-05-12 | Method for detecting object motion intensity based on event camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113160218A true CN113160218A (en) | 2021-07-23 |
CN113160218B CN113160218B (en) | 2023-06-20 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116703968A (en) * | 2023-04-20 | 2023-09-05 | 北京百度网讯科技有限公司 | Visual tracking method, device, system, equipment and medium for target object |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109903309A (en) * | 2019-01-07 | 2019-06-18 | 山东笛卡尔智能科技有限公司 | A kind of robot motion's information estimating method based on angle optical flow method |
CN110390685A (en) * | 2019-07-24 | 2019-10-29 | 中国人民解放军国防科技大学 | Feature point tracking method based on event camera |
CN111582300A (en) * | 2020-03-20 | 2020-08-25 | 北京航空航天大学 | High-dynamic target detection method based on event camera |
CN111931752A (en) * | 2020-10-13 | 2020-11-13 | 中航金城无人系统有限公司 | Dynamic target detection method based on event camera |
CN112514373A (en) * | 2018-08-14 | 2021-03-16 | 华为技术有限公司 | Image processing apparatus and method for feature extraction |
CN112686935A (en) * | 2021-01-12 | 2021-04-20 | 武汉大学 | Airborne depth sounding radar and multispectral satellite image registration method based on feature fusion |
Non-Patent Citations (2)
Title |
---|
KONG Delei: "A survey of event-based vision sensors and their applications", Information and Control *
ZHAO Kai: "Research on motion segmentation algorithms based on event cameras", Information Science and Technology Series *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||