CN114341650A - Event detection method and device, movable platform and computer readable storage medium - Google Patents


Info

Publication number: CN114341650A
Application number: CN202080059855.5A
Authority: CN (China)
Prior art keywords: image, event, images, frames, processing
Legal status: Pending (an assumption, not a legal conclusion)
Application number: CN202080059855.5A
Other languages: Chinese (zh)
Inventors: 黄潇 (Huang Xiao), 洪小平 (Hong Xiaoping)
Current Assignee: SZ DJI Technology Co Ltd; Southwest University of Science and Technology
Original Assignee: SZ DJI Technology Co Ltd; Southwest University of Science and Technology
Application filed by SZ DJI Technology Co Ltd and Southwest University of Science and Technology
Publication of CN114341650A

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01P — MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P13/00 — Indicating or recording presence, absence, or direction, of movement

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)

Abstract

The method acquires at least two frames of images captured by an industrial camera and performs image processing on them to obtain a processing result, where the processing result represents the brightness change of pixel points in the images; it then performs rendering according to the processing result to obtain an event image, where the event image represents a brightness change event, and finally outputs the event image. Event detection can therefore be performed directly on images captured by a conventional industrial camera, without requiring a dedicated event camera, which saves detection cost to a certain extent.

Description

Event detection method and device, movable platform and computer readable storage medium Technical Field
The invention belongs to the technical field of machine vision, and particularly relates to an event detection method, an event detection device, a movable platform and a computer-readable storage medium.
Background
In machine vision tasks such as high-speed sensing of moving objects, high dynamic range scene sensing, real-time simultaneous localization and mapping (SLAM), and recognition and tracking of moving objects, it is often necessary to detect event changes in the external environment, that is, to detect events in the external environment that cause brightness changes.
In the prior art, an event camera is often installed, and event detection is realized through the event camera. Specifically, an event camera is a machine vision image sensor that, through a special internal pixel circuit, directly outputs a light intensity change value when it senses a change of light intensity in the external environment, thereby realizing event detection. However, the pixel circuit in an event camera has a complex structure and is difficult to manufacture, so its cost is often high. This results in a high detection cost when an event camera is used for detection.
Disclosure of Invention
The invention provides an event detection method, an event detection device, a movable platform and a computer readable storage medium, which are used for solving the problem of higher detection cost caused by using an event camera for detection.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an event detection method, where the method includes:
acquiring at least two frames of images acquired by an industrial camera;
performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image;
rendering according to the processing result to obtain an event image; the event image is used for representing a brightness change event;
and outputting the event image.
In a second aspect, an embodiment of the present invention provides an event detection apparatus, which includes a memory and a processor;
the memory for storing program code;
the processor, invoking the program code, when executed, is configured to:
acquiring at least two frames of images acquired by an industrial camera;
performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image;
rendering according to the processing result to obtain an event image; the event image is used for representing a brightness change event;
and outputting the event image.
In a third aspect, an embodiment of the present invention provides a movable platform, where the movable platform includes an industrial camera and the event detection device described above; the event detection device is used for executing the following operations:
acquiring at least two frames of images acquired by an industrial camera;
performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image;
rendering according to the processing result to obtain an event image; the event image is used for representing a brightness change event;
and outputting the event image.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the following operations:
acquiring at least two frames of images acquired by an industrial camera;
performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image;
rendering according to the processing result to obtain an event image; the event image is used for representing a brightness change event;
and outputting the event image.
In the embodiment of the invention, at least two frames of images acquired by an industrial camera can be acquired, image processing is carried out according to the at least two frames of images to obtain a processing result, wherein the processing result is used for representing the brightness change condition of a pixel point in the image, then rendering is carried out according to the processing result to obtain an event image, the event image is used for representing a brightness change event, and finally the event image is output. Therefore, event detection can be realized directly according to images collected by a conventional industrial camera without specially setting an event camera, and detection cost can be saved to a certain extent.
Drawings
FIG. 1 is a flowchart illustrating steps of a method for event detection according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating comparison of outputs provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of an output provided by an embodiment of the present invention;
FIG. 4 is a schematic processing flow diagram of an embodiment of the present invention;
FIG. 5 is a block diagram of an event detection apparatus according to an embodiment of the present invention;
FIG. 6 is a diagram of a hardware configuration of an apparatus for implementing various embodiments of the invention;
FIG. 7 is a block diagram of a computing processing device provided by an embodiment of the invention;
fig. 8 is a block diagram of a portable or fixed storage unit according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the scope of protection of the present invention.
Fig. 1 is a flowchart illustrating steps of an event detection method according to an embodiment of the present invention, where as shown in fig. 1, the method may include:
Step 101, acquiring at least two frames of images acquired by an industrial camera.
The event detection method provided in the embodiment of the present invention may be performed by an event detection device, which may be, for example, an unmanned aerial vehicle, an autonomous vehicle, an automated guided vehicle, a mobile robot, or the like.
An industrial camera may shoot continuously to acquire an image frame data stream. The at least two frames of images may be images contained in the image frame data stream continuously captured by the industrial camera, and their respective capture times may differ. The industrial camera may be mounted on the event detection device; in this case, the images captured by the industrial camera can be read directly. The industrial camera may also be provided independently of the event detection device; in this case, the images transmitted by the industrial camera can be received through a wireless transmission technique. Further, the industrial camera may employ a general-purpose image sensor chip; for example, it may be a camera based on a Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) chip. Such a camera needs no special pixel circuit, that is, no special design or process at the sensor chip level; its structure is simple and its manufacturing process is mature, so its cost is lower. Thus, performing event detection with images captured by an industrial camera can save detection cost to a certain extent.
Step 102, performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image.
Since the event to be detected is an event that causes a change in light intensity, a change in brightness tends to cause a change in light intensity. Meanwhile, each frame of image contains the brightness condition when the frame of image is shot, so the embodiment of the invention can firstly carry out image processing through at least two frames of images to obtain the processing result capable of representing the brightness change condition of the pixel points in the image. The processing result may represent a luminance change condition of a pixel point of one frame image relative to the other frame images in the at least two frame images. For example, the at least two frames of images may be two frames of images, and the processing result may represent a luminance change condition of a pixel point of a previous frame of image relative to a pixel point of a next frame of image in the two frames of images. The processing result may also represent a luminance change condition of a pixel point of a multi-frame image relative to the other frame images in at least two frame images. For example, the at least two frames of images may be three frames of images, and the processing result may represent a brightness change condition of pixel points of the first two frames of images relative to the last frame of images in the three frames of images.
Step 103, rendering according to the processing result to acquire an event image; the event image is used for representing a brightness change event.
In the embodiment of the invention, because the processing result represents the brightness change of individual pixel points and the number of pixel points is often large, rendering can be performed according to the brightness changes of the plurality of pixel points so as to summarize them, making the overall brightness change between images visible. Since the brightness change occurring between images is often caused by a brightness change event, the brightness change event can be characterized by the event image obtained after rendering.
Step 104, outputting the event image.
In the embodiment of the invention, the output event image can be displayed directly, or the event image can be sent to a display terminal. Because the event image can represent the brightness change event, and an image is more intuitive and vivid than raw data, outputting the event in the form of an event image allows a user to understand the event more intuitively and conveniently to a certain extent, which further improves the user's viewing efficiency.
In summary, the event detection method provided in the embodiments of the present invention can obtain at least two frames of images acquired by an industrial camera, perform image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to represent a luminance change condition of a pixel point in an image, perform rendering according to the processing result to obtain an event image, where the event image is used to represent a luminance change event, and finally output the event image. Therefore, event detection can be realized directly according to images collected by a conventional industrial camera without specially setting an event camera, and detection cost can be saved to a certain extent.
Optionally, the above-mentioned operation of performing image processing according to at least two frames of images may be implemented by the following sub-steps (1) to (2):
substep (1): and respectively acquiring the value of each color channel of each image for each frame of image.
In the embodiment of the invention, the color spaces of the images are different, and the color channels contained in the images can be different. For example, if the image is an image in a Red-Green-Blue (RGB) color space, R, G, B three color channels may be included in the image, and accordingly, in this step, channel values of R, G, B color channels in the image may be extracted. If the image is an image of Red-Yellow-Blue (RYB) color space, R, Y, B color channels can be included in the image, and accordingly, channel values of R, Y, B color channels in the image can be extracted in this step. Specifically, the color space to which the image belongs may be an arrangement manner of color pixel points adopted by a photosensitive element in the industrial camera. For example, in the case of the RGGB arrangement, an RGB image can be obtained, and in the case of the RYYB arrangement, an RYB image can be obtained.
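As a rough illustration of sub-step (1) — not the patent's own implementation — per-channel extraction for an RGB image can be sketched in Python/NumPy; the function name `split_channels` and the channel ordering are assumptions:

```python
import numpy as np

def split_channels(image: np.ndarray) -> dict:
    """Split an H x W x 3 image into per-channel planes.

    Channel names are illustrative: for an RGB image they would be
    ("R", "G", "B"); for an RYB image, ("R", "Y", "B") instead.
    """
    names = ("R", "G", "B")
    return {name: image[:, :, i].astype(np.float64) for i, name in enumerate(names)}

# A tiny 1x2 "image": one pure-red pixel and one pure-blue pixel.
img = np.array([[[255, 0, 0], [0, 0, 255]]], dtype=np.uint8)
channels = split_channels(img)
```

Converting to float early avoids unsigned-integer wraparound in the later difference step.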
Substep (2): and respectively carrying out image difference processing according to the value of each color channel of the at least two frames of images.
In this step, the corresponding gray value may be determined according to the value of each color channel in the image, and then the difference processing may be performed according to the time sequence of the image and the gray value of the image on each color channel, so as to obtain the difference result corresponding to each color channel. Finally, the difference results may be summarized, for example, added to obtain a final difference result, and the final difference result is used as a processing result. Alternatively, the corresponding difference result on each color channel may also be directly used as the processing result, which is not limited in the embodiment of the present invention. Because the brightness change event can cause the change of the gray value, the difference processing is carried out according to the gray value in the embodiment of the invention, so that the finally obtained difference result can represent the brightness change event.
Specifically, when the image difference processing is performed, the difference between the gray value of the later image and the gray value of the earlier image may be calculated according to the image sequence. For example, assuming that the at least two frames of images are the 1st and 3rd frame images in the data stream acquired by the industrial camera, the difference may be obtained by subtracting the 1st frame image from the 3rd frame image. Or, assuming that the at least two frames of images are the 1st, 2nd, 3rd, and 4th frame images in the data stream, the 3rd and 4th frame images may first be added, and then the sum of the 1st and 2nd frame images subtracted; that is, the difference result represents the brightness change of the pixel points of the two later frames relative to the two earlier frames. Of course, other difference algorithms may also be used to implement the difference processing, which is not limited in the embodiment of the present invention.
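Both orderings described above — later frame minus earlier frame, and summed later frames minus summed earlier frames — can be sketched as follows. `frame_difference` is a hypothetical helper; signed floats are used so that negative differences (brightness drops) are preserved:

```python
import numpy as np

def frame_difference(later_frames, earlier_frames):
    """Subtract the summed earlier frames from the summed later frames.

    Follows the described ordering (later minus earlier); the signed
    float result keeps negative values, i.e. darkening pixels.
    """
    later = sum(f.astype(np.float64) for f in later_frames)
    earlier = sum(f.astype(np.float64) for f in earlier_frames)
    return later - earlier

f1 = np.full((2, 2), 10.0)
f2 = np.full((2, 2), 20.0)
f3 = np.full((2, 2), 40.0)
f4 = np.full((2, 2), 50.0)

# Two-frame case: frame 3 minus frame 1.
d2 = frame_difference([f3], [f1])
# Four-frame case: (frame 3 + frame 4) - (frame 1 + frame 2).
d4 = frame_difference([f3, f4], [f1, f2])
```

In the per-channel scheme of sub-steps (1)-(2), this difference would be computed once per color channel and the results then summarized.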
Further, consider the alternative in which the color channels are not processed separately: the multiple color channels are merged into one overall gray value, on which difference processing is performed directly. Compared with that method, the embodiment of the invention extracts the value of each color channel separately, performs difference processing per channel according to that channel's gray values, and finally summarizes the results. This avoids the channel coupling and increased difference error caused by merging multiple color channels into a single channel, and can therefore improve the accuracy of the difference processing.
It should be noted that, depending on the spectral response of the sensor in the camera and the characteristics of the filter array, the image captured by the industrial camera may also be a grayscale image. In this case, the difference processing can be performed directly according to the gray value of each pixel point in the image, so the value of each color channel does not need to be extracted separately; this simplifies the difference processing to a certain extent and improves its efficiency.
Optionally, the difference result may be a difference image, and the value of each pixel in the difference image may be a difference between gray values. Further, noise interference tends to exist in the circuits of the image sensor; for example, there may be time-varying noise such as shot noise and thermal noise, as well as fixed pattern noise. Noise may make the gray values inaccurate and introduce errors. In order to improve the accuracy of the difference result, in the embodiment of the present invention, noise filtering may be performed. In one implementation, the following step A may be performed after image processing is performed according to the at least two frames of images to obtain the processing result:
and A, carrying out noise filtering processing on the differential image.
Specifically, various filters may be introduced into the frequency domain or wavelet domain of the difference image, and noise filtering processing may be performed by the filters. In the embodiment of the invention, the processing result, namely the difference image, is processed after the image processing is finished, so that the noise interference contained in the processing result can be removed, and the accuracy of the difference result is further ensured.
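As one concrete, assumed choice of filter — the patent leaves the filter type open, and frequency-domain or wavelet-domain filters would also fit — a spatial median filter suppresses isolated noise spikes in the difference image while leaving flat regions untouched:

```python
import numpy as np
from scipy.ndimage import median_filter

# A 5x5 difference image that is flat except for one noisy spike.
diff = np.zeros((5, 5))
diff[2, 2] = 100.0  # isolated outlier: noise, not a real brightness event

# A 3x3 median filter replaces each pixel with the median of its
# neighborhood, so a single-pixel spike cannot survive.
filtered = median_filter(diff, size=3)
```

After filtering, the spike is gone and the flat background is unchanged, so only spatially coherent differences (plausible events) remain.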
In another implementation manner, before performing image processing according to at least two frames of images to obtain a processing result, noise filtering processing may be implemented through the following steps B to D:
b, acquiring m frames of images from the at least two frames of images; and m is a positive integer.
In the embodiment of the present invention, the specific value of m may be determined according to actual conditions. Optionally, the m-frame image may be an image participating in image difference processing in at least two frames of images. That is, the difference processing may be performed once every time m-frame images are acquired. Accordingly, in the embodiment of the present invention, before performing the differential processing each time, the noise filtering processing may be performed on the image of the m frames, that is, the noise filtering processing and the differential processing are performed alternately, so that while maintaining the event output, noise interference in the differential processing performed each time is reduced, and thus the effect of the differential processing is ensured.
And step C, carrying out weighted average processing on the m frames of images to collect noise information in the m frames of images.
In an actual scene, noise interference occurs, and noise information exists in every image. In this step, the noise information in the m frames of images is collected by performing weighted average processing on the m frames of images, so that the noise information can be conveniently processed in a concentrated manner in the subsequent steps. Specifically, when performing the weighted average processing, the preset weight corresponding to each frame of image may be determined according to a preset weight distribution rule. For example, weights may be assigned such that the lower the image quality, the greater the weight. Then, the weighted sum corresponding to each pixel point in the m frames of images can be calculated according to the preset weight corresponding to each frame of image and the pixel value of each pixel point.
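The weighted average of step C can be sketched as follows; the weight values are illustrative assumptions, since the patent only says they come from a preset distribution rule:

```python
import numpy as np

def weighted_average(frames, weights):
    """Per-pixel weighted average over m frames.

    Computes sum_i(w_i * frame_i) / sum_i(w_i), i.e. the weighted sum
    per pixel described in step C, normalised by the total weight.
    """
    weights = np.asarray(weights, dtype=np.float64)
    stack = np.stack([f.astype(np.float64) for f in frames])
    return np.tensordot(weights, stack, axes=1) / weights.sum()

# m = 3 frames; the last frame is given double weight (assumed rule).
frames = [np.full((2, 2), v) for v in (10.0, 20.0, 30.0)]
avg = weighted_average(frames, [1.0, 1.0, 2.0])
```

The resulting single image concentrates the noise information of all m frames, so step D needs only one round of noise filtering instead of m.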
And D, carrying out noise filtering processing on the noise information.
In the embodiment of the present invention, since the noise information has been collected into the weighted sums of the pixel points calculated in the above step, the noise filtering processing may be performed on the image formed by these weighted sums, thereby filtering the noise information. For the specific implementation of the noise filtering processing, refer to the implementation in step A; alternatively, spatial filtering may be performed using the temporal characteristics of the m frames of images, which is not limited in the embodiment of the present invention. In one implementation, each frame of image could be noise-filtered separately, but that would require m rounds of noise filtering to remove the noise interference. In the embodiment of the invention, the noise information in the multi-frame images is collected first and then filtered as a whole, so only one round of noise filtering is required, which can improve the noise filtering efficiency to a certain extent.
Optionally, the processing result may specifically include the difference image and the difference value corresponding to each pixel point in the difference image. Accordingly, in one implementation, the operation of rendering according to the processing result to obtain the event image can be realized through the following sub-steps (3) to (4):
substep (3): and determining the rendering color corresponding to the pixel point according to the symbol of the differential value corresponding to the pixel point in the differential image, and determining the color concentration of the rendering color according to the absolute value of the differential value.
In the embodiment of the present invention, when the difference value is a positive number, the sign of the difference value may be a positive sign, and when the difference value is a negative number, the sign of the difference value may be a negative sign. When the rendering color is determined according to the symbol of the differential value, the rendering color corresponding to the symbol of the differential value can be searched according to the preset corresponding relation between the symbol and the rendering color, and then the rendering color corresponding to the symbol is determined as the rendering color of the pixel point. The preset corresponding relationship between the symbol and the rendering color may be set according to actual requirements, for example, the rendering color corresponding to the positive sign may be set to be red, and the rendering color corresponding to the negative sign may be set to be green. Correspondingly, if the sign of the differential value corresponding to the pixel point is a positive sign, the rendering color corresponding to the pixel point can be determined to be red, and if the sign of the differential value corresponding to the pixel point is a negative sign, the rendering color corresponding to the pixel point can be determined to be green.
Further, the absolute value of the difference value may be positively correlated with the color density: when the color density is determined based on the absolute value of the difference value, the larger the absolute value, the higher the color density, and the smaller the absolute value, the lower the color density. For example, the absolute value of the difference value may be used as the input of a first preset function, and the output of the first preset function may be used as the color density, where the first preset function may be a function whose dependent variable is positively correlated with its independent variable. Alternatively, the absolute value of the difference value may be negatively correlated with the color density: the smaller the absolute value, the higher the color density, and the larger the absolute value, the lower the color density; the embodiment of the present invention does not limit this. Of course, in the case where the difference value is 0, the pixel point may be left unrendered, which saves processing resources to a certain extent.
Substep (4): and rendering the pixel points according to the rendering color and the color concentration.
In the embodiment of the invention, the pixel point can be rendered according to the rendering color and the color density, so that the rendered pixel point presents the rendering color, and the presented density matches the determined color density. The rendered difference image is the event image, which may be a matrix image. Because a nonzero difference value at a pixel point is often caused by a brightness change event, rendering each pixel point into the color corresponding to the sign of its difference value, at the density corresponding to its absolute value, makes the change direction of the event visible through the boundaries between colors in the rendered event image. For example, the direction in which the color boundary is sharpest in the event image may be taken as the event change direction. The start and end of the change direction can be inferred from the change of density at the boundary; for example, when the absolute value of the difference value is positively correlated with the color density, the end with the lower density along the sharpest boundary direction can be taken as the start, and the end with the higher density as the end. In this way, the brightness change event can be displayed clearly through the event image.
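Sub-steps (3)-(4) can be sketched as follows, using the example convention from the text (positive differences red, negative differences green) and a simple linear mapping from absolute value to intensity; the `scale` parameter and the linear mapping are assumptions, since the patent only requires a positively correlated function:

```python
import numpy as np

def render_event_image(diff, scale=1.0):
    """Map a signed difference image to an H x W x 3 RGB event image.

    Convention (from the text's example): positive difference -> red,
    negative difference -> green; |diff| controls the intensity.
    Pixels with zero difference are left unrendered (black).
    """
    h, w = diff.shape
    out = np.zeros((h, w, 3), dtype=np.uint8)
    intensity = np.clip(np.abs(diff) * scale, 0, 255).astype(np.uint8)
    out[..., 0] = np.where(diff > 0, intensity, 0)  # red channel: brightening
    out[..., 1] = np.where(diff < 0, intensity, 0)  # green channel: darkening
    return out

diff = np.array([[50.0, -30.0],
                 [0.0, 120.0]])
event = render_event_image(diff)
```

In the resulting image, the red/green boundary marks where brightness rose versus fell, which is what lets the event change direction be read off visually.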
In another implementation manner, the operation of rendering according to the processing result to obtain the event image may be implemented by the following sub-steps (5) to (6):
substep (5): and determining the rendering color corresponding to the pixel point according to the corresponding relation between the preset numerical value and the rendering color and the differential value corresponding to the pixel point in the differential image.
In the embodiment of the present invention, the corresponding relationship between the preset numerical value and the rendering color may be set according to actual requirements. For example, one numerical value may be set to correspond to one rendering color, and certainly, in order to avoid that the visual burden is too heavy due to too many colors, the same rendering color may also be set for numerical values within the same numerical value range, which is not limited in the embodiment of the present invention.
Specifically, the rendering color corresponding to the difference value may be searched in the corresponding relationship, and the corresponding rendering color is determined as the rendering color corresponding to the pixel point.
Substep (6): and rendering the pixel points according to the rendering colors.
In the embodiment of the invention, the pixel points can be rendered into their corresponding rendering colors at a default color density, which ensures uniform color density across pixel points and makes the image easier to view. Because a nonzero difference value at a pixel point is often caused by a brightness change event, rendering each pixel point into the color corresponding to its difference value makes the change direction of the event visible through the distribution of colors in the rendered image. For example, the direction in which the color distribution changes most regularly may be taken as the change direction of the event, with the less regular end of that direction taken as the start and the more regular end as the end. In this way, the brightness change event can be displayed clearly through the event image.
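Sub-step (5)'s lookup — the same rendering color for all difference values within one numerical range — can be sketched with a hypothetical palette table; the ranges and colors below are illustrative assumptions, not values from the patent:

```python
# Hypothetical range -> color table: difference values in the same
# numerical range share one rendering color, as the text suggests.
PALETTE = [
    (-float("inf"), -50.0, (0, 0, 255)),   # large drop  -> blue
    (-50.0, 0.0,           (0, 255, 0)),   # small drop  -> green
    (0.0, 50.0,            (255, 255, 0)), # small rise  -> yellow
    (50.0, float("inf"),   (255, 0, 0)),   # large rise  -> red
]

def lookup_color(value):
    """Return the rendering color whose range contains the value."""
    for lo, hi, color in PALETTE:
        if lo <= value < hi:
            return color
    return (0, 0, 0)  # fallback: leave the pixel unrendered
```

Grouping values into ranges keeps the number of distinct colors small, which is exactly the "avoid too heavy a visual burden" point made in sub-step (5).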
It should be noted that, after the event image is generated, the event image may be output through a preset data interface. Wherein, the image frame rate of the output event image may not be higher than the frame rate of the image frame data stream collected by the industrial camera.
Optionally, the brightness change event in the embodiment of the present invention may be an object motion, and accordingly, after rendering is performed according to a processing result to obtain an event image, the following steps E to G may be further performed to implement obstacle avoidance according to an event detection result:
Step E: acquiring a High Dynamic Range (HDR) image according to the at least two frames of images, and determining the motion direction of the object according to the event image.
In the embodiment of the present invention, when determining the motion direction of the object according to the event image, the event change direction may be read from the event image and determined as the motion direction of the object. For the implementation of determining the event change direction, reference may be made to the description in sub-step (4) or sub-step (6), which is not repeated here.
Further, when acquiring the HDR image from the at least two frames of images, the following operations may be performed: calculating, for each pixel point, the sum of its gray values across the at least two frames of images; and normalizing the sums of gray values to obtain the HDR image.
Specifically, the sum of gray values may be calculated separately on each color channel after reading in the values of the color channels of the image. This avoids the channel coupling and increased calculation error that arise when multiple color channels are merged into one channel for processing, and therefore improves calculation accuracy. For the specific manner of reading in each color channel, reference may be made to the description in sub-step (1) to sub-step (2); the color channel values already read in when executing sub-step (1) to sub-step (2) may of course be reused directly in this step. When the image collected by the industrial camera is itself a grayscale image, the summation may be performed directly on the gray value of each pixel point, which improves calculation efficiency.
When calculating the sum of gray values, an integration operation may be performed over successive frames of the at least two frames of images in their temporal order, that is, the gray values of corresponding pixel points in the successive frames are added up. Specifically, the sum of gray values on each color channel may be calculated first, and the sums over the color channels then combined into the sum of gray values of the pixel point.
Since the sum of gray values after adding multiple frames of images may exceed the bit depth and display range adopted by common display devices, in the embodiment of the invention the sums of gray values may be normalized, so that the HDR image can be displayed normally on a common display device, which broadens the application range of the HDR image. For example, a common display device may adopt a bit depth of 8 bits, and accordingly the sums of gray values may be normalized into the 8-bit range of 0 to 255. Further, the bit depth of the image may be increased according to the number of frames participating in the calculation, to ensure that the calculated sums have sufficient bit depth. For example, when 4 frames participate in the calculation, the bit depth may be increased to 10 bits: integrating 4 frames of an 8-bit image with gray values in the range 0 to 255 yields a 10-bit image in the range 0 to 1023, and increasing the bit depth to 10 bits ensures the usage requirement is met.
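The inter-frame integration and normalization described above might be sketched as follows. The function name, the `out_bits` parameter, and the peak-based normalization are illustrative assumptions:

```python
import numpy as np

def hdr_from_frames(frames, out_bits=8):
    """Sum the gray values of several frames and normalize the result.

    `frames` is a list of 2-D uint8 arrays from the camera. Summing in
    a wider integer type keeps the extra bit depth (four 8-bit frames
    need 10 bits); normalization then maps the sums into the chosen
    output range so ordinary displays can show the image.
    """
    acc = np.zeros(frames[0].shape, dtype=np.uint32)
    for f in frames:
        acc += f  # integration over successive frames
    max_out = (1 << out_bits) - 1
    peak = int(acc.max()) or 1  # guard against an all-black stack
    # normalize the sums of gray values into [0, max_out]
    return (acc.astype(np.float64) / peak * max_out).astype(np.uint16)
```

With `out_bits=8` the result fits the 0-255 range of common displays; with `out_bits=10` the result keeps the full 0-1023 range produced by integrating four 8-bit frames.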
In practice, the noise of an image fluctuates randomly, while weak signals in the image, such as its darker parts, are relatively stable. Therefore, when multiple frames are added, the noise intensity grows more slowly than the weak signal, so the darker parts of the image can be lifted out of the noise, the signal-to-noise ratio in dark areas is improved, and more imaging detail is provided. Further, when a high-brightness region and a low-brightness region exist in the imaging scene at the same time, the random fluctuation of noise allows enhanced perception of the low-brightness region to a certain extent while keeping the high-brightness region from overexposure, thereby realizing HDR imaging, that is, obtaining an HDR image. When the industrial camera used is a high-speed industrial camera, its exposure time is shorter and fewer photons are collected per unit time, so the summing calculation can also improve the image signal-to-noise ratio to a certain extent.
It should be noted that the generated HDR image may be output through a preset data interface for use in subsequent steps. The output image may be a color image, which helps in judging the material and texture information of the object and further improves subsequent image recognition efficiency. Further, since some noise interference may exist in the HDR image, noise filtering may be performed on the HDR image to improve its quality. For the specific noise filtering method, reference may be made to the noise filtering performed in the related steps above; for example, spatial filtering, frequency-domain filtering, or wavelet-domain filtering may be adopted.
Step F: identifying the position information of the object in the external environment according to the HDR image.
In the embodiment of the present invention, the position information of the object in the external environment may be the coordinates of the object in the external environment, or the relative positional relationship, relative distance, and so on between the object and other objects in the external environment. Specifically, the HDR image may be recognized using a preset image recognition algorithm, for example by performing operations such as target object recognition and semantic segmentation of the image, to determine the position information. It should be noted that, to improve recognition accuracy, the features of the recognized object may be compared with the features of the event in the event image; if the two match, the object is determined to be a moving object, and its position information is then determined further.
Step G: performing obstacle avoidance on the object according to the motion direction and the position information.
In the embodiment of the invention, the movement of the object can be estimated from the motion direction and the position information, and an obstacle avoidance strategy determined accordingly. For example, assuming the motion direction is horizontal from left to right and the object is currently in the middle of the viewing range, the object will subsequently move to the far right of the viewing range, so the device may be controlled to move away from the right side to avoid the object.
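The left-to-right example above can be sketched as a small decision rule. The function name, the direction labels, and the simple left/right policy are illustrative assumptions; a real planner would also fuse distance and velocity:

```python
def avoidance_command(motion_direction, position_x, frame_width):
    """Pick a lateral avoidance command from the object's motion
    direction and its current horizontal position in the view.

    Returns the side the platform should move toward.
    """
    if motion_direction == "left_to_right":
        # the object will end up on the right: keep away from that side
        return "move_left"
    if motion_direction == "right_to_left":
        return "move_right"
    # stationary or unknown direction: steer away from the half of the
    # view the object currently occupies
    return "move_left" if position_x > frame_width / 2 else "move_right"
```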
When obstacle avoidance for a moving object is performed in a scene with severe light intensity changes, such as entering or exiting a tunnel or switching between indoors and outdoors, relying only on image analysis of the images acquired by the industrial camera may fail because the dynamic range of the images is insufficient, giving a poor obstacle avoidance effect. In the embodiment of the invention, an HDR image is further generated from the images acquired by the industrial camera, and the motion direction of the object is judged in combination with the event image. In this obstacle avoidance approach, the high dynamic range of the HDR image provides enough information for image recognition and analysis, which ensures that the position information of the object can be recognized and thus guarantees the obstacle avoidance effect.
Optionally, when a moving object exists in the scene, its position changes between successive frames, so the HDR image obtained by the above calculation may exhibit some motion blur. Therefore, after acquiring the HDR image according to the at least two frames of images, the following steps may further be performed in the embodiment of the present invention:
and H, determining a motion blur kernel according to the motion information of the object in the event image.
In the embodiment of the present invention, the motion information may be the information about the pixel points along the event change direction. Since the blur is caused by the motion of objects, image blur may also be referred to as image degradation, and these pixel points can therefore be regarded as the points causing the degradation. The motion blur kernel may be a convolution kernel describing the motion trajectories of these pixel points.
Optionally, when determining the motion blur kernel, a motion blur model may first be estimated from the motion information of the object in the event image. For example, since motion blur exists in the HDR image, the HDR image may be taken as the degraded image, the images used to generate the HDR image as the original sharp images, and the motion information as the input of a preset degradation point spread function. An image degradation model is then constructed from the degraded image, the original sharp images, and the degradation point spread function, and this model is taken as the motion blur model.
The motion blur kernel may then be estimated from the motion blur model. The motion blur kernel may be a spatially invariant blur kernel or a spatially varying blur kernel. Specifically, the spatially invariant blur kernel assumes that the degree of blur is the same over the whole HDR image, so using it is equivalent to deblurring the image as a whole. It is obtained by a global calculation: when the motion blur kernel is spatially invariant, it may be determined from the motion blur model by a global estimation algorithm, for example an overall average or a principal component analysis of the degree of motion reflected by the event image. Global estimation is computationally simple, so using a spatially invariant blur kernel improves calculation efficiency to a certain extent. However, because it amounts to deblurring the whole image uniformly, it may introduce artifacts such as ringing, giving a poorer deblurring effect.
When the motion blur kernel is a spatially varying blur kernel, it may be determined from the motion blur model by a local estimation algorithm. The spatially varying blur kernel assumes that the motion of an object in three-dimensional space blurs different parts of the image to different degrees, and it can reflect these local differences, so a better deblurring effect can be obtained when it is used; however, the computational complexity of local estimation is high. Whether to use the spatially invariant or the spatially varying blur kernel can therefore be chosen according to actual requirements.
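For the spatially invariant case, a common concrete form is a normalized line kernel oriented along the motion direction read from the event image. The function below is such a sketch; the name and the line-rasterization approach are illustrative assumptions, not the patent's prescribed estimator:

```python
import numpy as np

def linear_blur_kernel(length, angle_deg):
    """Build a spatially invariant linear motion blur kernel.

    The kernel is a normalized line of roughly `length` taps oriented
    along `angle_deg` (the motion direction). A spatially varying
    kernel would instead be estimated per local image region.
    """
    size = length if length % 2 == 1 else length + 1  # odd size, centered
    k = np.zeros((size, size))
    c = size // 2
    dx, dy = np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))
    for t in np.linspace(-c, c, size * 2):  # rasterize the motion line
        x, y = int(round(c + t * dx)), int(round(c + t * dy))
        if 0 <= x < size and 0 <= y < size:
            k[y, x] = 1.0
    return k / k.sum()  # normalize so overall brightness is preserved
```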
Step I: deblurring the HDR image according to the motion blur kernel.
In the embodiment of the invention, a deconvolution operation may be performed on the HDR image using the motion blur kernel, thereby realizing the deblurring processing.
Of course, deblurring may also be achieved in other ways. For example, an image convolution kernel of the motion blur may be estimated directly from image prior knowledge, such as the statistical correlation between successive frames, together with a preset convolution kernel function, and the HDR image then deconvolved with this kernel.
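One standard way to realize the deconvolution step is Wiener deconvolution in the frequency domain. This is a sketch under assumptions: the patent does not prescribe this particular filter, and the `snr` regularization constant is an assumed noise-to-signal ratio:

```python
import numpy as np

def wiener_deblur(image, kernel, snr=0.01):
    """Deblur an image by deconvolution with a motion blur kernel.

    Implements Wiener deconvolution: conj(K) / (|K|^2 + snr). The snr
    term suppresses noise amplification where the kernel's spectrum is
    near zero (a plain inverse filter would blow up there).
    """
    h, w = image.shape
    kh, kw = kernel.shape
    # zero-pad the kernel to the image size and center it at the origin
    K = np.zeros((h, w))
    K[:kh, :kw] = kernel
    K = np.roll(K, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    Kf = np.fft.fft2(K)
    If = np.fft.fft2(image.astype(np.float64))
    Df = If * np.conj(Kf) / (np.abs(Kf) ** 2 + snr)
    return np.real(np.fft.ifft2(Df))
```

For a nearly noise-free image a very small `snr` recovers the sharp image almost exactly; for a real HDR frame a larger value trades sharpness for noise suppression.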
Further, in the embodiment of the present invention, after the at least two frames of images acquired by the industrial camera are obtained, a grayscale map or a color map corresponding to them may be output: the grayscale map when the captured images are grayscale, and the color map when they are color. A color map is a map belonging to a color space, such as an RGB map. The image may be output through a preset data interface. In the scenario of continuously acquiring images from the image frame data stream of the industrial camera, the images in the data stream can be output continuously, with an output image frame rate equal to the frame rate of the image frame data stream.
For example, taking a brightness change event caused by object motion as an example, fig. 2 is a schematic output comparison provided by the embodiment of the present invention. For brightness change events caused by object motion or other factors, an event camera can output events at a high temporal sampling frequency, but it can only output events and cannot output images; its output may be as shown in the first row. In the embodiment of the present invention, an industrial camera, that is, a classical imaging camera, may output the image frames acquired at each time as shown in the second row, determine the event image through inter-frame operations, and output events by outputting the event image (only the event change direction in the event image is shown in fig. 2).
Optionally, the frame rate of the industrial camera in the embodiment of the present invention may be not less than 100 frames per second (fps); that is, the industrial camera may be a high-speed industrial camera. Adopting an industrial camera with a frame rate of at least 100 fps avoids the poor real-time detection performance and low practicability caused by too low a frame rate. For example, in a practical application scenario such as a common robot application, when the robot moves at low speed, the frame rate at which the environment needs to be perceived is approximately 20 fps, comparable to the human eye. The temporal sampling frequency of an event camera can reach 1 kilohertz (kHz) or even 100 kHz, which largely meets the application requirements of the robot, but it provides only event information. In the embodiment of the invention, an industrial camera with a frame rate above 100 fps is adopted, and inter-frame operations are used to realize event output meeting robot applications, while information such as images and HDR images can further be provided, giving the robot's perception more comprehensive real-time image data.
Further, fig. 3 is a schematic output diagram provided in the embodiment of the present invention. As shown in fig. 3, the embodiment of the present invention may directly output a grayscale image or a color image, or may output an HDR image and an event image through inter-frame operations (only the event change direction is shown in fig. 3).
Fig. 4 shows the processing procedure of a specific example provided in the embodiment of the present invention. As shown in fig. 4, for the image frame data stream output by the industrial camera, the image output type can be selected according to actual requirements. When direct output is selected, the original images, that is, the images in the image frame data stream, may be output directly. When outputting an event image is selected, the raw data may first be read in channel by channel, that is, the operation shown in sub-step (1) above, followed by the inter-frame difference operation and noise filtering, and then the visualization of the difference value signs, that is, the rendering operation shown in the steps above; finally the event image may be output. When outputting an HDR image is selected, the raw data may likewise be read in channel by channel, followed by the inter-frame integration operation, that is, the operation of acquiring the HDR image shown in step E above; noise filtering and deblurring may then be performed, where the deblurring may be carried out in conjunction with the event image; finally the HDR image may be output.
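The three output paths of fig. 4 can be sketched as a simple dispatcher. The function name, the two-frame difference, and the omission of noise filtering and deblurring are illustrative simplifications:

```python
import numpy as np

def process_frames(frames, output="event"):
    """Dispatch a frame stream to one of the three output paths:
    the raw image, an event image from inter-frame differencing, or
    an HDR image from inter-frame integration.
    """
    if output == "raw":
        return frames[-1]  # direct output of the latest frame
    if output == "event":
        # inter-frame difference; the sign encodes the event polarity
        return frames[-1].astype(np.int16) - frames[-2].astype(np.int16)
    if output == "hdr":
        acc = np.zeros(frames[0].shape, dtype=np.uint32)
        for f in frames:
            acc += f  # inter-frame integration
        # normalize into an 8-bit display range
        return (acc * 255.0 / max(acc.max(), 1)).astype(np.uint16)
    raise ValueError("output must be 'raw', 'event' or 'hdr'")
```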
Further, compared with the scheme using an event camera: an event camera is sensitive only to changes in light intensity and not to the light intensity distribution, so its output information is often sparse and it cannot output images. In the embodiment of the invention, HDR images can be generated and output, and the images acquired by the industrial camera can also be output, which improves the richness of the output data. Meanwhile, compared with an event camera, an industrial camera has better compatibility with various computing platforms, a smaller pixel size, and a larger fill factor and array scale, so it can output images of higher resolution, further enriching the image information. As an example, with a suitable signal processing and transmission circuit design, an industrial camera can achieve high acquisition and transmission speeds (for example, up to about 200 million pixels per second over a 5 Gbps bandwidth), enabling the camera to output images at resolutions higher than Video Graphics Array (VGA), such as 720p, in a high-speed imaging mode of more than 200 fps.
It should be noted that, in the embodiment of the present invention, the image processing may be implemented by a high-speed image computation processing unit mounted in the event detection device, such as a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or an embedded Neural-network Processing Unit (NPU). The specific number of images used in generating the event image may be determined according to the acquisition and data transmission speed of the industrial camera, the computing power of the processor, and the actually required output frame rate of the event image, which is not limited in the embodiment of the present invention.
Fig. 5 is a block diagram of an event detection apparatus according to an embodiment of the present invention, where the apparatus 50 may include: a memory 501 and a processor 502.
The memory 501 is used for storing program codes.
The processor 502, invoking the program code, when executed, is configured to:
acquiring at least two frames of images acquired by an industrial camera;
performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image;
rendering according to the processing result to obtain an event image; the event image is used for representing a brightness change event;
and outputting the event image.
Optionally, the processor 502 is specifically configured to:
for each frame of image, respectively acquiring the value of each color channel of the image;
and respectively carrying out image difference processing according to the value of each color channel of the at least two frames of images.
Optionally, the processing result includes a difference image; the processor 502 is further configured to: and carrying out noise filtering processing on the differential image.
Optionally, the processor 502 is further configured to: acquiring m frames of images from the at least two frames of images; m is a positive integer; carrying out weighted average processing on the m frames of images to collect noise information in the m frames of images; and carrying out noise filtering processing on the noise information.
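The weighted-average noise processing above might be sketched as follows; this is one possible reading, in which the weighted mean of the m frames estimates the stable scene content and the residual of a frame against it collects the noise. The function name and the uniform default weights are illustrative assumptions:

```python
import numpy as np

def estimate_noise(frames, weights=None):
    """Collect the noise information in m frames by weighted averaging.

    Averaging suppresses the random noise while keeping the stable
    scene content, so subtracting the weighted mean from the latest
    frame leaves an estimate of that frame's noise, which can then be
    filtered out.
    """
    m = len(frames)
    if weights is None:
        weights = [1.0 / m] * m  # uniform weights by default
    mean = np.zeros(frames[0].shape, dtype=np.float64)
    for w, f in zip(weights, frames):
        mean += w * f
    return frames[-1].astype(np.float64) - mean
```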
Optionally, the brightness change event is object motion; the processor 502 is further configured to: acquiring a High Dynamic Range (HDR) image from the at least two frames of images, and determining the motion direction of the object from the event image; identifying the position information of the object in the external environment according to the HDR image; and performing obstacle avoidance on the object according to the motion direction and the position information.
Optionally, the processor 502 is further configured to: determining a motion blur kernel according to the motion information of the object in the event image; and carrying out deblurring processing on the HDR image according to the motion blur kernel.
Optionally, the processor 502 is specifically configured to: estimating a motion blur model according to the motion information of the object in the event image; estimating the motion blur kernel according to the motion blur model; wherein the motion blur kernel is a spatially invariant blur kernel or a spatially variant blur kernel.
Optionally, the processor 502 is specifically configured to: calculating the sum of the gray values of all the pixel points in the at least two frames of images according to the gray value of each pixel point in the images; and carrying out normalization processing on the sum of the gray values to obtain the HDR image.
Optionally, the processing result includes a difference image and a difference value corresponding to each pixel point in the difference image; the processor 502 is specifically configured to: determining a rendering color corresponding to the pixel point according to the sign of the difference value corresponding to the pixel point in the difference image, and determining the color concentration of the rendering color according to the absolute value of the difference value; and rendering the pixel point according to the rendering color and the color concentration.

Optionally, the processing result includes a difference image and a difference value corresponding to each pixel point in the difference image; the processor 502 is specifically configured to: determining the rendering color corresponding to the pixel point according to a preset correspondence between values and rendering colors and the difference value corresponding to the pixel point in the difference image; and rendering the pixel point according to the rendering color.
Optionally, the frame rate of the industrial camera is not less than 100 frames per second.

Optionally, the processor 502 is further configured to: output a grayscale map or a color map corresponding to the at least two frames of images.

In summary, the event detection device provided in the embodiments of the present invention can acquire at least two frames of images collected by an industrial camera, perform image processing on them to obtain a processing result representing the brightness changes of the pixel points in the images, render the processing result to obtain an event image representing a brightness change event, and finally output the event image. Event detection can therefore be realized directly from images collected by a conventional industrial camera, without a dedicated event camera, which saves detection cost to a certain extent.
Further, the embodiment of the present invention also provides a movable platform, which includes an industrial camera and any one of the event detection devices described above. The event detection device is configured to execute each step in the event detection method and can achieve the same technical effect; to avoid repetition, the details are not repeated here.
Optionally, the movable platform comprises a powered propeller and a drive motor for driving the powered propeller.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements each step in the event detection method and can achieve the same technical effect; to avoid repetition, the details are not repeated here.
Fig. 6 is a schematic diagram of a hardware structure of an apparatus for implementing various embodiments of the present invention. The apparatus 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611. Those skilled in the art will appreciate that the configuration shown in fig. 6 does not constitute a limitation of the device: the device may include more or fewer components than those shown, combine some components, or arrange the components differently. In the embodiment of the present invention, the device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted device, a wearable device, a pedometer, and the like.

It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 may be used for receiving and sending signals during message transmission or a call; specifically, it receives downlink data from a base station and forwards it to the processor 610 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 601 may also communicate with a network and other devices through a wireless communication system. The device provides wireless broadband internet access to the user via the network module 602, for example assisting the user in sending and receiving e-mail, browsing web pages, and accessing streaming media.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602, or stored in the memory 609, into an audio signal and output it as sound. The audio output unit 603 can also provide audio output related to a specific function performed by the apparatus 600 (for example, a call signal reception sound or a message reception sound). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.

The input unit 604 is used to receive audio or video signals. The input unit 604 may include a Graphics Processing Unit (GPU) 6041 and a microphone 6042; the graphics processor 6041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 606, stored in the memory 609 (or other storage medium), or transmitted via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound and process it into audio data; in the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 601.

The device 600 also includes at least one sensor 605, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 6061 according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel 6061 and/or the backlight when the apparatus 600 is moved to the ear.
As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the device attitude (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tapping); the sensors 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.

The display unit 606 is used to display information input by the user or information provided to the user. The display unit 606 may include a display panel 6061, which may be configured as a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like.

The user input unit 607 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the apparatus. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. The touch panel 6071, also referred to as a touch screen, may collect touch operations by a user on or near it (for example, operations by a user on or near the touch panel 6071 using a finger, a stylus, or any suitable object or accessory). The touch panel 6071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 610, and receives and executes commands from the processor 610.
In addition, the touch panel 6071 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 6071, the user input unit 607 may include other input devices 6072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described here again. Further, the touch panel 6071 can be overlaid on the display panel 6061; when the touch panel 6071 detects a touch operation on or near it, the operation is transmitted to the processor 610 to determine the type of the touch event, and the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although in fig. 6 the touch panel 6071 and the display panel 6061 are two independent components implementing the input and output functions of the apparatus, in some embodiments they may be integrated to implement the input and output functions, which is not limited here.

The interface unit 608 is the interface for connecting external devices to the apparatus 600. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (for example, data information or power) from an external device and transmit it to one or more elements within the device 600, or to transmit data between the device 600 and an external device.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a program storage area and a data storage area; the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, while the data storage area may store data created according to the use of the apparatus (such as audio data or a phonebook). Further, the memory 609 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. The processor 610 is the control center of the apparatus; it connects the various parts of the entire apparatus using various interfaces and lines, and performs the various functions of the apparatus and processes data by running or executing software programs and/or modules stored in the memory 609 and calling up data stored in the memory 609, thereby monitoring the apparatus as a whole. The processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 610. The apparatus 600 may also include a power supply 611 (e.g., a battery) to supply power to the various components; preferably, the power supply 611 may be logically coupled to the processor 610 via a power management system so as to manage charging, discharging, and power consumption through the power management system. In addition, the apparatus 600 includes some functional modules that are not shown and are not described in detail herein. 
The above-described apparatus embodiments are merely illustrative; the units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units, i.e., they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement the embodiments without inventive effort.
The various component embodiments of the invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or digital signal processor may be used in practice to implement some or all of the functionality of some or all of the components in a computing processing device according to embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing part or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may take the form of one or more signals; such a signal may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form. For example, fig. 7 is a block diagram of a computing processing device that can implement the method according to an embodiment of the present invention. The computing processing device conventionally includes a processor 710 and a computer program product or computer-readable medium in the form of a memory 720. The memory 720 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. The memory 720 has a storage space 730 for program code for performing any of the method steps of the above-described methods. For example, the storage space 730 for the program code may include respective program codes for implementing the various steps of the above methods. The program code can be read from or written to one or more computer program products. These computer program products comprise a program code carrier such as a hard disk, a Compact Disc (CD), a memory card, or a floppy disk. 
Such a computer program product is typically a portable or fixed storage unit as described with reference to fig. 8. The storage unit may have storage segments, storage spaces, and the like arranged similarly to the memory 720 in the computing processing device of fig. 7. The program code may, for example, be compressed in a suitable form. Typically, the storage unit comprises computer-readable code, i.e., code that can be read by a processor such as the processor 710, which, when executed by a computing processing device, causes the computing processing device to perform the steps of the method described above. The embodiments in the present specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the parts that the embodiments have in common, reference may be made from one embodiment to another. Reference herein to "one embodiment," "an embodiment," or "one or more embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Moreover, it is noted that instances of the phrase "in one embodiment" do not necessarily all refer to the same embodiment. In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures, and techniques have not been shown in detail so as not to obscure an understanding of this description. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. 
In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, and so on does not indicate any ordering; these words may be interpreted as names. Finally, it should be noted that the above examples are intended only to illustrate the technical solution of the present invention, not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (26)

  1. A method of event detection, the method comprising:
    acquiring at least two frames of images acquired by an industrial camera;
    performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image;
    rendering according to the processing result to obtain an event image; the event image is used for representing a brightness change event;
    and outputting the event image.
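The pipeline of claim 1, acquiring frames, differencing them to measure per-pixel brightness change, and rendering the result as an event image, can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; the event threshold and the +1/-1/0 encoding are assumptions introduced here for demonstration.

```python
# Illustrative sketch of the claimed pipeline on tiny grayscale "frames".
# The threshold value and the +1/-1/0 event encoding are assumptions,
# not values taken from the patent.

def frame_difference(prev, curr):
    """Pixel-wise brightness difference between two equally sized frames."""
    return [[c - p for p, c in zip(row_p, row_c)]
            for row_p, row_c in zip(prev, curr)]

def to_event_image(diff, threshold=10):
    """Mark each pixel as brightness increase (+1), decrease (-1), or no event (0)."""
    def classify(d):
        if d > threshold:
            return 1
        if d < -threshold:
            return -1
        return 0
    return [[classify(d) for d in row] for row in diff]

prev = [[100, 100],
        [100, 100]]
curr = [[130, 100],
        [100, 70]]
event_image = to_event_image(frame_difference(prev, curr))
# event_image -> [[1, 0], [0, -1]]
```

On the two 2x2 frames above, the pixel that brightened by 30 maps to +1 and the pixel that darkened by 30 maps to -1; unchanged pixels carry no event.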
  2. The method of claim 1, wherein the image processing according to the at least two frames of images comprises:
    for each frame of image, respectively acquiring the value of each color channel of the image;
    and respectively carrying out image difference processing according to the value of each color channel of the at least two frames of images.
  3. The method of claim 1, wherein the processing result comprises a difference image; after the image processing is performed according to the at least two frames of images and a processing result is obtained, the method further includes:
    and performing noise filtering processing on the difference image.
  4. The method of claim 1, wherein before performing the image processing according to the at least two frames of images to obtain the processing result, the method further comprises:
    acquiring m frames of images from the at least two frames of images; m is a positive integer;
    carrying out weighted average processing on the m frames of images to collect noise information in the m frames of images;
    and carrying out noise filtering processing on the noise information.
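The weighted-average step of claim 4 can be sketched as follows. Uniform weights, and defining the noise information of one frame as its per-pixel deviation from the average, are assumptions, since the claim does not fix the weighting scheme.

```python
# Illustrative sketch: average m frames pixel-wise so that static content
# reinforces, then take a frame's residual from the average as a noise
# estimate. Uniform 1/m weights are an assumption.

def weighted_average(frames, weights):
    """Pixel-wise weighted average of m equally sized frames."""
    rows, cols = len(frames[0]), len(frames[0][0])
    avg = [[0.0] * cols for _ in range(rows)]
    for frame, w in zip(frames, weights):
        for i in range(rows):
            for j in range(cols):
                avg[i][j] += w * frame[i][j]
    return avg

def noise_residual(frame, avg):
    """Per-pixel deviation of one frame from the average (noise estimate)."""
    return [[f - a for f, a in zip(rf, ra)] for rf, ra in zip(frame, avg)]

frames = [[[10, 20]], [[12, 20]], [[14, 20]]]   # three 1x2 frames
avg = weighted_average(frames, [1 / 3, 1 / 3, 1 / 3])
residual = noise_residual(frames[-1], avg)
# avg is approximately [[12.0, 20.0]]; residual approximately [[2.0, 0.0]]
```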
  5. The method of any one of claims 1 to 4, wherein the brightness change event is an object motion; after the rendering is performed according to the processing result to obtain the event image, the method further includes:
    acquiring a High Dynamic Range (HDR) image according to the at least two frames of images, and determining the motion direction of the object according to the event image;
    identifying position information of the object in an external environment according to the HDR image;
    and avoiding the obstacle of the object according to the motion direction and the position information.
  6. The method of claim 5, wherein after acquiring the HDR image according to the at least two frames of images, the method further comprises:
    determining a motion blur kernel according to the motion information of the object in the event image;
    and carrying out deblurring processing on the HDR image according to the motion blur kernel.
  7. The method of claim 6, wherein determining a motion blur kernel according to the motion information of the object in the event image comprises:
    estimating a motion blur model according to the motion information of the object in the event image;
    estimating the motion blur kernel according to the motion blur model;
    wherein the motion blur kernel is a spatially invariant blur kernel or a spatially variant blur kernel.
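A minimal sketch of the spatially invariant case named in claim 7: a uniform 1-D kernel models linear motion of a given length, and convolving an image row with it reproduces the blur that a deblurring step would invert. The kernel length would in practice be derived from the motion information in the event image; here it is fixed to 3 purely for illustration.

```python
# Illustrative sketch of a spatially invariant motion blur kernel.
# Uniform weights and the fixed length of 3 are assumptions.

def linear_motion_kernel(length):
    """Uniform 1-D motion-blur kernel of the given length (weights sum to 1)."""
    return [1.0 / length] * length

def convolve_row(row, kernel):
    """Same-size 1-D convolution of one image row with the kernel (zero padding)."""
    half = len(kernel) // 2
    out = []
    for i in range(len(row)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(row):
                acc += w * row[idx]
        out.append(acc)
    return out

kernel = linear_motion_kernel(3)               # [1/3, 1/3, 1/3]
blurred = convolve_row([0, 0, 90, 0, 0], kernel)
# blurred is approximately [0, 30, 30, 30, 0]: the single bright pixel
# is smeared over three positions, as under linear motion
```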
  8. The method of claim 5, wherein the acquiring of the High Dynamic Range (HDR) image according to the at least two frames of images comprises:
    calculating the sum of the gray values of all the pixel points in the at least two frames of images according to the gray value of each pixel point in the images;
    and carrying out normalization processing on the sum of the gray values to obtain the HDR image.
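The HDR construction of claim 8, summing the gray values of corresponding pixels across the frames and normalizing the sum, can be sketched as follows. Min-max normalization back to [0, 255] is one plausible reading of the claim's "normalization processing" and is an assumption here.

```python
# Illustrative sketch: pixel-wise sum of gray values across frames,
# followed by min-max normalization to [0, 255]. The normalization
# scheme is an assumption; the claim does not specify it.

def hdr_from_frames(frames, out_max=255.0):
    """Sum corresponding gray values across frames, then rescale to [0, out_max]."""
    rows, cols = len(frames[0]), len(frames[0][0])
    summed = [[sum(f[i][j] for f in frames) for j in range(cols)]
              for i in range(rows)]
    lo = min(min(row) for row in summed)
    hi = max(max(row) for row in summed)
    span = (hi - lo) or 1  # avoid division by zero on a flat image
    return [[(v - lo) / span * out_max for v in row] for row in summed]

frames = [[[0, 100], [200, 50]],
          [[0, 120], [255, 40]]]
hdr = hdr_from_frames(frames)
# summed = [[0, 220], [455, 90]]; after normalization the darkest pixel
# maps to 0.0 and the brightest to 255.0
```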
  9. The method according to claim 1, wherein the processing result comprises a difference image and a difference value corresponding to each pixel point in the difference image;
    the rendering according to the processing result to obtain the event image includes:
    determining a rendering color corresponding to the pixel point according to the sign of the difference value corresponding to the pixel point in the difference image, and determining the color concentration of the rendering color according to the absolute value of the difference value;
    and rendering the pixel points according to the rendering color and the color concentration.
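The rendering rule of claim 9, color chosen from the sign of the difference value and color concentration from its absolute value, can be sketched as follows. The specific red/blue mapping is an assumption mirroring common event-camera visualizations; the claim itself only fixes sign-to-color and magnitude-to-concentration.

```python
# Illustrative sketch: sign of the difference picks the color channel,
# magnitude picks the concentration. Red for increase and blue for
# decrease are assumptions, not colors specified by the patent.

def render_pixel(diff, max_abs=255.0):
    """Map one difference value to an (R, G, B) triple in [0, 1]."""
    concentration = min(abs(diff) / max_abs, 1.0)
    if diff > 0:
        return (concentration, 0.0, 0.0)   # brightness increase -> red
    if diff < 0:
        return (0.0, 0.0, concentration)   # brightness decrease -> blue
    return (0.0, 0.0, 0.0)                 # no change -> black

def render_event_image(diff_image):
    """Render a full difference image pixel by pixel."""
    return [[render_pixel(d) for d in row] for row in diff_image]

rendered = render_event_image([[255, 0, -51]])
# -> [[(1.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.2)]]
```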
  10. The method according to claim 1, wherein the processing result comprises a difference image and a difference value corresponding to each pixel point in the difference image;
    the rendering according to the processing result to obtain the event image includes:
    determining the rendering color corresponding to each pixel point according to a preset correspondence between values and rendering colors and the difference value corresponding to the pixel point in the difference image;
    and rendering the pixel points according to the rendering colors.
  11. The method of claim 1, wherein the industrial camera has a frame rate of not less than 100 frames per second.
  12. The method of claim 1, further comprising:
    and outputting a gray scale map or a color map corresponding to the at least two frames of images.
  13. An event detection apparatus, comprising a memory and a processor;
    the memory for storing program code;
    the processor, when invoking and executing the program code, is configured to:
    acquiring at least two frames of images acquired by an industrial camera;
    performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image;
    rendering according to the processing result to obtain an event image; the event image is used for representing a brightness change event;
    and outputting the event image.
  14. The apparatus of claim 13, wherein the processor is specifically configured to:
    for each frame of image, respectively acquiring the value of each color channel of the image;
    and respectively carrying out image difference processing according to the value of each color channel of the at least two frames of images.
  15. The apparatus of claim 13, wherein the processing result comprises a difference image; the processor is further configured to:
    and performing noise filtering processing on the difference image.
  16. The apparatus of claim 13, wherein the processor is further configured to:
    acquiring m frames of images from the at least two frames of images; m is a positive integer;
    carrying out weighted average processing on the m frames of images to collect noise information in the m frames of images;
    and carrying out noise filtering processing on the noise information.
  17. The apparatus according to any one of claims 13 to 16, wherein the brightness change event is an object motion; the processor is further configured to:
    acquiring a High Dynamic Range (HDR) image according to the at least two frames of images, and determining the motion direction of the object according to the event image;
    identifying position information of the object in an external environment according to the HDR image;
    and avoiding the obstacle of the object according to the motion direction and the position information.
  18. The apparatus of claim 17, wherein the processor is further configured to:
    determining a motion blur kernel according to the motion information of the object in the event image;
    and carrying out deblurring processing on the HDR image according to the motion blur kernel.
  19. The apparatus of claim 18, wherein the processor is specifically configured to:
    estimating a motion blur model according to the motion information of the object in the event image;
    estimating the motion blur kernel according to the motion blur model;
    wherein the motion blur kernel is a spatially invariant blur kernel or a spatially variant blur kernel.
  20. The apparatus of claim 17, wherein the processor is specifically configured to:
    calculating the sum of the gray values of all the pixel points in the at least two frames of images according to the gray value of each pixel point in the images;
    and carrying out normalization processing on the sum of the gray values to obtain the HDR image.
  21. The apparatus according to claim 13, wherein the processing result comprises a difference image and a difference value corresponding to each pixel point in the difference image;
    the processor is specifically configured to:
    determining a rendering color corresponding to the pixel point according to the sign of the difference value corresponding to the pixel point in the difference image, and determining the color concentration of the rendering color according to the absolute value of the difference value;
    and rendering the pixel points according to the rendering color and the color concentration.
  22. The apparatus according to claim 13, wherein the processing result comprises a difference image and a difference value corresponding to each pixel point in the difference image;
    the processor is specifically configured to:
    determining the rendering color corresponding to each pixel point according to a preset correspondence between values and rendering colors and the difference value corresponding to the pixel point in the difference image;
    and rendering the pixel points according to the rendering colors.
  23. The apparatus of claim 13, wherein the industrial camera has a frame rate of not less than 100 frames per second.
  24. The apparatus of claim 13, wherein the processor is further configured to:
    and outputting a gray scale map or a color map corresponding to the at least two frames of images.
  25. A movable platform, comprising an industrial camera and the event detection apparatus of any one of claims 13 to 24, wherein the event detection apparatus is configured to perform the following operations:
    acquiring at least two frames of images acquired by an industrial camera;
    performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image;
    rendering according to the processing result to obtain an event image; the event image is used for representing a brightness change event;
    and outputting the event image.
  26. A computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, performs the operations of:
    acquiring at least two frames of images acquired by an industrial camera;
    performing image processing according to the at least two frames of images to obtain a processing result; the processing result is used for representing the brightness change condition of the pixel points in the image;
    rendering according to the processing result to obtain an event image; the event image is used for representing a brightness change event;
    and outputting the event image.
CN202080059855.5A 2020-08-06 2020-08-06 Event detection method and device, movable platform and computer readable storage medium Pending CN114341650A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/107427 WO2022027444A1 (en) 2020-08-06 2020-08-06 Event detection method and device, movable platform, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN114341650A true CN114341650A (en) 2022-04-12

Family

ID=80118712

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080059855.5A Pending CN114341650A (en) 2020-08-06 2020-08-06 Event detection method and device, movable platform and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN114341650A (en)
WO (1) WO2022027444A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6646763B1 (en) * 1999-11-12 2003-11-11 Adobe Systems Incorporated Spectral color matching to a device-independent color value
CN1506686A (en) * 2002-12-09 2004-06-23 北京中星微电子有限公司 Moving image detecting method
CN1632594A (en) * 2004-12-31 2005-06-29 北京中星微电子有限公司 Motion image detecting method and circuit
CN1632593A (en) * 2004-12-31 2005-06-29 北京中星微电子有限公司 Motion image detecting method and circuit
CN102467800A (en) * 2010-11-05 2012-05-23 无锡市美网网络信息技术有限公司 Invasion detection and alarm system
CN202794523U (en) * 2012-07-27 2013-03-13 符建 Three-dimensional imaging radar system based on flight spectrum

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2941493C (en) * 2014-03-03 2023-03-07 Vsk Electronics Nv Intrusion detection with directional sensing


Also Published As

Publication number Publication date
WO2022027444A1 (en) 2022-02-10

Similar Documents

Publication Publication Date Title
CN110136183B (en) Image processing method and device and camera device
CN110198412B (en) Video recording method and electronic equipment
CN109413563B (en) Video sound effect processing method and related product
CN108234882B (en) Image blurring method and mobile terminal
CN108038836B (en) Image processing method and device and mobile terminal
CN112449120B (en) High dynamic range video generation method and device
CN110944160B (en) Image processing method and electronic equipment
CN107566749B (en) Shooting method and mobile terminal
CN111491072B (en) Pixel clock frequency adjusting method and device and electronic equipment
CN108234894B (en) Exposure adjusting method and terminal equipment
US20220038621A1 (en) Device for automatically capturing photo or video about specific moment, and operation method thereof
CN110930335B (en) Image processing method and electronic equipment
CN110868544B (en) Shooting method and electronic equipment
CN109151348B (en) Image processing method, electronic equipment and computer readable storage medium
CN109005314B (en) Image processing method and terminal
CN109104578B (en) Image processing method and mobile terminal
CN110930329A (en) Starry sky image processing method and device
CN111145151B (en) Motion area determining method and electronic equipment
CN111008929B (en) Image correction method and electronic equipment
CN109639981B (en) Image shooting method and mobile terminal
CN109840476B (en) Face shape detection method and terminal equipment
CN110944163A (en) Image processing method and electronic equipment
CN107798662B (en) Image processing method and mobile terminal
CN110930372B (en) Image processing method, electronic equipment and computer readable storage medium
CN111246053B (en) Image processing method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination