WO2022027444A1 - Event detection method and device, movable platform, and computer-readable storage medium - Google Patents


Info

Publication number: WO2022027444A1
Authority: WO (WIPO PCT)
Application number: PCT/CN2020/107427
Prior art keywords: image, event, images, frames, processing result
Other languages: French (fr), Chinese (zh)
Inventors: 黄潇, 洪小平
Original Assignees: 深圳市大疆创新科技有限公司, 南方科技大学
Application filed by 深圳市大疆创新科技有限公司 and 南方科技大学
Priority to PCT/CN2020/107427 (WO2022027444A1)
Priority to CN202080059855.5A (CN114341650A)
Publication of WO2022027444A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 13/00 Indicating or recording presence, absence, or direction, of movement

Definitions

  • the present invention belongs to the technical field of machine vision, and in particular, relates to an event detection method, a device, a movable platform and a computer-readable storage medium.
  • an event camera is often installed, and event detection is implemented through the event camera.
  • the event camera is a machine vision image sensor.
  • the event camera can directly output the change value of the light intensity through its internal special pixel circuit when sensing the change of light intensity in the external environment, thereby realizing event detection.
  • However, since the structure of the pixel circuit in an event camera is relatively complex and its manufacturing process is difficult, its cost is often high. As a result, the detection cost is higher when an event camera is used for detection.
  • To this end, the present invention provides an event detection method, a device, a movable platform and a computer-readable storage medium, so as to solve the problem of the high detection cost incurred when an event camera is used for detection.
  • an embodiment of the present invention provides an event detection method, which includes:
  • acquiring at least two frames of images collected by an industrial camera; performing image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of pixels in the images; rendering according to the processing result to obtain an event image, where the event image is used to represent a brightness change event; and outputting the event image.
  • an embodiment of the present invention provides an event detection apparatus, the apparatus includes a memory and a processor;
  • the memory for storing program codes
  • the processor calls the program code, and when the program code is executed, is configured to perform the following operations:
  • acquire at least two frames of images collected by an industrial camera; perform image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of pixels in the images; perform rendering according to the processing result to obtain an event image, where the event image is used to represent a brightness change event; and output the event image.
  • an embodiment of the present invention provides a movable platform, where the movable platform includes an industrial camera and the above-mentioned event detection device; the event detection device is configured to perform the following operations:
  • acquire at least two frames of images collected by the industrial camera; perform image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of pixels in the images; perform rendering according to the processing result to obtain an event image, where the event image is used to represent a brightness change event; and output the event image.
  • an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the following operations are implemented:
  • acquiring at least two frames of images collected by an industrial camera; performing image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of pixels in the images; rendering according to the processing result to obtain an event image, where the event image is used to represent a brightness change event; and outputting the event image.
  • In the embodiment of the present invention, at least two frames of images collected by an industrial camera may be acquired, and image processing is performed according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of pixels in the images. Rendering is then performed according to the processing result to obtain an event image, where the event image is used to represent a brightness change event. Finally, the event image is output.
  • event detection can be realized directly based on images collected by conventional industrial cameras without the need to specially set up an event camera, so the detection cost can be saved to a certain extent.
  • FIG. 2 is a schematic diagram of an output comparison provided by an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of an output provided by an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a processing process of a specific example provided by an embodiment of the present invention.
  • FIG. 5 is a block diagram of an event detection apparatus provided by an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a hardware structure of a device implementing various embodiments of the present invention.
  • FIG. 7 is a block diagram of a computing processing device according to an embodiment of the present invention.
  • FIG. 8 is a block diagram of a portable or fixed storage unit according to an embodiment of the present invention.
  • FIG. 1 is a flowchart of steps of an event detection method provided by an embodiment of the present invention. As shown in FIG. 1 , the method may include:
  • Step 101 Acquire at least two frames of images collected by an industrial camera.
  • the event detection method provided in the embodiment of the present invention may be performed by an event detection device.
  • the event detection device may be an unmanned aerial vehicle, an autonomous vehicle, an automatic guided transport vehicle, a mobile robot, and the like.
  • Industrial cameras can shoot continuously to capture a stream of image frames.
  • the at least two frames of images may be images included in the image frame data stream continuously captured by the industrial camera, and the capture moments corresponding to these images may be different.
  • the industrial camera can be mounted on the event detection device, and accordingly, the acquisition can be achieved by directly reading the image captured by the industrial camera.
  • the industrial camera can also be set independently from the event detection device, and accordingly, the acquisition can be achieved by receiving images transmitted by the industrial camera through wireless transmission technology.
  • the industrial camera may use a general-purpose image sensor chip, for example, the industrial camera may be a camera based on a charge coupled device (Charge Coupled Device, CCD) or a complementary metal oxide semiconductor (Complementary Metal Oxide Semiconductor, CMOS) chip.
  • Compared with the special pixel circuit of an event camera, a CCD or CMOS sensor has a simple structure and a more mature manufacturing process, so its cost is lower. In this way, detection costs can be saved to a certain extent by using images captured by industrial cameras for event detection.
  • Step 102 Perform image processing according to the at least two frames of images to obtain a processing result; the processing result is used to characterize the brightness change of the pixels in the image.
  • the embodiment of the present invention may first perform image processing through at least two frames of images to obtain a processing result that can characterize the brightness change of pixels in the image.
  • the processing result may represent the brightness change of one frame of the at least two frames of images relative to the pixels of the remaining frame of images.
  • the at least two frames of images may be two frames of images, and the processing result may represent the brightness changes of the pixels of the previous frame of images relative to the next frame of images in the two frames of images.
  • the processing result may also represent brightness changes of the multiple frames of images in the at least two frames of images relative to the pixel points of the remaining frame images.
  • at least two frames of images may be three frames of images, and the processing result may represent the brightness changes of pixels in the first two frames of images relative to the last frame of images.
  • Step 103 Render according to the processing result to obtain an event image; the event image is used to represent a brightness change event.
  • Since the processing result represents the brightness change of individual pixels, and the number of pixels is often large, rendering can be performed according to the brightness changes of multiple pixels to summarize them and visualize the overall brightness change of the image relative to the other images. Since the brightness change between images is often caused by a brightness change event, the brightness change event can be represented by the event image obtained after rendering.
  • Step 104 Output the event image.
  • Outputting the event image may mean directly displaying the event image, or sending the event image to a display terminal. Since the event image can represent the brightness change event, and an image is more intuitive, outputting the event by outputting the event image allows the user to learn about the event more intuitively and conveniently to a certain extent, thereby improving the user's viewing efficiency.
  • To sum up, the event detection method can acquire at least two frames of images collected by an industrial camera, and perform image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of pixels in the images. Rendering is then performed according to the processing result to obtain an event image, where the event image is used to represent the brightness change event. Finally, the event image is output.
  • event detection can be realized directly based on images collected by conventional industrial cameras without the need to specially set up an event camera, so the detection cost can be saved to a certain extent.
  • the above operation of performing image processing according to at least two frames of images may be implemented through the following sub-steps (1) to (2):
  • Sub-step (1): For each frame of image, obtain the values of each color channel of the image respectively.
  • Depending on the color space to which the image belongs, the color channels included in the image may be different.
  • If the image is an image in the red-green-blue (RGB) color space, the image can contain three color channels R, G, and B. Accordingly, in this step, the channel values of the R, G, and B color channels in the image can be extracted.
  • If the image is an image in the red-yellow-blue (RYB) color space, the image can contain three color channels R, Y, and B. Accordingly, the channel values of the R, Y, and B color channels in the image can be extracted.
  • The color space to which the image belongs may be determined by the arrangement of color pixels adopted by the photosensitive element in the industrial camera. For example, with the RGBGB arrangement an RGB image can be obtained, and with the RYYB arrangement an RYB image can be obtained.
  • Sub-step (2) Perform image difference processing respectively according to the values of each color channel of the at least two frames of images.
  • In this step, the corresponding gray value can first be determined according to the value of each color channel in the image, and then differential processing can be performed on the gray values of each color channel according to the time sequence of the images, so as to obtain the difference result corresponding to each color channel. Finally, these difference results can be aggregated, for example added, to obtain a final difference result, and the final difference result can be used as the processing result. Alternatively, the difference result corresponding to each color channel may be directly used as the processing result, which is not limited in this embodiment of the present invention. Since a brightness change event causes a change in gray value, performing the difference processing on gray values allows the finally obtained difference result to reflect the brightness change event.
  • Specifically, according to the image timing, the gray value of the image with the earlier timing can be subtracted from the gray value of the image with the later timing, that is, the difference between the two is calculated. For example, when three frames of images are used, the difference can be performed by subtracting the first frame image from the third frame image. When four frames of images are used, the difference can be performed by first adding the third frame image and the fourth frame image, adding the first frame image and the second frame image, and then subtracting the latter sum from the former; that is, the difference result is used to represent the brightness change of the pixels of the last two frames of images relative to the first two frames of images. Of course, other differential algorithms may also be used to implement the differential processing, which is not limited in this embodiment of the present invention.
  • Compared with directly determining an overall gray value from multiple color channels and then performing differential processing, in the embodiment of the present invention the values of each color channel are extracted separately, differential processing is performed according to the gray values of each color channel, and the results are finally aggregated. This avoids combining multiple color channels into one channel for processing, which would couple the channels with each other and increase the differential error, and in turn can improve the accuracy of the differential processing.
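  • As an illustration of sub-steps (1) and (2), the following is a minimal Python/NumPy sketch, not taken from the original disclosure: it extracts each color channel, differences the later frame against the earlier frame per channel, and aggregates the per-channel results by addition. The array shapes and the simple addition-based aggregation are assumptions.

```python
import numpy as np

def per_channel_difference(earlier, later):
    """Sketch of sub-steps (1)-(2): per-channel differencing, then aggregation.

    earlier, later: HxWxC arrays (e.g. R, G, B channels) from an industrial camera.
    Returns the aggregated signed difference image (the 'processing result').
    """
    diffs = []
    for c in range(earlier.shape[2]):                 # sub-step (1): take each channel
        ch_earlier = earlier[..., c].astype(np.int32)
        ch_later = later[..., c].astype(np.int32)
        diffs.append(ch_later - ch_earlier)           # sub-step (2): later minus earlier
    return np.sum(diffs, axis=0)                      # aggregate the per-channel results

# toy usage with random 8-bit frames
f0 = np.random.randint(0, 256, (4, 6, 3), dtype=np.uint8)
f1 = np.random.randint(0, 256, (4, 6, 3), dtype=np.uint8)
diff_image = per_channel_difference(f0, f1)           # positive: brighter, negative: darker
```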
  • the image collected by the industrial camera can also be a grayscale image.
  • In this case, the difference processing can be performed directly according to the gray value of each pixel in the image, so that the value of each color channel does not need to be extracted separately. This can simplify the difference processing operation to a certain extent and improve the efficiency of the difference processing.
  • the difference result may be a difference image
  • the value of each pixel in the difference image may be the difference between grayscale values.
  • There may be noise interference in the circuit of the image sensor, such as shot noise, thermal noise, and fixed pattern noise, which fluctuate over time. Such noise may make the gray values inaccurate and introduce errors. Therefore, noise filtering processing may also be performed.
  • the following step A may be performed after image processing is performed according to at least two frames of images and a processing result is obtained:
  • Step A Perform noise filtering processing on the differential image.
  • various types of filters can be introduced in the frequency domain or wavelet domain of the differential image, and noise filtering is performed through the filters.
  • In this way, the processing result, that is, the difference image, can be processed so that the noise interference included in the processing result is removed, thereby ensuring the accuracy of the difference result.
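  • A minimal sketch of step A is given below, assuming that a simple frequency-domain low-pass filter is acceptable as the filter introduced in the frequency domain of the difference image; the cutoff ratio and filter shape are illustrative assumptions rather than values from the original.

```python
import numpy as np

def lowpass_filter_difference(diff_image, keep_ratio=0.25):
    """Step A sketch: suppress high-frequency noise in a difference image with an FFT low-pass mask."""
    spectrum = np.fft.fftshift(np.fft.fft2(diff_image.astype(np.float64)))
    h, w = diff_image.shape
    cy, cx = h // 2, w // 2
    ry, rx = max(1, int(h * keep_ratio)), max(1, int(w * keep_ratio))
    mask = np.zeros((h, w))
    mask[cy - ry:cy + ry, cx - rx:cx + rx] = 1.0      # keep only the low-frequency band
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask))
    return np.real(filtered)
```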
  • noise filtering processing may be implemented through the following steps B to D:
  • Step B Obtain m frames of images from the at least two frames of images; the m is a positive integer.
  • the specific value of m may be determined according to the actual situation.
  • the m frames of images may be images that participate in image difference processing among at least two frames of images. That is, the difference processing can be performed once every time m frames of images are acquired.
  • In the embodiment of the present invention, noise filtering processing may be performed on the m frames of images before each differential processing is performed, that is, noise filtering processing and differential processing are performed alternately, so as to reduce the noise interference in each differential processing while maintaining the event output, thereby ensuring the effect of the differential processing.
  • Step C Perform weighted average processing on the m frames of images to collect noise information in the m frames of images.
  • the noise information in the m frames of images is collected, which can facilitate the subsequent steps to centrally process the noise information.
  • In this step, the preset weight corresponding to each frame of image may be determined according to a preset weight distribution rule. For example, weights can be assigned in such a way that the lower the image quality, the higher the weight.
  • a weighted sum corresponding to each pixel in the m frames of images may be calculated according to the preset weight corresponding to each frame of the image and the pixel value of each pixel.
  • Step D performing noise filtering processing on the noise information.
  • In this way, noise filtering can be performed on the image composed of the weighted sums corresponding to these pixels, thereby realizing noise filtering of the noise information. For the specific noise filtering processing, refer to the implementation in step A; alternatively, spatial-domain filtering may be performed by using the temporal-domain characteristics of the m frames of images, which is not limited in this embodiment of the present invention.
  • Of course, noise filtering may also be performed on each frame of image separately, but this requires m noise filtering operations to filter out the noise interference. In the embodiment of the present invention, the noise information in the multiple frames of images is collected first, and then the noise information is filtered as a whole. In this way, only one noise filtering operation needs to be performed, thereby improving the noise filtering efficiency to a certain extent.
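  • The following sketch illustrates one possible reading of steps B to D: the m frames are combined with preset per-frame weights so that their noise is gathered into a single image, and a single noise-filtering pass is then applied to that image. The uniform weights and the median filter used here are assumptions for illustration only.

```python
import numpy as np
from scipy.ndimage import median_filter  # assumed available; any denoiser would do

def collect_and_filter_noise(frames, weights=None, kernel=3):
    """Steps B-D sketch: weighted average of m frames, then one filtering pass."""
    m = len(frames)                                    # step B: m frames of images
    if weights is None:
        weights = np.full(m, 1.0 / m)                  # preset weights (assumed uniform)
    stack = np.stack([f.astype(np.float64) for f in frames])
    weighted = np.tensordot(weights, stack, axes=1)    # step C: per-pixel weighted sum
    return median_filter(weighted, size=kernel)        # step D: single noise-filtering pass
```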
  • the processing result may specifically include a difference image and a difference value corresponding to each pixel in the difference image.
  • the operation of rendering according to the processing result to obtain the event image can be implemented through the following sub-steps (3) to (4):
  • Sub-step (3): According to the sign of the difference value corresponding to the pixel point in the difference image, determine the rendering color corresponding to the pixel point, and determine the color density of the rendering color according to the absolute value of the difference value.
  • the sign of the difference value when the difference value is a positive number, the sign of the difference value may be a positive sign, and when the difference value is a negative number, the sign of the difference value may be a negative sign.
  • the rendering color is determined according to the sign of the difference value, the rendering color corresponding to the sign of the difference value can be searched according to the preset correspondence between the sign and the rendering color, and then the rendering color corresponding to the sign is determined as the rendering color of the pixel.
  • the preset correspondence between symbols and rendering colors may be set according to actual requirements. For example, the rendering color corresponding to the positive sign may be set to be red, and the rendering color corresponding to the negative sign may be set to green.
  • For example, if the sign of the difference value corresponding to a pixel point is a positive sign, it can be determined that the rendering color corresponding to the pixel point is red; if the sign of the difference value corresponding to a pixel point is a negative sign, it can be determined that the rendering color corresponding to the pixel point is green.
  • the absolute value of the difference value can be positively correlated with the color density.
  • the absolute value of the difference value may be used as the input of the first preset generating function, and the output of the first preset function may be used as the color density.
  • the first preset function may be a function in which the dependent variable and the independent variable are positively correlated.
  • the absolute value of the difference value can also be negatively correlated with the color density.
  • If the difference value is 0, the pixel point may not be rendered, thereby saving processing resources to a certain extent.
  • Sub-step (4) Render the pixel points according to the rendering color and the color density.
  • the pixel point may be rendered according to the rendering color and the color density, so that the rendered pixel point presents the rendering color, and the density of the rendered color matches the color density.
  • The rendered difference image is the event image, and the event image may be a matrix image. Since the difference value of a pixel point is often caused by a brightness change event, in the embodiment of the present invention the pixel point is rendered with a rendering color corresponding to the sign of the difference value and a color density corresponding to the absolute value of the difference value. In the event image obtained after rendering, the demarcation between colors can indicate the change direction of the event; for example, the direction with the clearest color demarcation in the event image may be used as the event change direction. The change of density at the boundary can reflect the start and end of that change direction; for example, when the absolute value of the difference value is positively correlated with the color density, along the direction with the clearest color boundary, the end with the lower density is used as the starting position and the end with the higher density is used as the ending position. In this way, the brightness change event can be displayed more clearly and intuitively through the event image.
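  • A minimal sketch of sub-steps (3) and (4) follows, assuming the example correspondence given above (positive sign rendered red, negative sign rendered green) and a simple linear mapping from the absolute difference value to color density; the scaling constant and the RGB channel order are illustrative assumptions.

```python
import numpy as np

def render_event_image(diff_image, max_abs=255):
    """Sub-steps (3)-(4) sketch: sign -> rendering color, |difference| -> color density (RGB order)."""
    h, w = diff_image.shape
    event = np.zeros((h, w, 3), dtype=np.uint8)             # stays black where difference is 0
    density = np.clip(np.abs(diff_image) / max_abs, 0.0, 1.0) * 255
    density = density.astype(np.uint8)                      # positively correlated with |diff|
    event[..., 0] = np.where(diff_image > 0, density, 0)    # red channel: positive sign
    event[..., 1] = np.where(diff_image < 0, density, 0)    # green channel: negative sign
    return event
```

  • In such a rendering, the boundary between the red and green regions corresponds to the color demarcation described above, from which the event change direction can be read.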
  • the operation of rendering according to the processing result to obtain the event image may be implemented through the following sub-steps (5) to (6):
  • Sub-step (5) Determine the rendering color corresponding to the pixel point according to the corresponding relationship between the preset value and the rendering color and the difference value corresponding to the pixel point in the difference image.
  • the corresponding relationship between the preset numerical value and the rendering color may be set according to actual requirements.
  • For example, one value can be set to correspond to one rendering color, or the same rendering color can be set for values within the same value range, which is not limited in this embodiment of the present invention.
  • the rendering color corresponding to the difference value may be searched in the corresponding relationship, and the corresponding rendering color may be determined as the rendering color corresponding to the pixel point.
  • Sub-step (6) Render the pixel points according to the rendering color.
  • In this way, the pixels can be rendered into the corresponding rendering colors with a default color density, so that the uniformity of the color density of the pixels can be ensured, thereby facilitating viewing by the user. Since the difference value of a pixel point is often caused by the occurrence of a brightness change event, in the embodiment of the present invention the pixel point is rendered with the rendering color corresponding to its difference value, and the change of color distribution in the difference image obtained after rendering can reflect the change direction of the event. For example, the direction with the highest regularity of color distribution change may be used as the change direction of the event, with the end having a lower degree of regularity taken as the starting position and the end having a higher degree of regularity taken as the ending position. In this way, the brightness change event can be displayed more clearly and intuitively through the event image.
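  • A small sketch of sub-steps (5) and (6) follows, assuming a hypothetical correspondence in which value ranges of the difference map to fixed rendering colors applied with a default, uniform density; the ranges and colors below are illustrative only and not taken from the original.

```python
import numpy as np

# assumed value-range -> rendering-color table (RGB triples, uniform density)
COLOR_TABLE = [
    ((-256, -50), (0, 128, 0)),     # strongly darker   -> dark green
    ((-50, 0),    (0, 255, 0)),     # slightly darker   -> green
    ((0, 50),     (255, 0, 0)),     # slightly brighter -> red
    ((50, 256),   (128, 0, 0)),     # strongly brighter -> dark red
]

def render_by_lookup(diff_image):
    """Sub-steps (5)-(6) sketch: map each pixel's difference value to a preset rendering color."""
    h, w = diff_image.shape
    event = np.zeros((h, w, 3), dtype=np.uint8)
    for (lo, hi), color in COLOR_TABLE:
        mask = (diff_image >= lo) & (diff_image < hi) & (diff_image != 0)
        event[mask] = color                             # default (uniform) color density
    return event
```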
  • the event image can be output through a preset data interface.
  • the image frame rate of the output event image may not be higher than the frame rate of the image frame data stream collected by the industrial camera.
  • Optionally, the brightness change event in this embodiment of the present invention may be the movement of an object. Accordingly, after rendering is performed according to the processing result to obtain the event image, the following steps E to G may be performed, so as to realize obstacle avoidance according to the event detection result:
  • Step E Acquire a high dynamic range HDR image according to the at least two frames of images, and determine the moving direction of the object according to the event image.
  • When the movement direction of the object is determined according to the event image, the event change direction may be read from the event image, and the event change direction is determined as the movement direction of the object.
  • When the HDR image is acquired according to the at least two frames of images, the following operation may be performed: according to the gray value of each pixel point in the images, calculate the sum of the gray values of each pixel point across the at least two frames of images; then normalize each sum of gray values to obtain the HDR image.
  • When the images contain multiple color channels, the value of each color channel in the image can be read in for the operation of calculating the sum of gray values. This step can also directly reuse the channel values that were read in separately when executing sub-steps (1) to (2), which is not limited in this embodiment of the present invention.
  • Alternatively, the summation can be performed directly on the gray values of each pixel in the image to improve calculation efficiency.
  • the integration operation may be performed on the preceding and following frame images in the at least two frame images according to the time sequence of the images, that is, the sum of the grayscale values of the pixels in the preceding and following frame images is calculated. Specifically, the sum of the gray values on each color channel can be calculated, and finally the sum of the gray values on the color channels is used as the sum of the gray values of the pixels.
  • the sum of gray values after adding multiple frames of images may exceed the bit depth and display range adopted by common display devices. Therefore, in this embodiment of the present invention, the sum of gray values can be normalized through a normalization operation, so that the HDR image can be displayed normally by a common display device, thereby improving the applicable range of the HDR image.
  • the bit depth used by a common display device may be 8 bits (bit).
  • the sum of gray values may be normalized to an 8-bit bit depth in the numerical range of 0-255.
  • the bit depth of the image can also be increased according to the number of frames of the image participating in the calculation, so as to ensure that the sum of the calculated gray values has enough bit depth for use.
  • the bit depth can be increased to 10-bit when the number of frames of the image involved in the calculation is 4.
  • an 8-bit image with a grayscale value in the range of 0-255 will obtain a 10-bit image in the range of 0-1023 through the integration of 4 frames of images.
  • By increasing the bit depth to 10 bits, it can be ensured that the bit depth meets the needs of use.
  • In practice, the noise of an image fluctuates randomly, while weak signals in an image, such as darker parts, are relatively stable. Therefore, in the embodiment of the present invention, after the summation operation over multiple frames of images, the increase in noise intensity is lower than the increase in the weak signal, so that the darker parts of the image can be lifted out of the noise and the signal-to-noise ratio in dark areas can be improved, which in turn provides more imaging details. Further, when there are both high-brightness and low-brightness areas in the imaging scene, due to the random fluctuation of noise, enhanced perception of the low-brightness areas can be achieved while, to a certain extent, keeping the high-brightness areas from being overexposed.
  • In this way, the summation calculation method can also improve the image signal-to-noise ratio to a certain extent.
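  • The following sketch illustrates the summation-and-normalization operation described above for step E, assuming grayscale input frames and normalization back to an 8-bit display range (both assumptions). As noted above, the raw sum of four 8-bit frames fits within a 10-bit range of 0-1023.

```python
import numpy as np

def hdr_by_summation(frames, out_max=255):
    """Step E sketch: sum per-pixel gray values over the frames, then normalize for display."""
    total = np.zeros(frames[0].shape, dtype=np.uint32)
    for f in frames:
        total += f.astype(np.uint32)                   # e.g. 4 x 8-bit frames -> range 0..1020
    # normalize the gray-value sums so that an ordinary 8-bit display can show the result
    normalized = total.astype(np.float64) / max(int(total.max()), 1) * out_max
    return normalized.astype(np.uint8)
```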
  • the HDR image can be output through the preset data interface for use in subsequent steps.
  • the output image may be a color image, which can help determine the material and texture information of the object, thereby improving the efficiency of subsequent image recognition.
  • Similar to the difference image, there may be some noise interference in the HDR image. Therefore, the HDR image may also be subjected to noise filtering processing. For specific noise filtering methods, refer to the noise filtering processing performed in the above-mentioned related steps; for example, spatial-domain filtering, frequency-domain filtering, and wavelet-domain filtering can be adopted.
  • Step F Identify the position information of the object in the external environment according to the HDR image.
  • the position information of an object in the external environment may be the coordinates of the object in the external environment, or may be the relative positional relationship, relative distance, etc. between the object and other objects in the external environment.
  • a preset image recognition algorithm may be used to recognize the HDR image, for example, operations such as target object recognition, image semantic segmentation, etc. may be performed to determine the location information.
  • Further, the embodiment of the present invention may also compare the features of the recognized object with the features of the event in the event image; if the two match, it is confirmed that the object is a moving object, and the position information of the object is then further determined.
  • Step G Perform obstacle avoidance on the object according to the movement direction and the position information.
  • In this step, the subsequent movement of the object can be estimated according to the movement direction and the position information, and the obstacle avoidance strategy is determined according to the estimated movement. For example, assuming that the movement direction is horizontal movement from left to right and the object is currently in the middle of the field of view, the object will then move to the far right of the viewing range, so the device can be controlled to move away from the right side to avoid the object.
  • In the embodiment of the present invention, the HDR image is further generated according to the images collected by the industrial camera, and the movement direction of the object is determined in combination with the event image. Since the HDR image has a high dynamic range, it can provide sufficient information for image recognition and analysis, which ensures that the position information of the object can be identified, thereby ensuring the effect of obstacle avoidance.
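  • Steps E to G can be tied together by a very simple, hypothetical avoidance rule such as the one sketched below; the thresholds, coordinate convention, and command names are all assumptions, and a real platform would use its own motion-planning interface.

```python
def choose_avoidance(motion_direction, object_x, frame_width):
    """Toy step-G rule: predict where the object is heading and move the other way.

    motion_direction: 'left_to_right' or 'right_to_left', read from the event image.
    object_x: horizontal object position in pixels, identified from the HDR image.
    """
    centered = frame_width * 0.33 < object_x < frame_width * 0.66
    if motion_direction == "left_to_right" and centered:
        return "move_left"       # object will occupy the right side; keep away from it
    if motion_direction == "right_to_left" and centered:
        return "move_right"
    return "hold_position"

print(choose_avoidance("left_to_right", object_x=320, frame_width=640))  # -> move_left
```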
  • Further, since the object is moving, its position in the preceding and following frame images may change, so that motion blur may be formed to a certain extent in the HDR image obtained by the summation operation. Therefore, in this embodiment of the present invention, after acquiring the HDR image according to the at least two frames of images, the following steps may be performed:
  • Step H Determine a motion blur kernel according to the motion information of the object in the event image.
  • the motion information may be related information of pixels in the direction of event change. Since the blurring is caused by the motion of the object, the blurring of the image can also be called image degradation. Therefore, these pixel points can be considered as points causing degradation.
  • the motion blur kernel can be a convolution kernel used to describe the motion trajectory of these pixels.
  • a motion blur model may be estimated first according to the motion information of the object in the event image.
  • Specifically, since there is motion blur in the HDR image, the HDR image may be used as the degraded image, the images used for generating the HDR image may be used as the original clear image, and the motion information may be used as the input of a preset degradation point spread function.
  • the image degradation model is constructed according to the degraded image, the original clear image and the degraded point spread function, and the final image degradation model is used as the motion blur model.
  • the motion blur kernel may be estimated according to the motion blur model.
  • the motion blur kernel may be a space-invariant blur kernel or a space-variant blur kernel.
  • the space-invariant blur kernel assumes that the overall blur degree of the HDR image is the same, and when used, it is equivalent to performing an overall deblurring operation on the image.
  • Since the space-invariant blur kernel is computed globally, an overall estimation algorithm such as principal component analysis can be used to determine the motion blur kernel according to the motion blur model. Since the overall estimation operation is relatively simple, using a space-invariant blur kernel can improve computational efficiency to a certain extent. However, since this is equivalent to performing an overall deblurring operation on the image, it may cause defects such as ringing effects, which in turn lead to a poorer deblurring effect.
  • the motion blur kernel can be determined by a local estimation algorithm according to the motion blur model.
  • The space-variant blur kernel assumes that object motion in three-dimensional space produces different degrees of blurring at different locations on the image, and it can be used to reflect this difference in blurring degree within local areas; therefore, a better deblurring effect can often be obtained when it is used. However, the computational complexity of local estimation is higher. Therefore, whether to use the space-invariant blur kernel or the space-variant blur kernel can be selected according to actual needs.
  • Step I: Perform deblurring processing on the HDR image according to the motion blur kernel.
  • a motion blur kernel may be used to perform a deconvolution operation on the HDR image, thereby implementing deblurring processing.
  • Alternatively, the image convolution kernel for motion blur can be directly estimated based on prior knowledge of the image, for example the statistical correlation of the preceding and following frame images and other data features, together with a preset convolution kernel function, and the image convolution kernel is then used to deconvolve the HDR image.
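  • As an illustration of steps H and I, the sketch below deconvolves the HDR image with a motion blur kernel. A simple horizontal line kernel stands in for the kernel that would be estimated from the motion information in the event image, and Wiener deconvolution is used as one common deblurring method; both choices are assumptions, not the algorithm prescribed by the original.

```python
import numpy as np

def motion_kernel(length=9):
    """Assumed space-invariant blur kernel: horizontal motion over `length` pixels."""
    k = np.zeros((length, length))
    k[length // 2, :] = 1.0 / length
    return k

def wiener_deblur(blurred_hdr, kernel, snr=0.01):
    """Step I sketch: frequency-domain Wiener deconvolution with the motion blur kernel."""
    kernel_f = np.fft.fft2(kernel, s=blurred_hdr.shape)
    blurred_f = np.fft.fft2(blurred_hdr.astype(np.float64))
    wiener = np.conj(kernel_f) / (np.abs(kernel_f) ** 2 + snr)
    return np.real(np.fft.ifft2(blurred_f * wiener))
```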
  • a grayscale image or a color image corresponding to the at least two frames of images may be output.
  • a grayscale image may be output when the collected image is a grayscale image
  • a color image may be output when the collected image is a color image.
  • The color image refers to an image belonging to a color space, such as an RGB image. When outputting an image, it can be output through a preset data interface. In the scenario where images in the image frame data stream collected by the industrial camera are continuously acquired, the images in the data stream can be continuously output.
  • the output image frame rate may be the same as the frame rate of the image frame data stream.
  • FIG. 2 is a schematic diagram of an output comparison provided by an embodiment of the present invention.
  • the event camera can output the event at a higher time sampling frequency, but can only output the event and cannot output the image.
  • the output of the event camera can be as shown in the first line.
  • In the embodiment of the present invention, an industrial camera, that is, a classic imaging camera, can output the image frames collected at each moment as shown in the second row, determine the event image through inter-frame operations, and realize event output by outputting the event image (FIG. 2 only shows the event change direction in the event image).
  • the frame rate of the industrial camera in the embodiment of the present invention may be not less than 100 frames per second (frame per second, fps), that is, the industrial camera may be a high-speed industrial camera.
  • the problems of poor real-time detection and low practicability due to too low frame rate can be avoided.
  • the frame rate of the perception of the environment is similar to that of the human eye, about 20fps.
  • Although the time sampling frequency of the event camera can reach 1 kilohertz (kHz) or even 100 kHz, which meets the application requirements of robots to a large extent, it only provides event information.
  • In the embodiment of the present invention, with an industrial camera of 100 fps or more, the inter-frame operation is used to realize event output that satisfies robot applications, and at the same time information such as images and HDR images can be further provided, thereby providing more comprehensive real-time image data for robot perception.
  • FIG. 3 is a schematic diagram of an output provided by an embodiment of the present invention.
  • As shown in FIG. 3, the embodiment of the present invention can directly output a grayscale image or a color image, and can also output the HDR image and the event image obtained through the inter-frame operation (FIG. 3 only shows the event change direction).
  • FIG. 4 is a schematic diagram of a processing process of a specific example provided by an embodiment of the present invention.
  • the image output type can be selected according to actual needs.
  • If the original image, that is, the image in the image frame data stream, needs to be output, it can be output directly. If the event image needs to be output, the operation of reading in the raw data by channel may be performed first, that is, the operation shown in sub-step (1) above.
  • Then the operation of visualizing the sign of the difference value, that is, the rendering operation shown in the above steps, is performed.
  • the event image can be output.
  • If the HDR image needs to be output, the operation of reading in the raw data by channel may be performed first, and then the inter-frame integration operation, that is, the operation of acquiring the HDR image shown in the above step E, may be performed. Then noise filtering and deblurring may be performed, where the deblurring may be performed in conjunction with the event image. Finally, the HDR image can be output.
  • an HDR image can be generated and output, and an image collected by an industrial camera can be output, thereby improving the richness of output data.
  • In addition, industrial cameras have higher compatibility with various computing platforms, smaller pixel sizes, larger fill factors, and larger array scales. Therefore, higher-resolution images can be output, further improving the richness of the image information.
  • Industrial cameras can achieve high acquisition and transmission speeds (for example, using a 5 Gbps switching bandwidth to achieve a speed of about 200 million pixels per second); in high-speed imaging modes of more than 200 fps, the camera can output images at resolutions higher than Video Graphics Array (VGA), such as 720p.
  • The image processing process in the embodiment of the present invention may be performed by a high-speed image computing processing unit carried in the event detection device, such as a field programmable gate array (Field Programmable Gate Array, FPGA), a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), or an embedded neural network processor (Neural-network Processing Unit, NPU).
  • In addition, the specific number of images used when generating the event image may be determined according to the acquisition and data transmission speed of the industrial camera, the computing power of the processor, and the actual output frame rate requirements of the event image, which is not limited in this embodiment of the present invention.
  • FIG. 5 is a block diagram of an event detection apparatus provided by an embodiment of the present invention.
  • the apparatus 50 may include: a memory 501 and a processor 502 .
  • the memory 501 is used to store program codes.
  • the processor 502 calls the program code, and when the program code is executed, is configured to perform the following operations:
  • acquire at least two frames of images collected by an industrial camera; perform image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of pixels in the images; perform rendering according to the processing result to obtain an event image, where the event image is used to represent a brightness change event; and output the event image.
  • Optionally, the processor 502 is specifically configured to: for each frame of image, obtain the values of each color channel of the image respectively; and perform image difference processing respectively according to the values of the respective color channels of the at least two frames of images.
  • the processing result includes a difference image; the processor 502 is further configured to: perform noise filtering processing on the difference image.
  • the processor 502 is further configured to: acquire m frames of images from the at least two frames of images; the m is a positive integer; and perform weighted average processing on the m frames of images to collect the m frames Noise information in the image; perform noise filtering processing on the noise information.
  • the brightness change event is movement of an object
  • the processor 502 is further configured to: acquire a high dynamic range HDR image according to the at least two frames of images, and determine the movement direction of the object according to the event image ; Recognize the position information of the object in the external environment according to the HDR image; perform obstacle avoidance for the object according to the movement direction and the position information.
  • the processor 502 is further configured to: determine a motion blur kernel according to the motion information of the object in the event image; and perform deblurring processing on the HDR image according to the motion blur kernel.
  • the processor 502 is specifically configured to: estimate a motion blur model according to the motion information of the object in the event image; estimate the motion blur kernel according to the motion blur model; wherein, the motion The blur kernel is a space-invariant blur kernel or a space-variant blur kernel.
  • the processor 502 is specifically configured to: calculate the sum of the grayscale values of each pixel point in the at least two frames of images according to the grayscale value of each pixel point in the image; The sum of the values is normalized to obtain the HDR image.
  • Optionally, the processing result includes a difference image and a difference value corresponding to each pixel in the difference image; the processor 502 is specifically configured to: determine the rendering color corresponding to the pixel point according to the sign of the difference value corresponding to the pixel point in the difference image, and determine the color density of the rendering color according to the absolute value of the difference value; and render the pixel point according to the rendering color and the color density.
  • Optionally, the processing result includes a difference image and a difference value corresponding to each pixel in the difference image; the processor 502 is specifically configured to: determine the rendering color corresponding to the pixel point according to the preset correspondence between values and rendering colors and the difference value corresponding to the pixel point in the difference image; and render the pixel point according to the rendering color.
  • the frame rate of the industrial camera is not less than 100 frames per second.
  • the processor 502 is further configured to: output a grayscale map or a color map corresponding to the at least two frames of images.
  • To sum up, the event detection device provided by the embodiment of the present invention can acquire at least two frames of images collected by an industrial camera, and perform image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of pixels in the images. Rendering is then performed according to the processing result to obtain an event image, where the event image is used to represent the brightness change event. Finally, the event image is output.
  • event detection can be realized directly based on images collected by conventional industrial cameras without the need to specially set up an event camera, so the detection cost can be saved to a certain extent.
  • In addition, an embodiment of the present invention further provides a movable platform, where the movable platform includes an industrial camera and any of the above-mentioned event detection devices; the event detection device is configured to execute each step in the above-mentioned event detection method and can achieve the same technical effect; to avoid repetition, details are not repeated here.
  • the movable platform includes a powered propeller and a drive motor for driving the powered propeller.
  • In addition, an embodiment of the present invention also provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium; when the computer program is executed by a processor, each step in the above-mentioned event detection method is implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the device 600 includes but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, User input unit 607 , interface unit 608 , memory 609 , processor 610 , and power supply 611 and other components.
  • FIG. 6 does not constitute a limitation on the device, and the device may include more or less components than the one shown, or combine some components, or arrange different components.
  • the devices include but are not limited to mobile phones, tablet computers, notebook computers, handheld computers, vehicle-mounted devices, wearable devices, and pedometers.
  • The radio frequency unit 601 may be used for receiving and sending signals during the sending and receiving of information or during a call. Specifically, after receiving downlink data from the base station, it sends the data to the processor 610 for processing; in addition, it sends uplink data to the base station.
  • the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 601 can also communicate with the network and other devices through a wireless communication system.
  • the device provides the user with wireless broadband Internet access through the network module 602, such as helping the user to send and receive emails, browse web pages, access streaming media, and the like.
  • the audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into audio signals and output as sound. Also, the audio output unit 603 may also provide audio output related to a specific function performed by the device 600 (eg, call signal reception sound, message reception sound, etc.).
  • the audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 604 is used to receive audio or video signals.
  • the input unit 604 may include a graphics processor (Graphics Processing Unit, GPU) 6041 and a microphone 6042, and the graphics processor 6041 is used for still pictures or video images obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode data is processed.
  • the processed image frames may be displayed on the display unit 606 .
  • the image frames processed by the graphics processor 6041 may be stored in the memory 609 (or other storage medium) or transmitted via the radio frequency unit 601 or the network module 602 .
  • the microphone 6042 can receive sound and can process such sound into audio data.
  • the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 601 for output in the case of a telephone call mode.
  • Device 600 also includes at least one sensor 605, such as a light sensor, motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor, wherein the ambient light sensor can adjust the brightness of the display panel 6061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 6061 and/or when the device 600 is moved to the ear. or backlight.
  • As one type of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the device posture (such as horizontal and vertical screen switching and related games), among other functions.
  • the sensor 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared Sensors, etc., will not be repeated here.
  • the display unit 606 is used to display information input by the user or information provided to the user.
  • the display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
  • the user input unit 607 may be used to receive input numerical or character information, and generate key signal input related to user settings and function control of the device.
  • Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072.
  • The touch panel 6071, also referred to as a touch screen, can collect the user's touch operations on or near it (such as operations performed by the user on or near the touch panel 6071 with a finger, a stylus, or any other suitable object or accessory).
  • The touch panel 6071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 610, and receives and executes the commands sent by the processor 610.
  • In addition, the touch panel 6071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 607 may also include other input devices 6072 .
  • other input devices 6072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which are not described herein again.
  • Further, the touch panel 6071 can be overlaid on the display panel 6061. When the touch panel 6071 detects a touch operation on or near it, it transmits the operation to the processor 610 to determine the type of the touch event, and the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event.
  • the interface unit 608 is an interface for connecting an external device to the device 600 .
  • external devices may include wired or wireless headset ports, external power (or battery charger) ports, wired or wireless data ports, memory card ports, ports for connecting devices with identification modules, audio input/output (I/O) ports, video I/O ports, headphone ports, and more.
  • The interface unit 608 may be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements within the device 600, or may be used to transfer data between the device 600 and an external device.
  • the memory 609 may be used to store software programs as well as various data.
  • The memory 609 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function (such as a sound playback function or an image playback function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.), and the like.
  • In addition, the memory 609 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 610 is the control center of the device, uses various interfaces and lines to connect various parts of the entire device, and executes by running or executing the software programs and/or modules stored in the memory 609, and calling the data stored in the memory 609. Various functions of the equipment and processing data, so as to monitor the equipment as a whole.
  • The processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may not be integrated into the processor 610.
  • the device 600 may also include a power supply 611 (such as a battery) for supplying power to various components.
  • the power supply 611 may be logically connected to the processor 610 through a power management system, so as to manage charging, discharging, and power consumption management through the power management system.
  • the device 600 includes some unshown functional modules, which will not be repeated here.
  • the device embodiments described above are only illustrative, wherein the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in One place, or it can be distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution in this embodiment. Those of ordinary skill in the art can understand and implement it without creative effort.
  • Various component embodiments of the present invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • a microprocessor or a digital signal processor may be used in practice to implement some or all of the functions of some or all of the components in the computing processing device according to the embodiments of the present invention.
  • the present invention can also be implemented as apparatus or apparatus programs (eg, computer programs and computer program products) for performing part or all of the methods described herein.
  • Such a program implementing the present invention may be stored on a computer-readable medium, or may be in the form of one or more signals.
  • Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
  • For example, FIG. 7 is a block diagram of a computing and processing device provided by an embodiment of the present invention. As shown in FIG. 7, the figure shows a computing and processing device that can implement the method according to the present invention.
  • the computing processing device traditionally includes a processor 710 and a computer program product or computer readable medium in the form of a memory 720 .
  • the memory 720 may be electronic memory such as flash memory, EEPROM (electrically erasable programmable read only memory), EPROM, hard disk, or ROM.
  • the memory 720 has storage space 730 for program code for performing any of the method steps in the above-described methods.
  • the storage space 730 for program codes may include various program codes for implementing various steps in the above methods, respectively.
  • These program codes can be read from or written to one or more computer program products.
  • These computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks.
  • Such computer program products are typically portable or fixed storage units as described with reference to FIG. 8 .
  • the storage unit may have storage segments, storage spaces, etc. arranged similarly to the memory 720 in the computing processing device of FIG. 7 .
  • the program code may, for example, be compressed in a suitable form.
  • the storage unit includes computer-readable code, i.e., code readable by a processor such as the processor 710, which, when executed by a computing processing device, causes the computing processing device to perform each step of the methods described above.
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • the word “comprising” does not exclude the presence of elements or steps not listed in a claim.
  • the word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the invention can be implemented by means of hardware comprising several different elements and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware.
  • the use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)

Abstract

An event detection method and device, a movable platform, and a computer-readable storage medium. The method comprises: obtaining at least two images acquired by an industrial camera; performing image processing according to the at least two images, so as to obtain a processing result, wherein the processing result is used to indicate changes in pixel brightness in the images; performing rendering according to the processing result, so as to obtain an event image, wherein the event image is used to indicate a brightness change event; and outputting the event image. In this way, event detection can be performed by directly using images acquired by a conventional industrial camera without needing to specifically provide an event camera, thereby reducing detection costs to a certain extent.

Description

Event detection method, device, movable platform, and computer-readable storage medium

Technical Field
本发明属于机器视觉技术领域,特别是涉及一种事件检测方法、装置、可移动平台及计算机可读存储介质。The present invention belongs to the technical field of machine vision, and in particular, relates to an event detection method, a device, a movable platform and a computer-readable storage medium.
背景技术Background technique
In tasks in the field of machine vision, for example, high-speed perception of moving objects, high dynamic range scene perception, real-time robot localization and map reconstruction (SLAM), and recognition and tracking of moving targets, it is often necessary to detect event changes in the external environment, that is, to detect events in the external environment that cause changes in brightness.
现有技术中,往往是安装事件相机,通过事件相机实现事件检测。具体的,事件相机是一种机器视觉图像传感器,事件相机可以通过其内部特殊的像素电路,在感知到外界环境中存在光强变化时,直接输出光强的变化值,进而实现事件检测。但是,由于事件相机中像素电路的结构较为复杂,工艺难度较大,因此成本往往较高。这样,会导致使用事件相机进行检测时,检测成本较高。In the prior art, an event camera is often installed, and event detection is implemented through the event camera. Specifically, the event camera is a machine vision image sensor. The event camera can directly output the change value of the light intensity through its internal special pixel circuit when sensing the change of light intensity in the external environment, thereby realizing event detection. However, since the structure of the pixel circuit in the event camera is complex and the process is difficult, the cost is often high. In this way, the detection cost will be higher when the event camera is used for detection.
发明内容SUMMARY OF THE INVENTION
本发明提供一种事件检测方法、装置、可移动平台及计算机可读存储介质,以便解决使用事件相机进行检测,检测成本较高的问题。The present invention provides an event detection method, a device, a movable platform and a computer-readable storage medium, so as to solve the problem of high detection cost by using an event camera for detection.
为了解决上述技术问题,本发明是这样实现的:In order to solve the above-mentioned technical problems, the present invention is achieved in this way:
第一方面,本发明实施例提供了一种事件检测方法,该方法包括:In a first aspect, an embodiment of the present invention provides an event detection method, which includes:
获取通过工业相机采集的至少两帧图像;Acquire at least two frames of images captured by an industrial camera;
根据所述至少两帧图像进行图像处理,得到处理结果;所述处理结果用于表征所述图像中像素点的亮度变化情况;Perform image processing according to the at least two frames of images to obtain a processing result; the processing result is used to characterize the brightness change of the pixels in the image;
根据所述处理结果进行渲染,以获取事件图像;所述事件图像用于表征亮度变化事件;Rendering according to the processing result to obtain an event image; the event image is used to represent a brightness change event;
输出所述事件图像。The event image is output.
第二方面,本发明实施例提供了一种事件检测装置,所述装置包括存储器和处理器;In a second aspect, an embodiment of the present invention provides an event detection apparatus, the apparatus includes a memory and a processor;
所述存储器,用于存储程序代码;the memory for storing program codes;
所述处理器,调用所述程序代码,当所述程序代码被执行时,用于执行以下操作:The processor calls the program code, and when the program code is executed, is configured to perform the following operations:
获取通过工业相机采集的至少两帧图像;Acquire at least two frames of images captured by an industrial camera;
根据所述至少两帧图像进行图像处理,得到处理结果;所述处理结果用于表征所述图像中像素点的亮度变化情况;Perform image processing according to the at least two frames of images to obtain a processing result; the processing result is used to characterize the brightness change of the pixels in the image;
根据所述处理结果进行渲染,以获取事件图像;所述事件图像用于表征亮度变化事件;Rendering according to the processing result to obtain an event image; the event image is used to represent a brightness change event;
输出所述事件图像。The event image is output.
第三方面,本发明实施例提供了一种可移动平台,所述可移动平台包含工业相机和上述的事件检测装置;所述事件检测装置用于执行以下操作:In a third aspect, an embodiment of the present invention provides a movable platform, where the movable platform includes an industrial camera and the above-mentioned event detection device; the event detection device is configured to perform the following operations:
获取通过工业相机采集的至少两帧图像;Acquire at least two frames of images captured by an industrial camera;
根据所述至少两帧图像进行图像处理,得到处理结果;所述处理结果用于表征所述图像中像素点的亮度变化情况;Perform image processing according to the at least two frames of images to obtain a processing result; the processing result is used to characterize the brightness change of the pixels in the image;
根据所述处理结果进行渲染,以获取事件图像;所述事件图像用于表征亮度变化事件;Rendering according to the processing result to obtain an event image; the event image is used to represent a brightness change event;
输出所述事件图像。The event image is output.
第四方面,本发明实施例提供了一种计算机可读存储介质,所述计算机可读存储介质上存储计算机程序,所述计算机程序被处理器执行时实现以下操作:In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the following operations are implemented:
获取通过工业相机采集的至少两帧图像;Acquire at least two frames of images captured by an industrial camera;
根据所述至少两帧图像进行图像处理,得到处理结果;所述处理结果用于表征所述图像中像素点的亮度变化情况;Perform image processing according to the at least two frames of images to obtain a processing result; the processing result is used to characterize the brightness change of the pixels in the image;
根据所述处理结果进行渲染,以获取事件图像;所述事件图像用于表征亮度变化事件;Rendering according to the processing result to obtain an event image; the event image is used to represent a brightness change event;
输出所述事件图像。The event image is output.
在本发明实施例中,可以获取通过工业相机采集的至少两帧图像,根据至少两帧图像进行图像处理,得到处理结果,其中,处理结果用于表征图像中像素点的亮度变化情况,接着根据处理结果进行渲染,以获取事件图像,其中,事件图像用于表征亮度变化事件,最后,输出事件图像。这样,无需专门设置事件相机,直接根据常规的工业相机采集的图像,即可实现事件检测,因此一定程度上可以节省检测成本。In the embodiment of the present invention, at least two frames of images collected by an industrial camera may be acquired, and image processing is performed according to the at least two frames of images to obtain a processing result, wherein the processing result is used to characterize the brightness change of the pixels in the image, and then according to The processing result is rendered to obtain an event image, wherein the event image is used to represent the brightness change event, and finally, the event image is output. In this way, event detection can be realized directly based on images collected by conventional industrial cameras without the need to specially set up an event camera, so the detection cost can be saved to a certain extent.
附图说明Description of drawings
图1是本发明实施例提供的一种事件检测方法的步骤流程图;1 is a flowchart of steps of an event detection method provided by an embodiment of the present invention;
图2是本发明实施例提供的一种输出对比示意图;2 is a schematic diagram of an output comparison provided by an embodiment of the present invention;
图3是本发明实施例提供的一种输出示意图;3 is a schematic diagram of an output provided by an embodiment of the present invention;
图4是本发明实施例提供的一种具体实例的处理过程示意图;4 is a schematic diagram of a processing process of a specific example provided by an embodiment of the present invention;
图5是本发明实施例提供的一种事件检测装置的框图;5 is a block diagram of an event detection apparatus provided by an embodiment of the present invention;
图6为实现本发明各个实施例的一种设备的硬件结构示意图;6 is a schematic diagram of a hardware structure of a device implementing various embodiments of the present invention;
图7为本发明实施例提供的一种计算处理设备的框图;FIG. 7 is a block diagram of a computing processing device according to an embodiment of the present invention;
图8为本发明实施例提供的一种便携式或者固定存储单元的框图。FIG. 8 is a block diagram of a portable or fixed storage unit according to an embodiment of the present invention.
具体实施方式detailed description
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本发明一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本发明保 护的范围。The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are part of the embodiments of the present invention, but not all of the embodiments. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work fall within the protection scope of the present invention.
图1是本发明实施例提供的一种事件检测方法的步骤流程图,如图1所示,该方法可以包括:FIG. 1 is a flowchart of steps of an event detection method provided by an embodiment of the present invention. As shown in FIG. 1 , the method may include:
步骤101、获取通过工业相机采集的至少两帧图像。Step 101: Acquire at least two frames of images collected by an industrial camera.
本发明实施例中提供的事件检测方法可以通过事件检测设备执行,示例的,事件检测设备可以为无人机、自动驾驶汽车、自动导引运输车、可移动机器人,等等。The event detection method provided in the embodiment of the present invention may be performed by an event detection device. For example, the event detection device may be an unmanned aerial vehicle, an autonomous vehicle, an automatic guided transport vehicle, a mobile robot, and the like.
工业相机可以连续进行拍摄以采集图像帧数据流。至少两帧图像可以是工业相机连续拍摄的图像帧数据流中包含的图像,这些图像对应的拍摄时刻可以不同。工业相机可以搭载在事件检测设备上,相应地,可以通过直接读取工业相机采集的图像以实现获取。工业相机也可以与事件检测设备独立设置,相应地,可以通过接收工业相机通过无线传输技术传输的图像以实现获取。进一步地,工业相机可以采用通用的图像传感器芯片,例如,工业相机可以是基于电荷耦合器件(Charge Coupled Device,CCD)或互补金属氧化物半导体(Complementary Metal Oxide Semiconductor,CMOS)芯片的相机。由于工业相机无需特殊像素电路、即,无需在传感器芯片层面进行特殊设计和工艺,结构简单,制造工艺更加成熟,因此成本较低。这样,通过使用工业相机采集的图像进行事件检测,一定程度上可以节省检测成本。Industrial cameras can shoot continuously to capture a stream of image frames. The at least two frames of images may be images included in the image frame data stream continuously captured by the industrial camera, and the capture moments corresponding to these images may be different. The industrial camera can be mounted on the event detection device, and accordingly, the acquisition can be achieved by directly reading the image captured by the industrial camera. The industrial camera can also be set independently from the event detection device, and accordingly, the acquisition can be achieved by receiving images transmitted by the industrial camera through wireless transmission technology. Further, the industrial camera may use a general-purpose image sensor chip, for example, the industrial camera may be a camera based on a charge coupled device (Charge Coupled Device, CCD) or a complementary metal oxide semiconductor (Complementary Metal Oxide Semiconductor, CMOS) chip. Since the industrial camera does not require special pixel circuits, that is, no special design and process are required at the sensor chip level, the structure is simple, and the manufacturing process is more mature, so the cost is lower. In this way, detection costs can be saved to a certain extent by using images captured by industrial cameras for event detection.
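The following is a minimal, illustrative Python sketch (not part of the original disclosure) of how at least two frames might be collected from a continuously captured frame stream. The name `sliding_frame_groups` and the assumption that the camera SDK or wireless link exposes the stream as a Python iterable of image arrays are both hypothetical.

```python
from collections import deque


def sliding_frame_groups(frame_stream, n: int = 2):
    """Yield groups of the n most recent frames from a camera frame stream.

    frame_stream: any iterable of image arrays, e.g. read from an industrial
    camera SDK or received over a wireless link (an assumption of this sketch).
    n: how many frames each group holds (at least two for the method below).
    """
    window = deque(maxlen=n)
    for frame in frame_stream:
        window.append(frame)
        if len(window) == n:
            yield tuple(window)

# Usage sketch: for earlier, later in sliding_frame_groups(stream, 2): ...
```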
步骤102、根据所述至少两帧图像进行图像处理,得到处理结果;所述处理结果用于表征所述图像中像素点的亮度变化情况。Step 102: Perform image processing according to the at least two frames of images to obtain a processing result; the processing result is used to characterize the brightness change of the pixels in the image.
由于要检测的事件是引起光强变化的事件,而亮度变化往往会引起光强变化。同时,由于每帧图像蕴含拍摄该帧图像时的亮度情况,因此,本发明实施例可以先通过至少两帧图像进行图像处理,以获取能够表征图像中像素点的亮度变化情况的处理结果。其中,处理结果可以是表征至少两帧图像中的一帧图像相对于其余帧图像的像素点的亮度变化情况。例如,至少两帧图像可以为两帧图像,处理结果可以是表征这两帧图像中前一帧图像相对于后一帧图像的像素点的亮度变化情况。处理结果也可以是表征至少两帧图像中的多帧图像相对于其余帧图像的像素点的亮度变化情况。例如,至少两帧图像可以为三帧图像,处理结果可以是表征这三帧图像中前两帧图像相对于最后一帧图像的像素点的亮度变化情况。Since the events to be detected are events that cause changes in light intensity, changes in brightness often cause changes in light intensity. At the same time, since each frame of image contains the brightness of the frame of image, the embodiment of the present invention may first perform image processing through at least two frames of images to obtain a processing result that can characterize the brightness change of pixels in the image. Wherein, the processing result may represent the brightness change of one frame of the at least two frames of images relative to the pixels of the remaining frame of images. For example, the at least two frames of images may be two frames of images, and the processing result may represent the brightness changes of the pixels of the previous frame of images relative to the next frame of images in the two frames of images. The processing result may also represent brightness changes of the multiple frames of images in the at least two frames of images relative to the pixel points of the remaining frame images. For example, at least two frames of images may be three frames of images, and the processing result may represent the brightness changes of pixels in the first two frames of images relative to the last frame of images.
步骤103、根据所述处理结果进行渲染,以获取事件图像;所述事件图像用于表征亮度变化事件。Step 103: Render according to the processing result to obtain an event image; the event image is used to represent a brightness change event.
本发明实施例中,由于处理结果是表征像素点的亮度变化情况,而像素点的数量往往较多,因此可以根据多个像素点的亮度变化情况进行渲染,以汇总这些像素点的亮度变化情况,使得相对于其他图像的整体亮度变化情况具象化。由于图像之间发生的亮度变化,往往由亮度变化事件引起,因此可以通过渲染之后得到的事件图像表征亮度变化事件。In the embodiment of the present invention, since the processing result represents the brightness change of a pixel, and the number of pixels is often large, rendering can be performed according to the brightness change of multiple pixels to summarize the brightness changes of these pixels. , which visualizes the overall brightness change relative to other images. Since the brightness change between images is often caused by the brightness change event, the brightness change event can be represented by the event image obtained after rendering.
步骤104、输出所述事件图像。 Step 104, outputting the event image.
本发明实施例中,输出事件图像可以是直接显示事件图像,也可以是将事件图像发送给显 示终端。由于事件图像可以表征亮度变化事件,且图像更加直观形象,因此,通过输出事件图像以实现输出事件的方式,一定程度上可以使用户更加直观便捷的获知事件,进而提高用户的查看效率。In this embodiment of the present invention, outputting the event image may be directly displaying the event image, or may be sending the event image to the display terminal. Since the event image can represent the brightness change event, and the image is more intuitive, the way of outputting the event by outputting the event image can make the user know the event more intuitively and conveniently to a certain extent, thereby improving the user's viewing efficiency.
综上所述,本发明实施例提供的事件检测方法,可以获取通过工业相机采集的至少两帧图像,根据至少两帧图像进行图像处理,得到处理结果,其中,处理结果用于表征图像中像素点的亮度变化情况,接着根据处理结果进行渲染,以获取事件图像,其中,事件图像用于表征亮度变化事件,最后,输出事件图像。这样,无需专门设置事件相机,直接根据常规的工业相机采集的图像,即可实现事件检测,因此一定程度上可以节省检测成本。To sum up, the event detection method provided by the embodiment of the present invention can acquire at least two frames of images collected by an industrial camera, perform image processing according to the at least two frames of images, and obtain a processing result, wherein the processing result is used to characterize the pixels in the image The brightness change of the point is then rendered according to the processing result to obtain an event image, wherein the event image is used to represent the brightness change event, and finally, the event image is output. In this way, event detection can be realized directly based on images collected by conventional industrial cameras without the need to specially set up an event camera, so the detection cost can be saved to a certain extent.
可选的,上述根据至少两帧图像进行图像处理的操作可以通过下述子步骤(1)~子步骤(2)实现:Optionally, the above operation of performing image processing according to at least two frames of images may be implemented through the following sub-steps (1) to (2):
子步骤(1):对于每帧所述图像,分别获取所述图像的各个颜色通道的值。Sub-step (1): for each frame of the image, obtain the value of each color channel of the image respectively.
In this embodiment of the present invention, the color channels contained in an image depend on the color space to which the image belongs. For example, if the image is an image in the red-green-blue (RGB) color space, the image contains the three color channels R, G, and B, and accordingly, in this step the channel values of the R, G, and B color channels of the image can be extracted. If the image is an image in the red-yellow-blue (RYB) color space, the image contains the three color channels R, Y, and B, and accordingly, the channel values of the R, Y, and B color channels of the image can be extracted. Specifically, the color space to which the image belongs may be determined by the arrangement of the color pixels adopted by the photosensitive element in the industrial camera. For example, an RGGB arrangement yields an RGB image, and an RYYB arrangement yields an RYB image.
子步骤(2):根据所述至少两帧图像的各个颜色通道的值,分别进行图像差分处理。Sub-step (2): Perform image difference processing respectively according to the values of each color channel of the at least two frames of images.
In this step, the corresponding gray value may first be determined from the value of each color channel in the image, and then difference processing may be performed on each color channel, in the time order of the images, according to the gray values of the images on that channel, so as to obtain a difference result for each color channel. Finally, these difference results can be aggregated, for example added together, to obtain a final difference result, and the final difference result is taken as the processing result. Alternatively, the difference result on each color channel may be taken directly as the processing result, which is not limited in this embodiment of the present invention. Since a brightness change event causes the gray values to change, performing the difference processing on gray values in this embodiment of the present invention allows the final difference result to reflect the brightness change event.
Specifically, when performing the image difference processing, the gray values of the earlier image may be subtracted from the gray values of the later image according to the image timing, that is, the difference between the two is computed. For example, assuming that the at least two frames of images are the first frame and the third frame in the data stream collected by the industrial camera, the difference may be obtained by subtracting the first frame from the third frame. Alternatively, assuming that the at least two frames of images are the first, second, third, and fourth frames in the data stream collected by the industrial camera, the difference may be obtained by first adding the third frame and the fourth frame and then subtracting the first frame and the second frame, that is, the difference result represents the brightness change of the pixels of the last two frames relative to the first two frames. Of course, other difference algorithms may also be used to implement the difference processing, which is not limited in this embodiment of the present invention.
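As a hedged illustration of the difference processing described above, the sketch below computes a per-channel frame difference with NumPy, assuming the frames are already available as arrays of per-channel gray values. The function name `frame_difference` and the even split into "earlier" and "later" frames are choices of this sketch, not of the original text.

```python
import numpy as np


def frame_difference(frames: np.ndarray) -> np.ndarray:
    """Per-channel frame differencing sketch.

    frames: array of shape (N, H, W, C), ordered by capture time, where each
    channel plane holds the gray values of that color channel; a grayscale
    stack of shape (N, H, W) is also accepted.
    Returns a signed difference image of shape (H, W): the later half of the
    sequence minus the earlier half, aggregated over the channels.
    """
    frames = frames.astype(np.int32)              # avoid uint8 wrap-around
    if frames.ndim == 3:                          # grayscale input: add a channel axis
        frames = frames[..., np.newaxis]
    n = frames.shape[0]
    later = frames[n // 2:].sum(axis=0)           # e.g. frames 3 + 4
    earlier = frames[:n // 2].sum(axis=0)         # e.g. frames 1 + 2
    per_channel_diff = later - earlier            # one signed difference per channel
    return per_channel_diff.sum(axis=-1)          # aggregate the channel-wise results

# Two-frame case: diff = frame_difference(np.stack([img1, img2]))
```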
Further, compared with not treating the color channels separately, that is, determining an overall gray value directly from multiple color channels and performing the difference processing on it, this embodiment of the present invention extracts the values of the color channels separately, performs the difference processing on the gray values of each color channel individually, and only then aggregates the results. This avoids the problem that merging multiple color channels into a single channel for processing couples the channels to one another and increases the difference error, and therefore improves the accuracy of the difference processing.
It should be noted that, limited by the spectral response of the sensor in the camera and the characteristics of the filter array, the image collected by the industrial camera may also be a grayscale image. Accordingly, in this case the difference processing can be performed directly on the gray values of the pixels in the image, so that the values of the individual color channels do not need to be extracted separately, which simplifies the difference processing to a certain extent and improves its efficiency.
可选的,差分结果可以为差分图像,差分图像中各个像素点的值可以为灰度值之差。进一步地,由于图像传感器的电路中往往会存在噪声干扰。例如,可能会存在随时间涨落的散粒噪声、热噪声和固定模式噪声等噪声。而噪声可能会导致灰度值不准确,存在误差。为了提高差分结果的准确性,本发明实施例中,还可以进行滤噪处理。在一种实现方式中,可以在根据至少两帧图像进行图像处理,得到处理结果之后执行下述步骤A:Optionally, the difference result may be a difference image, and the value of each pixel in the difference image may be the difference between grayscale values. Further, there is often noise interference in the circuit of the image sensor. For example, there may be noise such as shot noise, thermal noise, and fixed pattern noise that fluctuate over time. And noise may lead to inaccurate gray value, there are errors. In order to improve the accuracy of the difference result, in this embodiment of the present invention, noise filtering processing may also be performed. In an implementation manner, the following step A may be performed after image processing is performed according to at least two frames of images and a processing result is obtained:
步骤A、对所述差分图像进行滤噪处理。Step A: Perform noise filtering processing on the differential image.
具体的,可以在差分图像的频域或小波域中引入各类滤波器,通过滤波器进行滤噪处理。本发明实施例中,通过在完成图像处理之后,对处理结果,即,差分图像进行处理,可以将处理结果中包含的噪声干扰去除,进而确保差分结果的准确性。Specifically, various types of filters can be introduced in the frequency domain or wavelet domain of the differential image, and noise filtering is performed through the filters. In the embodiment of the present invention, after the image processing is completed, the processing result, that is, the difference image, can be processed, so that the noise interference included in the processing result can be removed, thereby ensuring the accuracy of the difference result.
在另一种实现方式中,可以在根据至少两帧图像进行图像处理,得到处理结果之前,通过下述步骤B~步骤D实现滤噪处理:In another implementation manner, before image processing is performed according to at least two frames of images and a processing result is obtained, noise filtering processing may be implemented through the following steps B to D:
步骤B、从所述至少两帧图像中获取m帧图像;所述m为正整数。Step B: Obtain m frames of images from the at least two frames of images; the m is a positive integer.
In this embodiment of the present invention, the specific value of m may be determined according to the actual situation. Optionally, the m frames of images may be the images among the at least two frames that participate in the image difference processing, that is, one difference operation may be performed each time m frames of images are acquired. Accordingly, in this embodiment of the present invention, noise filtering may be applied to these m frames before each difference operation, that is, the noise filtering and the difference processing are performed alternately. This reduces the noise interference in each difference operation while maintaining the event output, thereby ensuring the effect of the difference processing.
步骤C、对所述m帧图像进行加权平均处理,以汇集所述m帧图像中的噪声信息。Step C: Perform weighted average processing on the m frames of images to collect noise information in the m frames of images.
实际场景中,受到噪声干扰,每个图像中都会存在噪声信息。本步骤中,通过对m帧图像进行加权平均处理,汇集m帧图像中的噪声信息,可以方便后续步骤集中对噪声信息进行处理。具体的,进行加权平均处理时,可以是按照预设权重及权重分配规则,确定每帧图像对应的预设权重。例如,可以按照图像质量越低,权重越大的方式进行分配。接着,可以根据每帧图像对应的预设权重及每个像素点的像素值,计算m帧图像中每个像素点对应的加权和。In the actual scene, due to noise interference, there will be noise information in each image. In this step, by performing weighted average processing on the m frames of images, the noise information in the m frames of images is collected, which can facilitate the subsequent steps to centrally process the noise information. Specifically, when performing the weighted average processing, the preset weight corresponding to each frame of image may be determined according to the preset weight and the weight distribution rule. For example, it can be assigned in such a way that the lower the image quality, the higher the weight. Next, a weighted sum corresponding to each pixel in the m frames of images may be calculated according to the preset weight corresponding to each frame of the image and the pixel value of each pixel.
步骤D、对所述噪声信息进行滤噪处理。Step D, performing noise filtering processing on the noise information.
本发明实施例中,由于上述步骤中计算得到的所有像素点对应的加权和汇集了所有的噪声信息,因此,本发明实施例中,可以对这些像素点对应的加权和组成的图像进行滤噪处理,进 而实现对噪声信息进行滤噪处理。具体的滤噪处理的实现方式参照步骤A中的实现方式,或者,也可以利用m帧图像的时域特性,进行空域滤波,本发明实施例对此不作限定。在一种实现方式中,可以对每帧图像分别进行滤噪处理。但这样需要执行m次滤噪处理才能实现过滤掉噪声干扰。本发明实施例中通过先汇集多帧图像中的噪声信息,然后对噪声信息整体进行滤噪处理。这样,仅需执行一次滤噪处理即可,进而一定程度上可以提高滤噪效率。In the embodiment of the present invention, since the weighted sum corresponding to all the pixels calculated in the above steps collects all the noise information, therefore, in the embodiment of the present invention, the image composed of the weighted sum corresponding to these pixels can be filtered for noise processing, and then realize the noise filtering processing on the noise information. For a specific implementation manner of noise filtering processing, refer to the implementation manner in step A, or, spatial domain filtering may also be performed by using the temporal domain characteristics of m frames of images, which is not limited in this embodiment of the present invention. In an implementation manner, noise filtering may be performed on each frame of image separately. However, it needs to perform m times of noise filtering processing to achieve filtering out noise interference. In the embodiment of the present invention, the noise information in the multi-frame images is collected first, and then the noise information is filtered as a whole. In this way, only one noise filtering process needs to be performed, thereby improving the noise filtering efficiency to a certain extent.
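A minimal sketch of steps B to D, assuming NumPy arrays of gray values and a simple 3x3 mean filter as a stand-in for whichever spatial, frequency-domain, or wavelet filter is actually used; the per-frame weights default to a uniform average and would in practice follow the preset weight assignment rule mentioned above. The function name `denoise_by_weighted_average` is illustrative.

```python
import numpy as np


def denoise_by_weighted_average(frames: np.ndarray, weights=None) -> np.ndarray:
    """Collect the noise of m frames by a weighted average, then filter once.

    frames: (m, H, W) gray images about to enter the difference step.
    weights: optional per-frame weights (e.g. larger for lower-quality frames);
    defaults to a uniform average.
    Returns the aggregated image after a single noise-filtering pass.
    """
    m = frames.shape[0]
    if weights is None:
        weights = np.full(m, 1.0 / m)
    weights = np.asarray(weights, dtype=np.float64)
    weights = weights / weights.sum()
    aggregate = np.tensordot(weights, frames.astype(np.float64), axes=1)   # (H, W)

    # One filtering pass over the aggregate: a 3x3 box (mean) filter as a
    # placeholder for the filter actually chosen in practice.
    h, w = aggregate.shape
    padded = np.pad(aggregate, 1, mode="edge")
    filtered = sum(
        padded[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)
    ) / 9.0
    return filtered
```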
可选的,处理结果具体可以包括差分图像及差分图像中各个像素点对应的差分值。相应地,在一种实现方式中,根据处理结果进行渲染,以获取事件图像的操作可以通过下述子步骤(3)~子步骤(4)实现:Optionally, the processing result may specifically include a difference image and a difference value corresponding to each pixel in the difference image. Correspondingly, in an implementation manner, the operation of rendering according to the processing result to obtain the event image can be implemented through the following sub-steps (3) to (4):
子步骤(3):根据所述差分图像中所述像素点对应的差分值的符号,确定所述像素点对应的渲染颜色,以及根据所述差分值的绝对值,确定所述渲染颜色的颜色浓度。Sub-step (3): according to the sign of the difference value corresponding to the pixel point in the difference image, determine the rendering color corresponding to the pixel point, and determine the color of the rendering color according to the absolute value of the difference value concentration.
In this embodiment of the present invention, when the difference value is positive its sign is a plus sign, and when the difference value is negative its sign is a minus sign. When determining the rendering color from the sign of the difference value, the rendering color corresponding to that sign can be looked up in a preset correspondence between signs and rendering colors, and the rendering color found for the sign is taken as the rendering color of the pixel. The preset correspondence between signs and rendering colors may be set according to actual requirements; for example, the rendering color corresponding to the plus sign may be set to red and the rendering color corresponding to the minus sign may be set to green. Accordingly, if the sign of the difference value corresponding to a pixel is a plus sign, the rendering color of that pixel is determined to be red, and if the sign is a minus sign, the rendering color of that pixel is determined to be green.
Further, the absolute value of the difference value may be positively correlated with the color density: when determining the color density from the absolute value, a larger absolute value is given a higher color density and a smaller absolute value is given a lower color density. For example, the absolute value of the difference value may be used as the input of a first preset generating function, and the output of that function may be used as the color density, where the first preset function is a function whose output increases with its input. Alternatively, the absolute value of the difference value may be negatively correlated with the color density, in which case a smaller absolute value is given a higher color density and a larger absolute value is given a lower color density, which is not limited in this embodiment of the present invention. Of course, the difference value may be 0; in that case the pixel may simply not be rendered, which saves processing resources to a certain extent.
子步骤(4):根据所述渲染颜色及所述颜色浓度对所述像素点进行渲染。Sub-step (4): Render the pixel points according to the rendering color and the color density.
In this embodiment of the present invention, the pixel may be rendered according to the rendering color and the color density, so that the rendered pixel presents the rendering color at a density matching that color density. The rendered difference image is the event image, and the event image may be a matrix image. Since the difference value of a pixel is usually caused by a brightness change event, this embodiment renders each pixel with the color corresponding to the sign of its difference value and the density corresponding to its absolute value, so that the boundaries between colors in the resulting event image can indicate the direction in which the event changes. For example, the direction in which the color boundary is clearest in the event image may be taken as the direction of the event change. The variation of density across the boundary indicates the starting and ending orientation of that direction; for example, when the absolute value of the difference is positively correlated with the color density, along the direction with the clearest color boundary the end with the lower density can be taken as the starting orientation and the end with the higher density as the ending orientation. In this way, the brightness change event can be presented clearly through the event image.
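The rendering of sub-steps (3) and (4) could look roughly like the following sketch. It assumes the red/green sign mapping used in the example above and a color density that grows linearly with the absolute difference; both choices, and the name `render_event_image`, are illustrative.

```python
import numpy as np


def render_event_image(diff: np.ndarray) -> np.ndarray:
    """Render a signed difference image into an RGB event image.

    diff: (H, W) signed difference values. Positive differences are drawn in
    red, negative ones in green (an assumed sign-to-color mapping), and the
    color intensity grows with |diff|. Returns a uint8 (H, W, 3) image.
    """
    h, w = diff.shape
    event = np.zeros((h, w, 3), dtype=np.uint8)

    magnitude = np.abs(diff).astype(np.float64)
    if magnitude.max() > 0:
        intensity = (255.0 * magnitude / magnitude.max()).astype(np.uint8)
    else:
        intensity = magnitude.astype(np.uint8)

    event[..., 0] = np.where(diff > 0, intensity, 0)   # red: brightness increase
    event[..., 1] = np.where(diff < 0, intensity, 0)   # green: brightness decrease
    # Pixels with diff == 0 stay black, i.e. they are not rendered.
    return event
```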
在另一种实现方式中,根据处理结果进行渲染,以获取事件图像的操作可以通过下述子步骤(5)~子步骤(6)实现:In another implementation manner, the operation of rendering according to the processing result to obtain the event image may be implemented through the following sub-steps (5) to (6):
子步骤(5):根据预设的数值与渲染颜色之间的对应关系及所述差分图像中所述像素点对应的差分值,确定所述像素点对应的渲染颜色。Sub-step (5): Determine the rendering color corresponding to the pixel point according to the corresponding relationship between the preset value and the rendering color and the difference value corresponding to the pixel point in the difference image.
本发明实施例中,预设的数值与渲染颜色之间的对应关系可以是根据实际需求设定的。示例的,可以设定一个数值对应一种渲染颜色,当然,为了避免颜色过多导致视觉负担过重,也可以为处于同一数值范围内的数值,设定相同的渲染颜色,本发明实施例对此不作限定。In the embodiment of the present invention, the corresponding relationship between the preset numerical value and the rendering color may be set according to actual requirements. As an example, one value can be set to correspond to one rendering color. Of course, in order to avoid excessive visual burden caused by too many colors, the same rendering color can also be set for values within the same value range. This is not limited.
具体的,可以在该对应关系中查找与该差分值对应的渲染颜色,将该对应的渲染颜色确定为该像素点对应的渲染颜色。Specifically, the rendering color corresponding to the difference value may be searched in the corresponding relationship, and the corresponding rendering color may be determined as the rendering color corresponding to the pixel point.
子步骤(6):根据所述渲染颜色对所述像素点进行渲染。Sub-step (6): Render the pixel points according to the rendering color.
本发明实施例中,可以按照默认的颜色浓度,将像素点渲染为对应的渲染颜色,这样,可以确保像素点的颜色浓度的统一性,进而方便用户观看。由于像素点的差分值往往是由发生的亮度变化事件导致的,本发明实施例中,将像素点渲染为与差分值对应的渲染颜色,通过渲染后得到的差分图像中颜色的分布变化可以体现事件的变化方向。示例的,可以将颜色分布变化规律性最高的方向作为事件的变化方向。将规律性最高方向上,规律程度较低的一端作为起始方位,规律程度较高的一端作为终止方位。这样,通过事件图像即可较为清晰明了的展示亮度变化事件。In the embodiment of the present invention, the pixels can be rendered into corresponding rendering colors according to the default color density, so that the uniformity of the color density of the pixels can be ensured, thereby facilitating viewing by the user. Since the difference value of the pixel point is often caused by the occurrence of the brightness change event, in the embodiment of the present invention, the pixel point is rendered as the rendering color corresponding to the difference value, and the color distribution change in the difference image obtained after rendering can be reflected. The direction of the event's change. For example, the direction with the highest regularity of color distribution change may be used as the change direction of the event. In the direction with the highest regularity, the end with a lower degree of regularity is taken as the starting position, and the end with a higher degree of regularity is regarded as the end position. In this way, the brightness change event can be displayed more clearly and clearly through the event image.
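For the lookup-based variant of sub-steps (5) and (6), the small sketch below assumes the preset value-to-color correspondence is given as ascending bin edges plus a palette of one color per value range; both are hypothetical configuration, e.g. bins = [-50, 0, 50] with four colors.

```python
import numpy as np


def render_by_lookup(diff: np.ndarray, bins, palette) -> np.ndarray:
    """Render a difference image with a preset value-to-color lookup table.

    diff: (H, W) signed difference values.
    bins: ascending bin edges splitting the difference values into ranges.
    palette: one RGB color per range (len(palette) == len(bins) + 1).
    Every pixel is rendered at the same (default) density, as described above.
    """
    palette = np.asarray(palette, dtype=np.uint8)   # (K, 3)
    idx = np.digitize(diff, bins)                   # (H, W) range index per pixel
    return palette[idx]                             # (H, W, 3) rendered image
```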
需要说明的是,在生成事件图像之后,可以通过预设的数据接口输出事件图像。其中,输出的事件图像的图像帧率可以不高于工业相机采集的图像帧数据流的帧率。It should be noted that, after the event image is generated, the event image can be output through a preset data interface. The image frame rate of the output event image may not be higher than the frame rate of the image frame data stream collected by the industrial camera.
可选的,本发明实施例中的亮度变化事件可以为物体运动,相应地,根据处理结果进行渲染,以获取事件图像之后,还可以执行以下步骤E~步骤G,以实现根据事件检测结果进行避障:Optionally, the brightness change event in this embodiment of the present invention may be an object motion. Accordingly, after rendering is performed according to the processing result to obtain the event image, the following steps E to G may be performed, so as to realize the execution according to the event detection result. Avoidance:
步骤E、根据所述至少两帧图像获取高动态范围HDR图像,以及,根据所述事件图像确定所述物体的运动方向。Step E: Acquire a high dynamic range HDR image according to the at least two frames of images, and determine the moving direction of the object according to the event image.
本发明实施例中,根据事件图像确定物体的运动方向时,可以根据事件图像读取事件变化方向,将该事件变化方向确定为物体的运动方向。具体的,确定事件变化方向的实现方式可以参照上述子步骤(4)或子步骤(6)中的描述,本发明实施例在此不作赘述。In the embodiment of the present invention, when the movement direction of the object is determined according to the event image, the event change direction may be read according to the event image, and the event change direction is determined as the movement direction of the object. Specifically, for an implementation manner of determining the direction of an event change, reference may be made to the description in the sub-step (4) or sub-step (6) above, which is not repeated in this embodiment of the present invention.
Further, when acquiring a high dynamic range (High-Dynamic Range, HDR) image from the at least two frames of images, the following operations may be performed: according to the gray values of the pixels in the images, calculate, for each pixel, the sum of its gray values over the at least two frames of images; and normalize each of these gray-value sums to obtain the HDR image.
Specifically, the values of the individual color channels of the image may be read in channel by channel to perform the calculation of the gray-value sums. In this way, the problem that merging multiple color channels into a single channel for processing couples the channels to one another and increases the calculation error can be avoided, thereby improving the accuracy of the calculation. For the specific way of reading the channels in separately, reference may be made to the related description in sub-steps (1) to (2) above; of course, this step may also directly reuse the per-channel values that were read in when sub-steps (1) to (2) were performed, which is not limited in this embodiment of the present invention. Naturally, when the image collected by the industrial camera is itself a grayscale image, the gray-value sums can be computed directly from the gray values of the pixels in the image, so as to improve the calculation efficiency.
在计算灰度值之和时,可以是按照图像的时序将这至少两帧图像中前后帧图像进行积分运算,即,计算前后帧图像中像素点的灰度值之和。具体的,可以是计算每个颜色通道上的灰度值之和,最后将颜色通道上的灰度值之和的总和,作为像素点的灰度值之和。When calculating the sum of grayscale values, the integration operation may be performed on the preceding and following frame images in the at least two frame images according to the time sequence of the images, that is, the sum of the grayscale values of the pixels in the preceding and following frame images is calculated. Specifically, the sum of the gray values on each color channel can be calculated, and finally the sum of the gray values on the color channels is used as the sum of the gray values of the pixels.
Since the sum of gray values after adding multiple frames of images may exceed the bit depth and display range adopted by common display devices, in this embodiment of the present invention the gray-value sums can be normalized so that the HDR image can be displayed normally by common display devices, thereby extending the applicable range of the HDR image. For example, the bit depth used by common display devices may be 8 bits, and accordingly, the gray-value sums may be normalized to the 8-bit range of 0-255. Further, the bit depth of the image may be increased according to the number of frames participating in the calculation, so as to ensure that the computed gray-value sums have enough bit depth. For example, when four frames participate in the calculation, the bit depth can be increased to 10 bits: an 8-bit image with gray values in the range 0-255 will, after the integration of four frames, yield a 10-bit image in the range 0-1023, and increasing the bit depth to 10 bits ensures that it meets the requirements of use.
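A simple sketch of the frame integration and normalization described in step E, assuming NumPy arrays of gray values and an 8-bit display target; the widened accumulator plays the role of the increased bit depth discussed above, and the name `accumulate_hdr` is illustrative.

```python
import numpy as np


def accumulate_hdr(frames: np.ndarray, out_max: int = 255) -> np.ndarray:
    """Build a simple HDR image by summing frames and normalizing the result.

    frames: (N, H, W) or (N, H, W, C) gray values of consecutive frames.
    The per-pixel sums may exceed the 8-bit range (e.g. four 8-bit frames need
    roughly a 10-bit range), so they are normalized back to [0, out_max] so
    that a common 8-bit display can show the result.
    """
    acc = frames.astype(np.uint32).sum(axis=0)    # frame-wise integration
    acc = acc.astype(np.float64)
    if acc.max() > 0:
        acc = acc * (out_max / acc.max())         # normalization step
    return acc.astype(np.uint8)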
实际应用中图像的噪声是随机涨落的,而图像中弱信号,例如,较暗的部分,是相对稳定的。因此,本发明实施例中经过多帧图像的加和运算,噪声强度的增加将低于弱信号的增加,从而可以将图像中较暗部分从噪声中增强出来,提高暗区域内的信噪比,进而提供更多的成像细节。进一步地,在成像场景中同时存在高亮度和低亮度区域时,由于噪声的随机涨落,因此,一定程度上可以在保持高亮度区域不产生过曝的前提下,实现低亮度区域的增强感知,进而实现HDR成像,即,得到HDR图像。在采用的工业相机为高速工业相机的情况下,由于高速工业相机的曝光时间较为短,单位时间内采集的光子数目较少,因此,加和计算的方式一定程度上还可以提高图像信噪比。In practical applications, the noise of an image fluctuates randomly, while weak signals in an image, such as darker parts, are relatively stable. Therefore, in the embodiment of the present invention, after the sum operation of multiple frames of images, the increase of the noise intensity will be lower than the increase of the weak signal, so that the darker part of the image can be enhanced from the noise, and the signal-to-noise ratio in the dark area can be improved. , which in turn provides more imaging details. Further, when there are high-brightness and low-brightness areas in the imaging scene at the same time, due to the random fluctuation of noise, it is possible to achieve enhanced perception of low-brightness areas on the premise of keeping high-brightness areas without overexposure to a certain extent. , and then realize HDR imaging, that is, obtain an HDR image. When the industrial camera used is a high-speed industrial camera, since the exposure time of the high-speed industrial camera is relatively short and the number of photons collected per unit time is small, the summation calculation method can also improve the image signal-to-noise ratio to a certain extent. .
需要说明的是,生成的HDR图像之后可以通过预设数据接口输出HDR图像,以供后续步骤使用。其中,输出的图像可以为彩色图像,这样可以有助于判断物体的材质、纹理信息,进而提高后续的图像识别效率。进一步地,HDR图像中可能还存在一定的噪声干扰,为了提高HDR图像的图像质量,还可以对HDR图像进行滤噪处理。具体的滤噪方式可以参照上述相关步骤中执行的滤噪处理,例如,可以采用空域滤波、频域滤波、小波域滤波的方式。It should be noted that, after the generated HDR image, the HDR image can be output through the preset data interface for use in subsequent steps. The output image may be a color image, which can help determine the material and texture information of the object, thereby improving the efficiency of subsequent image recognition. Further, there may be some noise interference in the HDR image. In order to improve the image quality of the HDR image, the HDR image may also be subjected to noise filtering processing. For specific noise filtering methods, refer to the noise filtering processing performed in the above-mentioned related steps, for example, spatial domain filtering, frequency domain filtering, and wavelet domain filtering can be adopted.
步骤F、根据所述HDR图像识别所述物体在外界环境中的位置信息。Step F: Identify the position information of the object in the external environment according to the HDR image.
本发明实施例中,物体在外界环境中的位置信息可以是该物体在外界环境中的坐标,也可以是物体与外界环境中的其他物体之间的相对位置关系、相对距离,等等。具体的,可以利用预设的图像识别算法对HDR图像进行识别,例如,进行目标物体识别、图像语义分割等操作, 以确定位置信息。需要说明的是,为了提高识别的准确性,本发明实施例还可以将识别的物体的特征与事件图像中事件的特征进行比对,如果两者相匹配,则确认该物体是发生运动的物体,之后再进一步确定该物体的位置信息。In this embodiment of the present invention, the position information of an object in the external environment may be the coordinates of the object in the external environment, or may be the relative positional relationship, relative distance, etc. between the object and other objects in the external environment. Specifically, a preset image recognition algorithm may be used to recognize the HDR image, for example, operations such as target object recognition, image semantic segmentation, etc. may be performed to determine the location information. It should be noted that, in order to improve the accuracy of recognition, the embodiment of the present invention may also compare the features of the recognized object with the features of the event in the event image, and if the two match, it is confirmed that the object is a moving object , and then further determine the position information of the object.
步骤G、根据所述运动方向及所述位置信息,对所述物体进行避障。Step G: Perform obstacle avoidance on the object according to the movement direction and the position information.
本发明实施例中,可以根据运动方向及位置信息,预估物体之后会发生的移动变化,根据该移动变化确定避障策略。示例的,假设运动方向为从左向右水平移动,物体目前在中间位置,之后会移动到取景范围的最右边,那么可以控制设备远离右边的位置,以躲避该物体。In the embodiment of the present invention, the movement change that will occur after the object can be estimated according to the movement direction and position information, and the obstacle avoidance strategy is determined according to the movement change. For example, assuming that the movement direction is horizontal movement from left to right, the object is currently in the middle position, and will then move to the far right of the viewing range, then the device can be controlled to move away from the right position to avoid the object.
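Steps E to G could be tied together by a decision rule such as the toy sketch below; the one-step position prediction, the image-plane convention for the direction, and the returned commands are all illustrative assumptions rather than anything specified in the original text.

```python
def plan_avoidance(direction_deg: float, rel_x: float) -> str:
    """Toy avoidance decision from an estimated motion direction and position.

    direction_deg: image-plane motion direction of the object (0 = moving right).
    rel_x: horizontal position of the object relative to the image center
           (negative = left, positive = right), e.g. from HDR-image recognition.
    Returns an illustrative steering command.
    """
    moving_right = -90.0 < direction_deg < 90.0
    predicted_x = rel_x + (1.0 if moving_right else -1.0)   # crude one-step prediction
    if predicted_x > 0:
        return "steer_left"    # object expected on the right: move away from it
    return "steer_right"       # object expected on the left: move away from it
```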
In scenes where the light intensity changes drastically, for example when entering or exiting a tunnel or switching between indoor and outdoor scenes, if obstacle avoidance for a moving object relies only on image analysis of the images collected by the industrial camera, the insufficient dynamic range of those images may cause the image analysis to fail, resulting in poor obstacle avoidance. In this embodiment of the present invention, an HDR image is further generated from the images collected by the industrial camera, and the moving direction of the object is determined in combination with the event image for obstacle avoidance. Since the HDR image has a high dynamic range, it can provide sufficient information for image recognition and analysis, ensuring that the position information of the object can be identified and therefore ensuring the obstacle avoidance effect.
可选的,当场景中存在运动物体时,物体在前后帧图像上的位置可能发生了变化,这样,经过运算得到的HDR图像中一定程度上可能会形成运动模糊。因此,本发明实施例中还可以在根据至少两帧图像获取HDR图像之后,执行下述步骤:Optionally, when there is a moving object in the scene, the position of the object on the front and rear frame images may change, so that motion blur may be formed in the HDR image obtained by the operation to a certain extent. Therefore, in this embodiment of the present invention, after acquiring the HDR image according to at least two frames of images, the following steps may be performed:
步骤H、根据所述事件图像中所述物体的运动信息,确定运动模糊核。Step H: Determine a motion blur kernel according to the motion information of the object in the event image.
本发明实施例中,运动信息可以是事件变化方向上的像素点的相关信息。由于模糊是由于物体运动导致的,而图像发生模糊又可以称为图像退化。因此,这些像素点可认为是引起退化的点。运动模糊核可以是用于描述这些像素点的运动轨迹的卷积核。In this embodiment of the present invention, the motion information may be related information of pixels in the direction of event change. Since the blurring is caused by the motion of the object, the blurring of the image can also be called image degradation. Therefore, these pixel points can be considered as points causing degradation. The motion blur kernel can be a convolution kernel used to describe the motion trajectory of these pixels.
可选的,在确定运动模糊核时,可以先根据所述事件图像中所述物体的运动信息,估计运动模糊模型。示例的,由于HDR图像中存在运动模糊,因此可以将HDR图像作为退化图像,将生成该HDR图像时所使用的图像作为原始清晰图像,将运动信息作为预设的退化的点扩展函数的输入。根据退化图像、原始清晰图像和退化的点扩展函数构建图像退化模型,将最终构建的图像退化模型作为运动模糊模型。Optionally, when the motion blur kernel is determined, a motion blur model may be estimated first according to the motion information of the object in the event image. For example, since there is motion blur in the HDR image, the HDR image may be used as the degraded image, the image used for generating the HDR image may be used as the original clear image, and the motion information may be used as the input of the preset degraded point spread function. The image degradation model is constructed according to the degraded image, the original clear image and the degraded point spread function, and the final image degradation model is used as the motion blur model.
Next, the motion blur kernel may be estimated according to the motion blur model. The motion blur kernel may be a space-invariant blur kernel or a space-variant blur kernel. Specifically, the space-invariant blur kernel assumes that the degree of blur is the same over the whole HDR image, and using it amounts to performing a global deblurring operation on the image. The space-invariant blur kernel is a global computation: when the motion blur kernel is a space-invariant blur kernel, it can be determined from the motion blur model by a global estimation algorithm, for example by taking an overall average of the degree of motion reflected in the event image, or by principal component analysis or other global estimation algorithms. Since the global estimation is relatively simple to compute, using a space-invariant blur kernel can improve the computational efficiency to a certain extent. However, because it amounts to deblurring the image as a whole, it may introduce defects such as ringing artifacts, leading to a poorer deblurring effect.
When the motion blur kernel is a space-variant blur kernel, it can be determined from the motion blur model by a local estimation algorithm. The space-variant blur kernel assumes that the motion of objects in three-dimensional space blurs different parts of the image to different degrees, and it can reflect how this degree of blur differs between local regions; therefore, it usually yields a better deblurring effect. However, the computational complexity of local estimation is higher. Whether to use a space-invariant or a space-variant blur kernel can therefore be chosen according to actual requirements.
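As one concrete (and assumed) way to realize a space-invariant kernel, the sketch below builds a normalized linear motion kernel from a globally estimated blur length and direction, which could come, for example, from averaging the motion reflected in the event image; the name `linear_motion_kernel` and the line-shaped model are choices of this sketch.

```python
import numpy as np


def linear_motion_kernel(length: int, angle_deg: float) -> np.ndarray:
    """Build a simple space-invariant motion blur kernel.

    The kernel is a normalized line of `length` pixels oriented at `angle_deg`
    (in image coordinates), modeling a globally uniform linear motion.
    Both parameters are assumed to come from the global motion estimate.
    """
    size = length if length % 2 == 1 else length + 1   # keep an odd-sized kernel
    kernel = np.zeros((size, size), dtype=np.float64)
    c = size // 2
    theta = np.deg2rad(angle_deg)
    for t in np.linspace(-(length - 1) / 2.0, (length - 1) / 2.0, length):
        y = int(round(c + t * np.sin(theta)))
        x = int(round(c + t * np.cos(theta)))
        kernel[y, x] = 1.0
    return kernel / kernel.sum()
```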
步骤I、根据所述运动模糊核对所述HDR图像进行去模糊处理。 Step 1. Perform deblurring on the HDR image according to the motion blur check.
本发明实施例中,可以利用运动模糊核对HDR图像进行反卷积运算,进而实现去模糊处理。In this embodiment of the present invention, a motion blur kernel may be used to perform a deconvolution operation on the HDR image, thereby implementing deblurring processing.
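The deconvolution of step I could be sketched as follows. A Wiener-style frequency-domain filter is used here purely as one common choice (the original text only states that a deconvolution is performed), the `snr` regularization parameter is an assumption of this sketch, and a color HDR image would need the function applied per channel.

```python
import numpy as np


def wiener_deblur(hdr: np.ndarray, kernel: np.ndarray, snr: float = 100.0) -> np.ndarray:
    """Frequency-domain deconvolution of an HDR image with a motion blur kernel.

    hdr: (H, W) single-channel image; kernel: small 2-D blur kernel summing to 1;
    snr: assumed signal-to-noise ratio used for regularization.
    """
    h, w = hdr.shape
    # Embed the kernel in an H x W frame and center it at the origin.
    pad = np.zeros((h, w), dtype=np.float64)
    kh, kw = kernel.shape
    pad[:kh, :kw] = kernel
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))

    H_f = np.fft.fft2(pad)
    G_f = np.fft.fft2(hdr.astype(np.float64))
    wiener = np.conj(H_f) / (np.abs(H_f) ** 2 + 1.0 / snr)   # regularized inverse filter
    restored = np.real(np.fft.ifft2(wiener * G_f))
    return np.clip(restored, 0, 255).astype(np.uint8)

# Usage sketch: sharp = wiener_deblur(hdr_image, linear_motion_kernel(9, 30.0))
```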
Of course, deblurring may also be implemented in other ways. For example, the image convolution kernel of the motion blur can be estimated directly from image priors, for example data characteristics such as the statistical correlation between preceding and following frames, together with a preset convolution kernel function, and finally this image convolution kernel is used to perform a deconvolution operation on the HDR image.
进一步地,本发明实施例中还可以在获取到工业相机采集的至少两帧图像之后,输出所述至少两帧图像对应的灰度图或颜色图。示例的,可以在采集的图像为灰度图时输出灰度图,在采集的图像为颜色图时输出颜色图。其中,颜色图指的是属于色彩空间的图,例如RGB图。在输出图像时可以是通过预设的数据接口输出。在不断获取工业相机采集的图像帧数据流中图像的场景下,可以实现不断输出数据流中图像。其中,输出的图像帧率可以与图像帧数据流的帧率相同。Further, in this embodiment of the present invention, after acquiring at least two frames of images collected by an industrial camera, a grayscale image or a color image corresponding to the at least two frames of images may be output. For example, a grayscale image may be output when the collected image is a grayscale image, and a color image may be output when the collected image is a color image. Among them, the color map refers to a map belonging to a color space, such as an RGB map. When outputting an image, it can be output through a preset data interface. In the scenario of continuously acquiring the images in the image frame data stream collected by the industrial camera, the images in the data stream can be continuously output. The output image frame rate may be the same as the frame rate of the image frame data stream.
示例的,以亮度变化事件为运动事件为例,图2是本发明实施例提供的一种输出对比示意图。其中,对于物体运动或其他因素带来的亮度变化事件而言,事件相机可以以较高的时间采样频率来输出事件,但仅能输出事件,无法输出图像。示例的,事件相机的输出可以为第一行所示。本发明实施例中,采用工业相机,即,经典成像相机,可以输出第二行所示的各个时刻采集的图像帧,以及经过帧间运算确定事件图像,通过输出事件图像(图2中仅示出了事件图像中的事件变化方向)实现事件输出。Illustratively, taking the brightness change event as a motion event as an example, FIG. 2 is a schematic diagram of an output comparison provided by an embodiment of the present invention. Among them, for the brightness change event caused by object motion or other factors, the event camera can output the event at a higher time sampling frequency, but can only output the event and cannot output the image. Illustratively, the output of the event camera can be as shown in the first line. In the embodiment of the present invention, using an industrial camera, that is, a classic imaging camera, can output the image frames collected at each moment shown in the second row, and determine the event image through inter-frame operations, and output the event image by outputting the event image (only shown in FIG. 2 ). out the event change direction in the event image) to realize the event output.
可选的,本发明实施例中的工业相机的帧率可以不小于100帧/每秒(frame per second,fps),即,工业相机可以为高速工业相机。这样,通过采用帧率不小于100的工业相机,可以避免由于帧率过低,导致检测的实时性较差,实用性较低的问题。示例的,在实际应用场景中,例如,常见的机器人应用场景中,当机器人以较低速度移动时,对环境的感知帧率与人眼近似,约为20fps。而事件相机的时间采样频率虽然可以达到1千赫(kHz)甚至100kHz,较大程度满足机器人的应用需求,但其仅仅提供事件信息。而本发明实施例中,采用帧率大于100fps的工业相机,利用帧间运算实现满足机器人应用的事件输出的同时,还可以进一步提供图像、HDR图像等信息,进而可以为机器人感知提供更加全面的实时图像数据。Optionally, the frame rate of the industrial camera in the embodiment of the present invention may be not less than 100 frames per second (frame per second, fps), that is, the industrial camera may be a high-speed industrial camera. In this way, by using an industrial camera with a frame rate of not less than 100, the problems of poor real-time detection and low practicability due to too low frame rate can be avoided. For example, in a practical application scenario, for example, in a common robot application scenario, when the robot moves at a low speed, the frame rate of the perception of the environment is similar to that of the human eye, about 20fps. Although the time sampling frequency of the event camera can reach 1 kilohertz (kHz) or even 100 kHz, which meets the application requirements of robots to a large extent, it only provides event information. However, in the embodiment of the present invention, an industrial camera with a frame rate greater than 100 fps is used, and the inter-frame operation is used to realize the event output satisfying the application of the robot, and at the same time, information such as images and HDR images can be further provided, thereby providing more comprehensive information for the robot perception. Live image data.
Further, FIG. 3 is a schematic output diagram provided by an embodiment of the present invention. As shown in FIG. 3, the embodiment of the present invention can directly output a grayscale image or a color image, or output an HDR image and an event image through inter-frame operations (FIG. 3 only shows the event change direction).
FIG. 4 is a schematic diagram of the processing procedure of a specific example provided by an embodiment of the present invention. As shown in FIG. 4, for the image frame data stream output by the industrial camera, the image output type can be selected according to actual requirements. When direct output is selected, the original images, that is, the images in the image frame data stream, can be output directly. When event image output is selected, the operation of reading in the raw data by channel, that is, the operation shown in sub-step (1) above, may be performed first. Then, after the inter-frame difference operation and noise filtering are performed, the operation of visualizing the signs of the difference values, that is, the rendering operation shown in the above steps, is performed. Finally, the event image can be output. When HDR image output is selected, the operation of reading in the raw data by channel may be performed first, followed by the inter-frame integration operation, that is, the operation of acquiring the HDR image shown in step E above. Then, noise filtering and deblurring may be performed, where the deblurring may be performed in combination with the event image. Finally, the HDR image can be output.
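By way of illustration only, the event-image branch of FIG. 4 could be sketched in Python with NumPy roughly as follows; the function name, the fixed noise threshold, and the red/blue color convention are assumptions made here for readability rather than features prescribed by this example.

```python
import numpy as np

def event_image_branch(prev_frame, curr_frame, noise_threshold=8):
    """Sketch of the event-image branch: inter-frame difference, simple
    threshold-based noise filtering, and visualization of the difference sign."""
    prev = prev_frame.astype(np.float32)
    curr = curr_frame.astype(np.float32)
    diff = curr - prev                           # inter-frame difference operation
    if diff.ndim == 3:                           # combine per-channel differences
        diff = diff.mean(axis=2)
    diff[np.abs(diff) < noise_threshold] = 0.0   # crude noise filtering

    # Visualize the sign of the difference value: brighter pixels in red,
    # darker pixels in blue, unchanged pixels left white.
    h, w = diff.shape
    event_image = np.full((h, w, 3), 255, dtype=np.uint8)
    event_image[diff > 0] = (255, 0, 0)
    event_image[diff < 0] = (0, 0, 255)
    return event_image

# Example: two synthetic frames with a bright square moving one pixel to the right.
f0 = np.zeros((64, 64), dtype=np.uint8); f0[20:30, 20:30] = 200
f1 = np.zeros((64, 64), dtype=np.uint8); f1[20:30, 21:31] = 200
ev = event_image_branch(f0, f1)
```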
Further, compared with the approach of using an event camera, since the event camera is only sensitive to changes in light intensity and not to the light intensity distribution, its output information is often sparse and it cannot output images. In the embodiment of the present invention, an HDR image can be generated and output, and the images collected by the industrial camera can also be output, thereby improving the richness of the output data. At the same time, compared with an event camera, an industrial camera has better compatibility with various computing platforms, a smaller pixel size, a larger fill factor and a larger array scale, and can therefore output images with a higher resolution, further improving the richness of the image information. Illustratively, through the design of the signal processing and transmission circuits, an industrial camera can achieve a relatively high acquisition and transmission speed (for example, using a 5 Gbps switching bandwidth to reach approximately 200 million pixels per second), and in a high-speed imaging mode above 200 fps, the video image array (Video Graphics Array, VGA) in the camera can output images at a relatively high image resolution such as 720p.
It should be noted that the image processing in the embodiment of the present invention may be implemented by a high-speed image computing and processing unit mounted in the event detection device, such as a field programmable gate array (FPGA), a central processing unit (CPU), a graphics processing unit (GPU), or a neural-network processing unit (NPU). The specific number of images used when generating the event image may be determined according to the acquisition and data transmission speed of the industrial camera, the computing power of the processor, and the actual output frame rate requirement of the event image, which is not limited in the embodiment of the present invention.
FIG. 5 is a block diagram of an event detection apparatus provided by an embodiment of the present invention. The apparatus 50 may include a memory 501 and a processor 502.
The memory 501 is configured to store program code.
The processor 502 calls the program code, and when the program code is executed, is configured to perform the following operations:
acquiring at least two frames of images collected by an industrial camera;
performing image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of the pixels in the images;
rendering according to the processing result to obtain an event image, where the event image is used to represent a brightness change event; and
outputting the event image.
Optionally, the processor 502 is specifically configured to:
for each frame of the images, obtain the value of each color channel of the image separately; and
perform image difference processing separately according to the values of the respective color channels of the at least two frames of images.
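A rough sketch of this per-channel processing, assuming the frames are already available as H×W×3 RGB arrays (the helper name and the returned dictionary layout are illustrative assumptions):

```python
import numpy as np

def per_channel_difference(prev_rgb, curr_rgb):
    """Obtain the value of each color channel of the two frames separately and
    perform image difference processing channel by channel."""
    diffs = {}
    for idx, name in enumerate(("R", "G", "B")):
        prev_c = prev_rgb[..., idx].astype(np.float32)
        curr_c = curr_rgb[..., idx].astype(np.float32)
        diffs[name] = curr_c - prev_c   # signed difference for this channel
    # The per-channel differences can later be combined, for example by averaging
    # them or by taking the maximum magnitude per pixel.
    return diffs
```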
Optionally, the processing result includes a difference image, and the processor 502 is further configured to perform noise filtering on the difference image.
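One plausible form of this noise filtering is sketched below; the threshold value and the use of a small median filter are assumptions, not the claimed implementation.

```python
import numpy as np
from scipy.ndimage import median_filter

def denoise_difference(diff, threshold=8, kernel_size=3):
    """Suppress sensor noise in the difference image: zero out small differences,
    then remove isolated single-pixel responses with a median filter."""
    cleaned = np.where(np.abs(diff) < threshold, 0.0, diff)
    return median_filter(cleaned, size=kernel_size)
```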
Optionally, the processor 502 is further configured to: acquire m frames of images from the at least two frames of images, where m is a positive integer; perform weighted average processing on the m frames of images to collect the noise information in the m frames of images; and perform noise filtering on the noise information.
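One possible reading of this weighted-average step is sketched below, in which the weighted mean of the m frames serves as a low-noise reference and the per-frame deviations from it are treated as the collected noise information; the weights, the threshold, and the function name are illustrative assumptions.

```python
import numpy as np

def estimate_and_filter_noise(frames, weights=None, threshold=5):
    """Average m frames to obtain a low-noise reference, treat per-frame
    deviations from it as noise, and suppress small deviations."""
    stack = np.stack([f.astype(np.float32) for f in frames])          # (m, H, W)
    if weights is None:
        weights = np.ones(len(frames), dtype=np.float32)
    weights = np.asarray(weights, dtype=np.float32)
    reference = np.tensordot(weights / weights.sum(), stack, axes=1)  # weighted mean
    noise = stack - reference                                         # collected noise information
    filtered = np.where(np.abs(noise) < threshold, 0.0, noise)        # noise filtering
    return reference, filtered
```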
Optionally, the brightness change event is object motion; the processor 502 is further configured to: acquire a high dynamic range (HDR) image according to the at least two frames of images, and determine the movement direction of the object according to the event image; identify the position information of the object in the external environment according to the HDR image; and perform obstacle avoidance with respect to the object according to the movement direction and the position information.
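Purely as an illustration of how the two outputs might be combined, the toy logic below estimates a coarse motion direction from event-pixel centroids and steers perpendicular to it; the centroid-based estimate and the steering rule are assumptions far simpler than a practical avoidance planner, and the object position obtained from the HDR image would decide whether avoidance is triggered at all.

```python
import numpy as np

def event_centroid(event_mask):
    """Centroid (y, x) of active event pixels in a boolean event mask."""
    ys, xs = np.nonzero(event_mask)
    return np.array([ys.mean(), xs.mean()]) if len(ys) else None

def avoidance_direction(centroid_t0, centroid_t1):
    """Steer roughly perpendicular to the observed motion of the object."""
    motion = centroid_t1 - centroid_t0            # coarse motion direction
    steer = np.array([-motion[1], motion[0]])     # 90-degree rotation of the motion
    norm = np.linalg.norm(steer)
    return steer / norm if norm > 1e-6 else np.zeros(2)
```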
Optionally, the processor 502 is further configured to: determine a motion blur kernel according to the motion information of the object in the event image; and deblur the HDR image according to the motion blur kernel.
Optionally, the processor 502 is specifically configured to: estimate a motion blur model according to the motion information of the object in the event image; and estimate the motion blur kernel according to the motion blur model, where the motion blur kernel is a spatially invariant blur kernel or a spatially varying blur kernel.
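A spatially invariant linear motion-blur kernel derived from an estimated blur length and direction might be built as in the sketch below; the straight-line motion assumption and the helper name are illustrative, and a spatially varying kernel would repeat the construction per image region.

```python
import numpy as np

def linear_motion_blur_kernel(length_px, angle_rad, size=None):
    """Build a normalized linear motion-blur kernel of the given length and angle."""
    size = size or (int(np.ceil(length_px)) | 1)      # force an odd kernel size
    kernel = np.zeros((size, size), dtype=np.float32)
    center = size // 2
    steps = max(int(np.ceil(length_px)), 1)
    for t in np.linspace(-length_px / 2, length_px / 2, 2 * steps + 1):
        y = int(round(center + t * np.sin(angle_rad)))
        x = int(round(center + t * np.cos(angle_rad)))
        if 0 <= y < size and 0 <= x < size:
            kernel[y, x] = 1.0                        # mark the blur path
    return kernel / kernel.sum()
```

The HDR image could then be deblurred against such a kernel with a standard deconvolution scheme, for example Richardson-Lucy deconvolution.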
Optionally, the processor 502 is specifically configured to: calculate, according to the gray value of each pixel in the images, the sum of the gray values of each pixel across the at least two frames of images; and normalize each of the sums of gray values to obtain the HDR image.
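A minimal sketch of this inter-frame integration, assuming the frames are grayscale or RGB NumPy arrays of equal size (the function name and the 8-bit output range are assumptions):

```python
import numpy as np

def hdr_by_integration(frames):
    """Sum the gray values of corresponding pixels across the frames and
    normalize the result to an 8-bit range."""
    acc = np.zeros(frames[0].shape[:2], dtype=np.float64)
    for frame in frames:
        gray = frame if frame.ndim == 2 else frame.mean(axis=2)   # to gray values
        acc += gray.astype(np.float64)                             # per-pixel sum
    acc -= acc.min()
    return (255.0 * acc / max(acc.max(), 1e-6)).astype(np.uint8)   # normalization
```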
Optionally, the processing result includes a difference image and a difference value corresponding to each pixel in the difference image; the processor 502 is specifically configured to: determine the rendering color corresponding to a pixel according to the sign of the difference value corresponding to the pixel in the difference image, and determine the color density of the rendering color according to the absolute value of the difference value; and render the pixel according to the rendering color and the color density.
Optionally, the processing result includes a difference image and a difference value corresponding to each pixel in the difference image; the processor 502 is specifically configured to: determine the rendering color corresponding to a pixel according to a preset correspondence between numerical values and rendering colors and the difference value corresponding to the pixel in the difference image; and render the pixel according to the rendering color.
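Both rendering variants could be sketched as follows; the red/blue convention, the gain factor, and the value-to-color table are illustrative assumptions rather than prescribed mappings.

```python
import numpy as np

def render_by_sign_and_magnitude(diff, gain=4.0):
    """Variant 1: the sign of the difference value selects the rendering color,
    and its absolute value controls the color density."""
    h, w = diff.shape
    out = np.full((h, w, 3), 255, dtype=np.uint8)                # white background
    density = np.clip(np.abs(diff) * gain, 0, 255).astype(np.uint8)
    pos, neg = diff > 0, diff < 0
    out[pos, 1] = 255 - density[pos]; out[pos, 2] = 255 - density[pos]   # toward red
    out[neg, 0] = 255 - density[neg]; out[neg, 1] = 255 - density[neg]   # toward blue
    return out

def render_by_lookup(diff, bins=(-64, -16, 16, 64),
                     colors=((0, 0, 128), (0, 0, 255), (255, 255, 255),
                             (255, 0, 0), (128, 0, 0))):
    """Variant 2: a preset correspondence between value ranges and rendering colors."""
    idx = np.digitize(diff, bins)              # bin index 0..len(bins) per pixel
    palette = np.array(colors, dtype=np.uint8)
    return palette[idx]
```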
Optionally, the frame rate of the industrial camera is not less than 100 frames per second.
Optionally, the processor 502 is further configured to output a grayscale image or a color image corresponding to the at least two frames of images.
To sum up, the event detection apparatus provided by the embodiment of the present invention can acquire at least two frames of images collected by an industrial camera, perform image processing according to the at least two frames of images to obtain a processing result, where the processing result is used to characterize the brightness change of the pixels in the images, then perform rendering according to the processing result to obtain an event image, where the event image is used to represent a brightness change event, and finally output the event image. In this way, event detection can be realized directly from the images collected by a conventional industrial camera without a dedicated event camera, so the detection cost can be saved to a certain extent.
Further, an embodiment of the present invention also provides a movable platform, where the movable platform includes an industrial camera and any one of the above event detection apparatuses; the event detection apparatus is configured to execute each step of the above event detection method and can achieve the same technical effect, which is not repeated here to avoid repetition.
Optionally, the movable platform includes a powered propeller and a drive motor for driving the powered propeller.
Further, an embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, each step of the above event detection method is implemented and the same technical effect can be achieved, which is not repeated here to avoid repetition.
FIG. 6 is a schematic diagram of the hardware structure of a device implementing various embodiments of the present invention. The device 600 includes, but is not limited to, a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, a power supply 611 and other components. Those skilled in the art can understand that the device structure shown in FIG. 6 does not constitute a limitation on the device; the device may include more or fewer components than shown, combine some components, or arrange the components differently. In the embodiment of the present invention, the device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted device, a wearable device, a pedometer, and the like. It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 may be used for receiving and sending signals during the transceiving of information or during a call; specifically, downlink data from the base station is received and then handed to the processor 610 for processing, and uplink data is sent to the base station.
Generally, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 601 can also communicate with a network and other devices through a wireless communication system. The device provides the user with wireless broadband Internet access through the network module 602, such as helping the user send and receive e-mails, browse web pages, and access streaming media. The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602, or stored in the memory 609, into an audio signal and output it as sound. Moreover, the audio output unit 603 may also provide audio output related to a specific function performed by the device 600 (for example, a call signal reception sound, a message reception sound, and the like). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like. The input unit 604 is used to receive audio or video signals. The input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042; the graphics processing unit 6041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 606. The image frames processed by the graphics processing unit 6041 may be stored in the memory 609 (or other storage medium) or transmitted via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound and can process such sound into audio data. In a telephone call mode, the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 601 for output. The device 600 also includes at least one sensor 605, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor, wherein the ambient light sensor can adjust the brightness of the display panel 6061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 6061 and/or the backlight when the device 600 is moved to the ear. As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the device posture (such as switching between portrait and landscape, related games, and magnetometer posture calibration) and vibration-recognition related functions (such as a pedometer and tapping); the sensor 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not repeated here. The display unit 606 is used to display information input by the user or information provided to the user. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like. The user input unit 607 may be used to receive input numerical or character information and to generate key signal input related to user settings and function control of the device. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072.
The touch panel 6071, also referred to as a touch screen, can collect the user's touch operations on or near it (such as operations performed by the user on or near the touch panel 6071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 6071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 610, and receives and executes commands sent by the processor 610. In addition, the touch panel 6071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 6071, the user input unit 607 may also include other input devices 6072. Specifically, the other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not repeated here. Further, the touch panel 6071 may cover the display panel 6061; when the touch panel 6071 detects a touch operation on or near it, it transmits the operation to the processor 610 to determine the type of the touch event, and the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although the touch panel 6071 and the display panel 6061 are described as two independent components to realize the input and output functions of the device, in some embodiments the touch panel 6071 and the display panel 6061 may be integrated to realize the input and output functions of the device, which is not limited here. The interface unit 608 is an interface for connecting an external device to the device 600. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (for example, data information, power, and the like) from an external device and transmit the received input to one or more elements within the device 600, or may be used to transmit data between the device 600 and the external device.
The memory 609 may be used to store software programs and various data. The memory 609 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function and an image playback function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 609 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. The processor 610 is the control center of the device; it uses various interfaces and lines to connect the various parts of the entire device, and performs the various functions of the device and processes data by running or executing the software programs and/or modules stored in the memory 609 and calling the data stored in the memory 609, so as to monitor the device as a whole. The processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, the application programs and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 610. The device 600 may also include a power supply 611 (such as a battery) for supplying power to the various components; preferably, the power supply 611 may be logically connected to the processor 610 through a power management system, so as to manage charging, discharging, power consumption and other functions through the power management system. In addition, the device 600 includes some functional modules that are not shown, which are not repeated here. The apparatus embodiments described above are merely illustrative; the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement it without creative effort.
Various component embodiments of the present invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art should understand that a microprocessor or a digital signal processor may be used in practice to implement some or all of the functions of some or all of the components in the computing processing device according to the embodiments of the present invention. The present invention may also be implemented as a device or apparatus program (for example, a computer program and a computer program product) for executing part or all of the methods described herein. Such a program implementing the present invention may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form. For example, FIG. 7 is a block diagram of a computing processing device provided by an embodiment of the present invention; FIG. 7 shows a computing processing device that can implement the method according to the present invention. The computing processing device conventionally includes a processor 710 and a computer program product or computer-readable medium in the form of a memory 720. The memory 720 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. The memory 720 has a storage space 730 for program code for executing any of the method steps in the above methods. For example, the storage space 730 for program code may include individual program codes for respectively implementing the various steps in the above methods. These program codes can be read from or written into one or more computer program products. These computer program products include program code carriers such as a hard disk, a compact disc (CD), a memory card, or a floppy disk. Such a computer program product is usually a portable or fixed storage unit as described with reference to FIG. 8.
The storage unit may have storage segments, storage spaces and the like arranged similarly to the memory 720 in the computing processing device of FIG. 7. The program code may, for example, be compressed in an appropriate form. Usually, the storage unit includes computer-readable code, that is, code that can be read by a processor such as the processor 710, which, when executed by the computing processing device, causes the computing processing device to perform the various steps of the methods described above. The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to each other. Reference herein to "one embodiment", "an embodiment" or "one or more embodiments" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. In addition, it should be noted that instances of the phrase "in one embodiment" herein do not necessarily all refer to the same embodiment. In the description provided herein, numerous specific details are set forth. However, it can be understood that the embodiments of the present invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail so as not to obscure the understanding of this description. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claims. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The present invention can be implemented by means of hardware comprising several different elements and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third and the like does not denote any order; these words may be interpreted as names. Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention rather than to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments or make equivalent replacements for some of the technical features therein, and these modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (26)

  1. An event detection method, characterized in that the method comprises:
    acquiring at least two frames of images collected by an industrial camera;
    performing image processing according to the at least two frames of images to obtain a processing result, wherein the processing result is used to characterize the brightness change of the pixels in the images;
    rendering according to the processing result to obtain an event image, wherein the event image is used to represent a brightness change event; and
    outputting the event image.
  2. The method according to claim 1, wherein the performing image processing according to the at least two frames of images comprises:
    for each frame of the images, obtaining the value of each color channel of the image separately; and
    performing image difference processing separately according to the values of the respective color channels of the at least two frames of images.
  3. The method according to claim 1, wherein the processing result comprises a difference image, and after the performing image processing according to the at least two frames of images to obtain a processing result, the method further comprises:
    performing noise filtering on the difference image.
  4. The method according to claim 1, wherein before the performing image processing according to the at least two frames of images to obtain a processing result, the method further comprises:
    acquiring m frames of images from the at least two frames of images, wherein m is a positive integer;
    performing weighted average processing on the m frames of images to collect noise information in the m frames of images; and
    performing noise filtering on the noise information.
  5. The method according to any one of claims 1 to 4, wherein the brightness change event is object motion, and after the rendering according to the processing result to obtain an event image, the method further comprises:
    acquiring a high dynamic range (HDR) image according to the at least two frames of images, and determining the movement direction of the object according to the event image;
    identifying position information of the object in the external environment according to the HDR image; and
    performing obstacle avoidance with respect to the object according to the movement direction and the position information.
  6. The method according to claim 5, wherein after the acquiring a high dynamic range (HDR) image according to the at least two frames of images, the method further comprises:
    determining a motion blur kernel according to motion information of the object in the event image; and
    deblurring the HDR image according to the motion blur kernel.
  7. The method according to claim 6, wherein the determining a motion blur kernel according to motion information of the object in the event image comprises:
    estimating a motion blur model according to the motion information of the object in the event image; and
    estimating the motion blur kernel according to the motion blur model;
    wherein the motion blur kernel is a spatially invariant blur kernel or a spatially varying blur kernel.
  8. The method according to claim 5, wherein the acquiring a high dynamic range (HDR) image according to the at least two frames of images comprises:
    calculating, according to the gray value of each pixel in the images, the sum of the gray values of each pixel across the at least two frames of images; and
    normalizing each of the sums of gray values to obtain the HDR image.
  9. The method according to claim 1, wherein the processing result comprises a difference image and a difference value corresponding to each pixel in the difference image, and
    the rendering according to the processing result to obtain an event image comprises:
    determining a rendering color corresponding to a pixel according to the sign of the difference value corresponding to the pixel in the difference image, and determining a color density of the rendering color according to the absolute value of the difference value; and
    rendering the pixel according to the rendering color and the color density.
  10. The method according to claim 1, wherein the processing result comprises a difference image and a difference value corresponding to each pixel in the difference image, and
    the rendering according to the processing result to obtain an event image comprises:
    determining a rendering color corresponding to a pixel according to a preset correspondence between numerical values and rendering colors and the difference value corresponding to the pixel in the difference image; and
    rendering the pixel according to the rendering color.
  11. The method according to claim 1, wherein the frame rate of the industrial camera is not less than 100 frames per second.
  12. The method according to claim 1, wherein the method further comprises:
    outputting a grayscale image or a color image corresponding to the at least two frames of images.
  13. An event detection apparatus, characterized in that the apparatus comprises a memory and a processor;
    the memory is configured to store program code; and
    the processor calls the program code, and when the program code is executed, is configured to perform the following operations:
    acquiring at least two frames of images collected by an industrial camera;
    performing image processing according to the at least two frames of images to obtain a processing result, wherein the processing result is used to characterize the brightness change of the pixels in the images;
    rendering according to the processing result to obtain an event image, wherein the event image is used to represent a brightness change event; and
    outputting the event image.
  14. The apparatus according to claim 13, wherein the processor is specifically configured to:
    for each frame of the images, obtain the value of each color channel of the image separately; and
    perform image difference processing separately according to the values of the respective color channels of the at least two frames of images.
  15. The apparatus according to claim 13, wherein the processing result comprises a difference image, and the processor is further configured to:
    perform noise filtering on the difference image.
  16. The apparatus according to claim 13, wherein the processor is further configured to:
    acquire m frames of images from the at least two frames of images, wherein m is a positive integer;
    perform weighted average processing on the m frames of images to collect noise information in the m frames of images; and
    perform noise filtering on the noise information.
  17. The apparatus according to any one of claims 13 to 16, wherein the brightness change event is object motion, and the processor is further configured to:
    acquire a high dynamic range (HDR) image according to the at least two frames of images, and determine the movement direction of the object according to the event image;
    identify position information of the object in the external environment according to the HDR image; and
    perform obstacle avoidance with respect to the object according to the movement direction and the position information.
  18. The apparatus according to claim 17, wherein the processor is further configured to:
    determine a motion blur kernel according to motion information of the object in the event image; and
    deblur the HDR image according to the motion blur kernel.
  19. The apparatus according to claim 18, wherein the processor is specifically configured to:
    estimate a motion blur model according to the motion information of the object in the event image; and
    estimate the motion blur kernel according to the motion blur model;
    wherein the motion blur kernel is a spatially invariant blur kernel or a spatially varying blur kernel.
  20. The apparatus according to claim 17, wherein the processor is specifically configured to:
    calculate, according to the gray value of each pixel in the images, the sum of the gray values of each pixel across the at least two frames of images; and
    normalize each of the sums of gray values to obtain the HDR image.
  21. The apparatus according to claim 13, wherein the processing result comprises a difference image and a difference value corresponding to each pixel in the difference image, and
    the processor is specifically configured to:
    determine a rendering color corresponding to a pixel according to the sign of the difference value corresponding to the pixel in the difference image, and determine a color density of the rendering color according to the absolute value of the difference value; and
    render the pixel according to the rendering color and the color density.
  22. The apparatus according to claim 13, wherein the processing result comprises a difference image and a difference value corresponding to each pixel in the difference image, and
    the processor is specifically configured to:
    determine a rendering color corresponding to a pixel according to a preset correspondence between numerical values and rendering colors and the difference value corresponding to the pixel in the difference image; and
    render the pixel according to the rendering color.
  23. The apparatus according to claim 13, wherein the frame rate of the industrial camera is not less than 100 frames per second.
  24. The apparatus according to claim 13, wherein the processor is further configured to:
    output a grayscale image or a color image corresponding to the at least two frames of images.
  25. A movable platform, characterized in that the movable platform comprises an industrial camera and the event detection apparatus according to any one of claims 13 to 24, wherein the event detection apparatus is configured to perform the following operations:
    acquiring at least two frames of images collected by an industrial camera;
    performing image processing according to the at least two frames of images to obtain a processing result, wherein the processing result is used to characterize the brightness change of the pixels in the images;
    rendering according to the processing result to obtain an event image, wherein the event image is used to represent a brightness change event; and
    outputting the event image.
  26. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the following operations are implemented:
    acquiring at least two frames of images collected by an industrial camera;
    performing image processing according to the at least two frames of images to obtain a processing result, wherein the processing result is used to characterize the brightness change of the pixels in the images;
    rendering according to the processing result to obtain an event image, wherein the event image is used to represent a brightness change event; and
    outputting the event image.
PCT/CN2020/107427 2020-08-06 2020-08-06 Event detection method and device, movable platform, and computer-readable storage medium WO2022027444A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/107427 WO2022027444A1 (en) 2020-08-06 2020-08-06 Event detection method and device, movable platform, and computer-readable storage medium
CN202080059855.5A CN114341650A (en) 2020-08-06 2020-08-06 Event detection method and device, movable platform and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/107427 WO2022027444A1 (en) 2020-08-06 2020-08-06 Event detection method and device, movable platform, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2022027444A1 true WO2022027444A1 (en) 2022-02-10

Family

ID=80118712

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/107427 WO2022027444A1 (en) 2020-08-06 2020-08-06 Event detection method and device, movable platform, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN114341650A (en)
WO (1) WO2022027444A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118470071A (en) * 2024-07-10 2024-08-09 大连展航科技有限公司 Intelligent vision monitoring and tracking method and system for general network

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1506686A (en) * 2002-12-09 2004-06-23 北京中星微电子有限公司 Moving image detecting method
CN1632593A (en) * 2004-12-31 2005-06-29 北京中星微电子有限公司 Motion image detecting method and circuit
CN1632594A (en) * 2004-12-31 2005-06-29 北京中星微电子有限公司 Motion image detecting method and circuit
CN102467800A (en) * 2010-11-05 2012-05-23 无锡市美网网络信息技术有限公司 Invasion detection and alarm system
CN106463032A (en) * 2014-03-03 2017-02-22 Vsk电子有限公司 Intrusion detection with directional sensing

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6646763B1 (en) * 1999-11-12 2003-11-11 Adobe Systems Incorporated Spectral color matching to a device-independent color value
CN202794523U (en) * 2012-07-27 2013-03-13 符建 Three-dimensional imaging radar system based on flight spectrum


Also Published As

Publication number Publication date
CN114341650A (en) 2022-04-12

Similar Documents

Publication Publication Date Title
TWI696146B (en) Method and apparatus of image processing, computer reading storage medium and mobile terminal
CN109413563B (en) Video sound effect processing method and related product
CN107613191B (en) Photographing method, photographing equipment and computer readable storage medium
CN110198412B (en) Video recording method and electronic equipment
CN107230182B (en) Image processing method and device and storage medium
CN107038681B (en) Image blurring method and device, computer readable storage medium and computer device
CN112449120B (en) High dynamic range video generation method and device
CN108234882B (en) Image blurring method and mobile terminal
CN108038836B (en) Image processing method and device and mobile terminal
CN108900750B (en) Image sensor and mobile terminal
CN107707827A (en) A kind of high-dynamics image image pickup method and mobile terminal
CN107948505B (en) Panoramic shooting method and mobile terminal
CN108965665B (en) image sensor and mobile terminal
CN109151348B (en) Image processing method, electronic equipment and computer readable storage medium
CN107895352A (en) A kind of image processing method and mobile terminal
WO2021190387A1 (en) Detection result output method, electronic device, and medium
CN110969060A (en) Neural network training method, neural network training device, neural network tracking method, neural network training device, visual line tracking device and electronic equipment
CN109104578B (en) Image processing method and mobile terminal
CN111145151B (en) Motion area determining method and electronic equipment
CN113421211A (en) Method for blurring light spots, terminal device and storage medium
CN109639981B (en) Image shooting method and mobile terminal
CN108932505B (en) Image processing method and electronic equipment
CN110944163A (en) Image processing method and electronic equipment
WO2022027444A1 (en) Event detection method and device, movable platform, and computer-readable storage medium
CN107798662B (en) Image processing method and mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20948110

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20948110

Country of ref document: EP

Kind code of ref document: A1