CN116614711A - Event camera control method, device, storage medium and event camera - Google Patents


Info

Publication number
CN116614711A
Authority
CN
China
Prior art keywords: point cloud, event, cloud data, module, data output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210119627.3A
Other languages
Chinese (zh)
Inventor
宋亚龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210119627.3A
Publication of CN116614711A
Legal status: Pending


Landscapes

  • Studio Devices (AREA)

Abstract

The application discloses an event camera control method and device, a storage medium, and an event camera, relating to the technical field of event cameras. First, if the light intensity value of the point cloud data output by the TOF module is monitored to be within a preset light intensity range, the point cloud data output by the TOF module is acquired; event data output by the event camera is then determined from the point cloud data. When the light intensity value of the point cloud data output by the TOF module is within the preset light intensity range, the event camera is in an extremely bright or extremely dark environment, where the event camera module suffers strong interference but the point cloud data output by the TOF module remains usable. The point cloud data output by the TOF module can therefore be acquired and the event data output by the event camera determined from it, so that the event camera can still be used in extremely bright or extremely dark environments, which broadens the application range of the event camera and improves its operating stability.

Description

Event camera control method, device, storage medium and event camera
Technical Field
The present application relates to the field of event cameras, and in particular, to a method and apparatus for controlling an event camera, a storage medium, and an event camera.
Background
A conventional camera has one fixed parameter, the frame rate: it captures images at a constant frequency. Even when that frequency is high, each frame still takes a fixed interval to arrive, so a conventional camera always has a certain latency problem.
The event camera solves this technical problem. When a large number of pixels change because of object motion or illumination changes in a scene, the event camera generates a series of events that are output as an event stream. The data volume of the event stream is far smaller than the data transmitted by a conventional camera, and the event stream has no minimum time unit, so, unlike a conventional camera that outputs data at fixed intervals, the event camera has a low-latency characteristic.
However, in the related art, the built-in vision module of the event camera suffers severe interference in extremely bright or extremely dark environments, making it difficult to accurately distinguish brightness changes in the current environment, so the event camera is hard to use normally in such extreme environments.
Disclosure of Invention
The application provides an event camera control method and device, a storage medium, and an event camera, which can solve the technical problem in the prior art that an event camera has difficulty accurately distinguishing brightness changes in extremely bright or extremely dark environments, which affects its normal use.
In a first aspect, an embodiment of the present application provides an event camera control method, applied to an event camera, where the event camera includes at least a Time of Flight (TOF) module and an event camera module, and the method includes:
if the light intensity value of the point cloud data output by the TOF module is monitored to be in a preset light intensity range, the point cloud data output by the TOF module is obtained;
and determining event data output by the event camera according to the point cloud data.
Optionally, the determining the event data output by the event camera according to the point cloud data includes: converting the point cloud data to obtain conversion data, and taking the conversion data as event data output by the event camera; wherein the conversion data and the event data are the same type of data.
Optionally, the converting the point cloud data to obtain converted data includes: converting the point cloud data from a TOF module coordinate system corresponding to the TOF module to a camera pixel coordinate system corresponding to the event camera module based on a preset conversion matrix; and obtaining conversion data according to the converted point cloud data, so that the conversion data and the event data are data under the same coordinate system.
Optionally, before the converting the point cloud data to obtain converted data, the method includes: determining a TOF module coordinate system corresponding to the TOF module and a camera pixel coordinate system corresponding to the event camera module; and calibrating the TOF module coordinate system and the camera pixel coordinate system to obtain a preset conversion matrix between the TOF module coordinate system and the camera pixel coordinate system.
Optionally, the monitoring that the light intensity value of the point cloud data output by the TOF module is within the preset light intensity range includes: if it is determined that point cloud data whose light intensity value is within the preset light intensity range (that is, target point cloud data) exists among the point cloud data output by the TOF module, determining that the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range.
Optionally, the acquiring the point cloud data output by the TOF module includes: and if the number of the target point cloud data is determined to be larger than a preset number threshold, acquiring the target point cloud data in the point cloud data output by the TOF module.
Optionally, the method further comprises: if the light intensity value of the point cloud data output by the TOF module is not within the preset light intensity range, acquiring the data output by the event camera module, and determining the event data output by the event camera according to the data output by the event camera module.
In a second aspect, an embodiment of the present application provides an event camera control device, which is applied to an event camera, wherein a TOF module and an event camera module are disposed in the event camera, and the device includes:
the light intensity judging module is used for acquiring the point cloud data output by the TOF module if the light intensity value of the point cloud data output by the TOF module is monitored to be in a preset light intensity range;
and the conversion module is used for determining event data output by the event camera according to the point cloud data.
In a third aspect, embodiments of the present application provide a computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the steps of the method described above.
In a fourth aspect, an embodiment of the present application provides an event camera, including a TOF module and an event camera module; the event camera further comprises a memory, a processor and a computer program stored on the memory and executable on the processor, said computer program being adapted to be loaded by the processor and to perform the above-mentioned method steps.
In a fifth aspect, an embodiment of the present application provides an electronic device, including an event camera, where the event camera includes a TOF module and an event camera module;
The electronic device further comprises a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program being adapted to be loaded by the processor and to perform the above-mentioned method steps.
The technical solutions provided by the embodiments of the application have at least the following beneficial effects:
the application provides an event camera control method, which comprises the steps of firstly, if the light intensity value of point cloud data output by a TOF module is monitored to be in a preset light intensity range, acquiring the point cloud data output by the TOF module; event data output by the event camera is then determined from the point cloud data. When the light intensity value of the point cloud data output by the TOF module is monitored to be in the preset light intensity range, the event camera is represented in the extremely bright or extremely dark environment, the event camera module is greatly disturbed, the brightness change of the current environment is difficult to accurately distinguish, the point cloud data output by the TOF module in the extremely bright or extremely dark environment is available, therefore, the point cloud data output by the TOF module can be obtained, the event data output by the event camera is determined according to the point cloud data, the event camera can be ensured to be still used in the extremely bright or extremely dark environment, and the application range and the use stability of the event camera are improved.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the application, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is an exemplary system architecture diagram of a method for controlling an event camera according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an event camera according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a TOF module according to an embodiment of the present application;
fig. 4 is a flowchart of a method for controlling an event camera according to another embodiment of the present application;
fig. 5 is a flowchart of a method for controlling an event camera according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of an event camera control device according to another embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the features and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying figures; the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort fall within the scope of the application.
Fig. 1 is an exemplary system architecture diagram of a method for controlling an event camera according to an embodiment of the present application.
As shown in fig. 1, the system architecture may include an electronic device 101, a network 102, and a server 103. Network 102 is the medium used to provide communication links between electronic device 101 and server 103. Network 102 may include various types of wired or wireless communication links; for example, a wired communication link may be an optical fiber, a twisted pair, or a coaxial cable, and a wireless communication link may be a Bluetooth communication link, a Wireless Fidelity (Wi-Fi) communication link, a microwave communication link, or the like.
The electronic device 101 may interact with the server 103 over the network 102 to receive messages from the server 103 or to send messages to the server 103. The electronic device 101 may be hardware or software. When the electronic device 101 is hardware, it may be a variety of electronic devices including, but not limited to, a smart robot, a smart watch, a smart phone, a tablet computer, a laptop portable computer, a desktop computer, and the like. When the electronic device 101 is software, it may be installed in the above-listed electronic device, and may be implemented as a plurality of software or software modules (for example, to provide distributed services), or may be implemented as a single software or software module, which is not specifically limited herein.
The electronic device 101 may also be provided with an event camera, which includes a TOF module and an event camera module; the event camera can drive the TOF module and the event camera module to work simultaneously through a camera driving board. That is, after the event camera is started, point cloud data corresponding to the current scene can be continuously obtained through the TOF module, where the point cloud data includes the coordinates and light intensity value of each pixel in the current scene; and when the brightness in the current scene changes, event data corresponding to the current scene is acquired through the event camera module, where the event data includes a timestamp, pixel coordinates, and a polarity for each pixel.
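As an illustration only, the two kinds of records described above might be represented as follows (a minimal Python sketch; the names TofPoint and Event and the exact fields are assumptions for illustration, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class TofPoint:
    """One sample of TOF point cloud data: coordinates plus a light intensity value."""
    x: float
    y: float
    z: float          # depth measured by the TOF module
    intensity: float  # light intensity value used for the range check

@dataclass
class Event:
    """One piece of event data: timestamp, pixel coordinates, polarity."""
    timestamp: float
    u: int            # pixel column
    v: int            # pixel row
    polarity: int     # +1 for brightness increase, -1 for brightness decrease
```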
The server 103 may be a business server providing various services. The server 103 may be hardware or software. When the server 103 is hardware, it may be implemented as a distributed server cluster composed of a plurality of servers, or may be implemented as a single server. When the server 103 is software, it may be implemented as a plurality of software or software modules (for example, to provide a distributed service), or may be implemented as a single software or software module, which is not specifically limited herein.
It should be understood that the number of electronic devices, networks, and servers in fig. 1 is merely illustrative, and any number of electronic devices, networks, and servers may be used as desired for implementation.
A conventional camera, whether based on a CMOS sensor, a CCD sensor, or an RGB-D camera, has one fixed parameter: the frame rate, so it captures images at a constant frequency. Even if the frame rate reaches 1 kHz, the camera still has a delay of 1 ms, so a conventional camera always has a certain latency problem. In addition, a conventional camera needs a certain exposure time to accumulate enough photons on the photosensitive device, so if an object moves at high speed during the exposure time, motion blur occurs; this is another problem of conventional cameras.
These problems of the conventional camera stem from its operating principle and the limitations of its own hardware; even a high-performance camera can only mitigate them to some extent and cannot eliminate them completely, and they greatly limit the application scenarios of conventional cameras.
The event camera is a new type of sensor that does not have these problems. Unlike a conventional camera, which captures a complete image, an event camera captures events, which can be simply understood as changes in pixel brightness: the event camera outputs changes of pixel brightness. Its English name is Event-based Camera, or simply Event Camera, abbreviated EB; it is sometimes also referred to as a dynamic vision sensor (DVS, Dynamic Vision Sensor).
The most basic principle of an event camera is that when the brightness change of a pixel in the scene corresponding to an object reaches a certain threshold, one piece of event data is output. Specifically, when a large number of pixels change because of object motion or illumination changes in the scene, a series of event data is generated and output as an event stream. The data volume of the event stream is far smaller than the data transmitted by a conventional camera, and the event stream has no minimum time unit, so, unlike the fixed-rate output of a conventional camera, it has a low-latency characteristic. Two concepts need to be emphasized here. First, the brightness change: the output of the event camera relates to changes in brightness, not to the absolute value of brightness. Second, the threshold: event data is output only when the brightness changes by a certain amount, and this threshold is an intrinsic parameter of the event camera.
Further, event data output by the event camera has three elements: a timestamp, pixel coordinates, and a polarity, which respectively represent the time at which the corresponding pixel changed, the position of that pixel, and how its brightness changed. In other words, one piece of event data expresses 'at what time, at which pixel, the brightness increased or decreased'.
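The per-pixel thresholding rule described above can be sketched as follows, reusing the Event record sketched earlier (an illustrative assumption of how such a rule might look; the function name and the example threshold value are not from the patent):

```python
def maybe_emit_event(prev_brightness, new_brightness, timestamp, u, v, threshold=0.2):
    """Return an Event when the brightness change at pixel (u, v) reaches the
    camera's intrinsic threshold, otherwise return None."""
    delta = new_brightness - prev_brightness
    if abs(delta) < threshold:
        return None  # change too small: no event is generated
    polarity = 1 if delta > 0 else -1
    return Event(timestamp, u, v, polarity)
```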
However, in the related art, when the event camera is in an extremely bright or extremely dark environment, its built-in vision module is strongly disturbed and it is difficult to accurately distinguish brightness changes in the current environment.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an event camera according to an embodiment of the application.
As shown in fig. 2, in order to solve the problem that the event camera is difficult to use normally in extreme environments, the event camera of the related art is improved. Specifically, fig. 2A shows a side view of the event camera 200 and fig. 2B shows a front view. In an embodiment of the present application, the event camera 200 includes at least a Time of Flight (TOF) module 210 and an event camera module 220. The event camera 200 may further include a camera driving board 230, which can drive the TOF module 210 and the event camera module 220 at the same time so that they operate simultaneously. When a large number of pixels change because of object motion or illumination changes in the scene, the event camera module 220 generates a series of event data, which is output as an event stream.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a TOF module according to an embodiment of the application.
As shown in fig. 3, fig. 3A shows a side view of the TOF module 210 and fig. 3B shows a front view. The TOF module 210 may include at least an emitter 211, a receiver 212, and a module driving board 213. Its principle is as follows: after the module driving board 213 drives the emitter 211 and the receiver 212, the emitter 211 emits a modulated light pulse; the pulse strikes an object and the reflected pulse is received by the receiver 212; the distance to the object is calculated from the round-trip time of the light pulse, and point cloud data of the scene corresponding to the object is determined from it, where the point cloud data includes the coordinates and light intensity value of each pixel in the scene.
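The distance calculation from the round-trip time is the standard time-of-flight relation, sketched here for reference (the function name is illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the reflecting object: the light pulse travels out and back,
    so the one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```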
Optionally, the TOF module in the embodiment of the present application may specifically be a direct time-of-flight (Direct Time of Flight, DTOF) module, or it may be an indirect time-of-flight (Indirect Time of Flight, ITOF) module. For convenience of description, the TOF module is described below as a DTOF module, in which the emitter may be a vertical-cavity surface-emitting laser and the receiver may be a single-photon avalanche diode.
Because the TOF module directly measures the coordinates and light intensity value of each pixel in the scene through light pulses, and the light pulses are not affected by changes in external brightness, the TOF module is highly immune to brightness interference and can perceive brightness even in extreme environments. Therefore, to solve the problem that the event camera is difficult to use normally in extreme environments, one possible idea is to acquire the point cloud data collected by the TOF module when the event camera is in an extremely bright or extremely dark environment and to process that point cloud data into the event data of the current event camera.
Referring to fig. 4, fig. 4 is a flowchart illustrating a method for controlling an event camera according to another embodiment of the application.
As shown in fig. 4, the method includes:
s401, if the light intensity value of the point cloud data output by the TOF module is monitored to be in the preset light intensity range, the point cloud data output by the TOF module is obtained.
In the embodiment of the application, the event camera control method is mainly applied to an event camera provided with a TOF module and an event camera module. After the event camera is started in a scene, the TOF module and the event camera module can be driven to work simultaneously through the camera driving board. That is, after the event camera is started, point cloud data corresponding to the current scene can be continuously obtained through the TOF module, where the point cloud data includes the coordinates and light intensity value of each pixel in the current scene; and when the brightness in the current scene changes, event data corresponding to the current scene is acquired through the event camera module, where the event data includes a timestamp, pixel coordinates, and a polarity for each pixel.
Because the most basic principle of the event camera is that event data is output only when the brightness change of a pixel reaches a certain threshold, when the event camera is in an extremely bright or extremely dark environment it may fail to output corresponding event data even if the brightness of the current environment changes. The TOF module, by contrast, directly measures the coordinates and light intensity value of each pixel in the scene through light pulses, and the light pulses are not affected by changes in external brightness, so the TOF module can continuously perceive the brightness of the external environment.
The preset light intensity range can be determined in advance from the light intensity values of the extremely bright or extremely dark environments in which the event camera cannot work normally. In other words, as long as the light intensity value of a pixel in the current scene is within the preset light intensity range, the event camera is in an extremely bright or extremely dark environment and cannot work normally.
For example, if the event camera cannot work normally in an extremely bright environment (ambient illuminance greater than 100000 lux) or in an extremely dark environment (ambient illuminance less than 0.001 lux), the preset light intensity range may be set as: illuminance greater than 100000 lux and/or illuminance less than 0.001 lux.
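Using the example thresholds above, the range check might look like this (a sketch only; the constants are the example values from this paragraph and the function name is illustrative):

```python
BRIGHT_LIMIT_LUX = 100_000.0  # example upper bound from the description
DARK_LIMIT_LUX = 0.001        # example lower bound from the description

def in_preset_intensity_range(intensity_lux: float) -> bool:
    """True when the intensity falls in the extremely bright or extremely dark
    range in which the event camera module cannot be relied upon."""
    return intensity_lux > BRIGHT_LIMIT_LUX or intensity_lux < DARK_LIMIT_LUX
```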
After the event camera is started in a scene, the point cloud data corresponding to the current scene output by the TOF module can be obtained continuously, and the light intensity value of the point cloud data output by the TOF module is monitored; specifically, it is monitored whether the light intensity value of the point cloud data output by the TOF module is within the preset light intensity range.
If the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range, the event camera is in an extremely bright or extremely dark environment and cannot work normally; even if the brightness of the current environment changes, the event camera cannot output corresponding event data at this time. The TOF module, however, directly measures the coordinates and light intensity value of each pixel in the scene through light pulses, which are not affected by changes in external brightness, so the point cloud data output by the TOF module remains usable and accurate. The point cloud data currently output by the TOF module can therefore be acquired, so that the event data output by the event camera can be determined from it.
S402, determining event data output by the event camera according to the point cloud data.
It can be understood that, since the TOF module and the event camera module are two different imaging devices that output pixel data based on different imaging principles, the point cloud data output by the TOF module and the event data output by the event camera module are of different data types. Therefore, if the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range, the point cloud data output by the TOF module is acquired and processed so that the processed point cloud data has the same data type as the event data output by the event camera module; since the event camera is expected to output event data, the processed point cloud data can then be used as the event data finally output by the event camera.
It can also be understood that the precision of the point cloud data output by the TOF module, and the accuracy with which it describes the scene pixels, are lower than those of the event data output directly by the event camera module. The processed point cloud data is therefore used as the event data finally output by the event camera only when the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range.
If the light intensity value of the point cloud data output by the TOF module is not within the preset light intensity range, the event camera is not in an extremely bright or extremely dark environment and can work normally. In that case the data output by the event camera module (that is, data already of the event data type) is acquired, and the event data output by the event camera is determined from it, which ensures the accuracy of the event data output by the event camera under normal conditions.
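Putting the two branches together, the source-selection logic of S401 and S402 might be outlined as follows (an illustrative sketch only; the function names and the idea of passing the converter in as a parameter are assumptions, and the actual decision in the patent also involves the count threshold described later):

```python
def select_event_output(tof_points, event_module_events, convert_point_cloud,
                        bright_limit=100_000.0, dark_limit=0.001):
    """If the TOF point cloud indicates an extremely bright or dark environment,
    derive the event camera's output from the point cloud; otherwise pass
    through the event camera module's own event data."""
    extreme = [p for p in tof_points
               if p.intensity > bright_limit or p.intensity < dark_limit]
    if extreme:
        # convert_point_cloud: a point-cloud-to-event converter, sketched after step S503
        return convert_point_cloud(extreme)
    return event_module_events
```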
In the embodiment of the application, first, if the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range, the point cloud data output by the TOF module is acquired; event data output by the event camera is then determined from the point cloud data. When the light intensity value of the point cloud data output by the TOF module is within the preset light intensity range, the event camera is in an extremely bright or extremely dark environment, the event camera module suffers strong interference, and it is difficult to accurately distinguish brightness changes in the current environment, while the point cloud data output by the TOF module remains usable in such an environment. The point cloud data output by the TOF module can therefore be acquired and the event data output by the event camera determined from it, so that the event camera can still be used in extremely bright or extremely dark environments, which broadens the application range of the event camera and improves its operating stability.
Referring to fig. 5, fig. 5 is a flowchart illustrating a method for controlling an event camera according to another embodiment of the application.
As shown in fig. 5, the method includes:
s501, if it is determined that the point cloud data with the light intensity value within the preset light intensity range exists in the point cloud data output by the TOF module, it is determined that the light intensity value of the point cloud data output by the TOF module is monitored within the preset light intensity range.
It can be understood that the preset light intensity range is determined in advance from the light intensity values of the extremely bright or extremely dark environments in which the event camera cannot work normally. Once the event camera is started in a scene, the point cloud data corresponding to the current scene output by the TOF module can be obtained continuously and its light intensity value monitored. Specifically, within a preset period, it is monitored whether any point cloud data whose light intensity value is within the preset light intensity range exists among the point cloud data output by the TOF module. If so, all point cloud data whose light intensity values are within the preset light intensity range within the preset period are determined as target point cloud data; and since target point cloud data exists among the point cloud data output by the TOF module, it is determined that the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range.
Optionally, the light intensity values of the point cloud data output by the TOF module may be stored as a sequence in a certain order. In that case, monitoring whether point cloud data with a light intensity value within the preset light intensity range exists within the preset period can be done by directly checking whether any value in the light intensity sequence corresponding to the point cloud data output by the TOF module falls within the preset light intensity range; if so, all point cloud data whose sequence values fall within the preset light intensity range within the preset period are determined as the target point cloud data.
S502, if the number of the target point cloud data is determined to be larger than a preset number threshold, acquiring the target point cloud data in the point cloud data output by the TOF module.
Further, in order to avoid misjudging the environment when only a few individual points of the point cloud data output by the TOF module have light intensity values within the preset light intensity range, and to ensure that most of the point cloud data output by the TOF module is within the preset light intensity range, the number of target point cloud data whose light intensity values are within the preset light intensity range may be compared with a preset number threshold before the point cloud data output by the TOF module is acquired. The preset number threshold represents the minimum count from which it can be judged, based on the number of target point cloud data, that most of the point cloud data output by the TOF module is within the preset light intensity range, and it may be set as required.
If the number of target point cloud data is determined to be greater than the preset number threshold, most of the point cloud data output by the TOF module is within the preset light intensity range, so it can be determined that the event camera is in an extremely bright or extremely dark environment and cannot work normally; at this point, the target point cloud data among the point cloud data output by the TOF module is acquired.
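Steps S501 and S502 can be sketched together as follows (an illustrative assumption of one way to implement the filtering and count check; the names and the return convention are not from the patent):

```python
def acquire_target_points(points, min_count, bright_limit=100_000.0, dark_limit=0.001):
    """Collect the target point cloud data whose light intensity values fall in the
    preset range, and treat the environment as extreme only when their number
    exceeds the preset count threshold (so a few outlier points are ignored)."""
    targets = [p for p in points
               if p.intensity > bright_limit or p.intensity < dark_limit]
    if len(targets) > min_count:
        return targets   # environment is extremely bright or dark: use these points
    return None          # not enough points in range: keep using the event camera module
```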
S503, converting the point cloud data to obtain conversion data, and taking the conversion data as event data output by an event camera.
It can be understood that, since the TOF module and the event camera module are two different imaging devices that output pixel data based on different imaging principles, the point cloud data output by the TOF module and the event data output by the event camera module are of different data types. Therefore, if the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range, the target point cloud data output by the TOF module is acquired and processed so that the processed target point cloud data has the same data type as the event data output by the event camera module.
Specifically, in processing the target point cloud data output by the TOF module, one possible implementation is to convert the target point cloud data to obtain conversion data, so that the conversion data and the event data generated by the event camera module are of the same type; the conversion data can then be used as the event data output by the event camera.
Further, the point cloud data output by the TOF module contains the coordinates and light intensity value of each pixel in the scene; the pixel coordinates may be two-dimensional or three-dimensional, and the position coordinates they represent are determined with respect to a coordinate system of the TOF module. Similarly, the event data output by the event camera module has three elements, a timestamp, pixel coordinates, and a polarity, and its pixel coordinates are likewise determined with respect to a coordinate system of the event camera module.
In converting the target point cloud data, it is mainly the pixel coordinates in the point cloud data output by the TOF module that are converted. The idea is to determine intermediate conversion data based on the coordinate system of the pixel coordinates in the point cloud data output by the TOF module and the coordinate system of the pixel coordinates in the event data output by the event camera module; the pixel coordinates in the point cloud data output by the TOF module can then be converted based on this intermediate conversion data, thereby converting the target point cloud data.
Specifically, since the pixel coordinates in the point cloud data output by the TOF module are determined in the TOF module coordinate system corresponding to the TOF module, and the pixel coordinates in the event data output by the event camera module are determined in the camera pixel coordinate system corresponding to the event camera module, one possible implementation of the conversion is as follows: first determine the TOF module coordinate system corresponding to the TOF module and the camera pixel coordinate system corresponding to the event camera module, then calibrate between the two coordinate systems to obtain intermediate conversion data between them, which may specifically be a preset conversion matrix.
After the preset conversion matrix between the TOF module coordinate system and the camera pixel coordinate system has been obtained, the target point cloud data can be converted from the TOF module coordinate system into the camera pixel coordinate system based on the preset conversion matrix, and the conversion data can be obtained from the converted point cloud data. The conversion data and the event data are then data in the same coordinate system, namely the camera pixel coordinate system, and therefore data of the same type, so the conversion data can be used as the event data output by the event camera.
Specifically, in converting the point cloud data from the TOF module coordinate system into the camera pixel coordinate system based on the preset conversion matrix, the three elements of the event data are obtained as follows. The pixel coordinates in the target point cloud data output by the TOF module are converted into the camera pixel coordinate system based on the conversion matrix, and the converted pixel coordinates are used as the pixel coordinates of the event data. The timestamp of the event data can be determined from the timestamp of the target point cloud data output by the TOF module. The polarity of the event data can be determined from the light intensity value of the target point cloud data at the current timestamp and its light intensity value at the previous timestamp, that is, from the change of the light intensity value of the target point cloud data output by the TOF module. For example, if the light intensity value of the target point cloud data at the current timestamp is greater than that at the previous timestamp, the polarity of the current event data is determined to be a brightness increase; if the light intensity value at the current timestamp is smaller than that at the previous timestamp, the polarity is determined to be a brightness decrease.
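The conversion described above might be sketched as follows. This is an illustrative assumption, not the patented implementation: it supposes the preset conversion matrix is a 4x4 homogeneous transform from the TOF module coordinate system into the event camera module's frame, adds a pinhole projection with an intrinsic matrix K (not mentioned in the patent) to reach pixel coordinates, and derives the polarity by comparing each pixel's intensity with its value at the previous timestamp:

```python
import numpy as np

def point_cloud_to_events(points, timestamp, T_tof_to_cam, K, prev_intensity):
    """Convert target point cloud data into event-style records
    (timestamp, u, v, polarity).

    points         : iterable of TofPoint-like objects with x, y, z, intensity.
    T_tof_to_cam   : 4x4 preset conversion matrix obtained by calibration.
    K              : 3x3 camera intrinsic matrix (assumed here for projection).
    prev_intensity : dict mapping (u, v) -> intensity at the previous timestamp.
    """
    events = []
    for p in points:
        # Express the TOF point in the event camera module's coordinate system.
        xyz_cam = T_tof_to_cam @ np.array([p.x, p.y, p.z, 1.0])
        if xyz_cam[2] <= 0:
            continue  # point lies behind the camera and cannot be projected
        # Project onto the camera pixel coordinate system.
        uvw = K @ xyz_cam[:3]
        u, v = int(round(uvw[0] / uvw[2])), int(round(uvw[1] / uvw[2]))
        # Polarity: compare with the intensity recorded at the previous timestamp.
        prev = prev_intensity.get((u, v))
        prev_intensity[(u, v)] = p.intensity
        if prev is None or p.intensity == prev:
            continue  # no brightness change to report at this pixel
        polarity = 1 if p.intensity > prev else -1  # +1 increase, -1 decrease
        events.append((timestamp, u, v, polarity))
    return events
```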
In the embodiment of the application, first, if the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range, the point cloud data output by the TOF module is acquired; event data output by the event camera is then determined from the point cloud data. When the light intensity value of the point cloud data output by the TOF module is within the preset light intensity range, the event camera is in an extremely bright or extremely dark environment, the event camera module suffers strong interference, and it is difficult to accurately distinguish brightness changes in the current environment, while the point cloud data output by the TOF module remains usable in such an environment. The point cloud data output by the TOF module can therefore be acquired and the event data output by the event camera determined from it, so that the event camera can still be used in extremely bright or extremely dark environments, which broadens the application range of the event camera and improves its operating stability.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an event camera control device according to another embodiment of the application.
As shown in fig. 6, the event camera control device 600 is applied to an event camera, the event camera includes at least a TOF module and an event camera module, and the event camera control device 600 includes:
The light intensity judging module 610 is configured to acquire the point cloud data output by the TOF module if it is monitored that the light intensity value of the point cloud data output by the TOF module is within a preset light intensity range;
the conversion module 620 is configured to determine event data output by the event camera according to the point cloud data.
Optionally, the conversion module 620 is further configured to convert the point cloud data to obtain conversion data, and use the conversion data as event data output by the event camera; wherein the conversion data and the event data are the same type of data.
Optionally, the conversion module 620 is further configured to convert the point cloud data from the TOF module coordinate system corresponding to the TOF module into the camera pixel coordinate system corresponding to the event camera module based on a preset conversion matrix; and obtaining conversion data according to the converted point cloud data so that the conversion data and the event data are data under the same coordinate system.
Optionally, the conversion module 620 is further configured to determine a TOF module coordinate system corresponding to the TOF module and a camera pixel coordinate system corresponding to the event camera module; and calibrating the TOF module coordinate system and the camera pixel coordinate system to obtain a preset conversion matrix between the TOF module coordinate system and the camera pixel coordinate system.
Optionally, the light intensity judging module 610 is further configured to determine that the light intensity value of the point cloud data output by the TOF module is within the preset light intensity range if it is determined that the point cloud data with the light intensity value within the preset light intensity range exists in the point cloud data output by the TOF module.
Optionally, the light intensity judging module 610 is further configured to obtain the target point cloud data in the point cloud data output by the TOF module if the number of the target point cloud data is determined to be greater than the preset number threshold.
Optionally, the event camera control apparatus 600 further includes:
and the normal output module is used for acquiring the data output by the event camera module and determining the event data output by the event camera according to the data output by the event camera module if the light intensity value of the point cloud data output by the TOF module is not in the preset light intensity range.
In an embodiment of the present application, an event camera control apparatus includes: a light intensity judging module, configured to acquire the point cloud data output by the TOF module if the light intensity value of the point cloud data output by the TOF module is monitored to be within a preset light intensity range; and a conversion module, configured to determine event data output by the event camera according to the point cloud data. When the light intensity value of the point cloud data output by the TOF module is within the preset light intensity range, the event camera is in an extremely bright or extremely dark environment, the event camera module suffers strong interference, and it is difficult to accurately distinguish brightness changes in the current environment, while the point cloud data output by the TOF module remains usable in such an environment. The point cloud data output by the TOF module can therefore be acquired and the event data output by the event camera determined from it, so that the event camera can still be used in extremely bright or extremely dark environments, which broadens the application range of the event camera and improves its operating stability.
Embodiments of the present application also provide a computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the steps of the method of any of the above embodiments.
The embodiment of the application also provides an event camera, which comprises a TOF module and an event camera module; the event camera further comprises a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program being adapted to be loaded by the processor and to perform the steps of the method according to any of the above embodiments.
Further, referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 7, an electronic device 700 may include: at least one central processor 701, at least one network interface 704, a user interface 703, a memory 705, at least one communication bus 702.
The electronic device 700 may further include an event camera including at least a TOF module and an event camera module.
Wherein the communication bus 702 is used to enable connected communications between these components.
The user interface 703 may include a display screen (Display) and a camera (Camera); optionally, the user interface 703 may further include a standard wired interface and a wireless interface.
The network interface 704 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
The central processor 701 may include one or more processing cores. The central processor 701 connects the various parts of the electronic device 700 using various interfaces and lines, and performs the various functions of the electronic device 700 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 705 and calling data stored in the memory 705. Optionally, the central processor 701 may be implemented in at least one of the hardware forms of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), or programmable logic array (Programmable Logic Array, PLA). The central processor 701 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU renders and draws the content to be displayed on the display screen; and the modem handles wireless communication. It should be understood that the modem may also not be integrated into the central processor 701 and may instead be implemented by a separate chip.
The memory 705 may include a random access memory (Random Access Memory, RAM) or a read-only memory (Read-Only Memory, ROM). Optionally, the memory 705 includes a non-transitory computer-readable storage medium. The memory 705 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 705 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing method embodiments, and the like; and the data storage area may store the data involved in the foregoing method embodiments. Optionally, the memory 705 may also be at least one storage device located remotely from the central processor 701. As shown in fig. 7, the memory 705, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and an event camera control program.
In the electronic device 700 shown in fig. 7, the user interface 703 is mainly used for providing an input interface for a user, and acquiring data input by the user; and the central processor 701 may be used to call an event camera control program stored in the memory 705 and specifically perform the following operations:
If the light intensity value of the point cloud data output by the TOF module is monitored to be in the preset light intensity range, the point cloud data output by the TOF module is obtained; event data output by the event camera is determined according to the point cloud data.
Optionally, determining the event data output by the event camera according to the point cloud data includes: converting the point cloud data to obtain conversion data, and using the conversion data as event data output by the event camera; wherein the conversion data and the event data are the same type of data.
Optionally, converting the point cloud data to obtain converted data includes: converting the point cloud data from a TOF module coordinate system corresponding to the TOF module to a camera pixel coordinate system corresponding to the event camera module based on a preset conversion matrix; and obtaining conversion data according to the converted point cloud data so that the conversion data and the event data are data under the same coordinate system.
Optionally, before converting the point cloud data to obtain the converted data, the method includes: determining a TOF module coordinate system corresponding to the TOF module and a camera pixel coordinate system corresponding to the event camera module; and calibrating the TOF module coordinate system and the camera pixel coordinate system to obtain a preset conversion matrix between the TOF module coordinate system and the camera pixel coordinate system.
Optionally, the monitoring that the light intensity value of the point cloud data output by the TOF module is within the preset light intensity range includes: if it is determined that point cloud data whose light intensity value is within the preset light intensity range (that is, target point cloud data) exists among the point cloud data output by the TOF module, determining that the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range.
Optionally, acquiring the point cloud data output by the TOF module includes: if the number of the target point cloud data is determined to be larger than the preset number threshold, the target point cloud data in the point cloud data output by the TOF module is obtained.
Optionally, the method further comprises: if the light intensity value of the point cloud data output by the TOF module is not in the preset light intensity range, acquiring the data output by the event camera module, and determining the event data output by the event camera according to the data output by the event camera module.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules is merely a logical function division, and there may be additional divisions of actual implementation, e.g., multiple modules or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules illustrated as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules.
If the integrated modules are implemented in the form of software functional modules and sold or used as stand-alone products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, or an optical disk.
It should be noted that, for the sake of simplicity of description, the foregoing method embodiments are all expressed as a series of combinations of actions, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily all required for the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The event camera control method, apparatus, storage medium, and event camera provided by the present application have been described above. The foregoing description of the embodiments should not be understood by those skilled in the art as limiting the scope of the present application.

Claims (11)

1. An event camera control method, applied to an event camera, wherein the event camera includes at least a Time of Flight (TOF) module and an event camera module, and the method comprises:
If the light intensity value of the point cloud data output by the TOF module is monitored to be in a preset light intensity range, the point cloud data output by the TOF module is obtained;
and determining event data output by the event camera according to the point cloud data.
2. The method of claim 1, wherein the determining the event data output by the event camera according to the point cloud data comprises:
converting the point cloud data to obtain conversion data, and taking the conversion data as the event data output by the event camera;
wherein the conversion data and the event data are the same type of data.
3. The method of claim 2, wherein the converting the point cloud data to obtain the conversion data comprises:
converting the point cloud data from a TOF module coordinate system corresponding to the TOF module to a camera pixel coordinate system corresponding to the event camera module based on a preset conversion matrix;
and obtaining conversion data according to the converted point cloud data, so that the conversion data and the event data are data under the same coordinate system.
4. The method according to claim 3, wherein before the converting the point cloud data to obtain the conversion data, the method further comprises:
determining the TOF module coordinate system corresponding to the TOF module and the camera pixel coordinate system corresponding to the event camera module; and
calibrating the TOF module coordinate system and the camera pixel coordinate system to obtain the preset conversion matrix between the TOF module coordinate system and the camera pixel coordinate system.
5. The method of claim 1, wherein the monitoring that the light intensity value of the point cloud data output by the TOF module is within the preset light intensity range comprises:
if it is determined that point cloud data whose light intensity value is within the preset light intensity range exists in the point cloud data output by the TOF module, determining that the light intensity value of the point cloud data output by the TOF module is monitored to be within the preset light intensity range.
6. The method of claim 5, wherein the acquiring the point cloud data output by the TOF module comprises:
if it is determined that the number of pieces of target point cloud data is greater than a preset number threshold, acquiring the target point cloud data from the point cloud data output by the TOF module, the target point cloud data being the point cloud data whose light intensity value is within the preset light intensity range.
7. The method according to claim 1, wherein the method further comprises:
if the light intensity value of the point cloud data output by the TOF module is not in the preset light intensity range, acquiring the data output by the event camera module, and determining the event data output by the event camera according to the data output by the event camera module.
8. An event camera control device, applied to an event camera, the event camera being provided with a TOF module and an event camera module, the device comprising:
a light intensity judging module, configured to acquire point cloud data output by the TOF module if it is monitored that a light intensity value of the point cloud data output by the TOF module is within a preset light intensity range; and
a conversion module, configured to determine event data output by the event camera according to the point cloud data.
9. A computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the steps of the method according to any one of claims 1 to 7.
10. An event camera, characterized by comprising a TOF module and an event camera module;
the event camera further comprises a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the method according to any one of claims 1 to 7 when executing the program.
11. An electronic device, comprising an event camera, wherein the event camera comprises a TOF module and an event camera module;
the electronic device further comprises a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the method according to any one of claims 1 to 7 when executing the program.
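Claims 2 to 4 above describe converting the TOF point cloud into the pixel coordinate system of the event camera module through a preset conversion matrix obtained by calibration. The following Python sketch is a minimal illustration of that conversion under a standard pinhole-camera assumption; the extrinsic rotation R, translation t, intrinsic matrix K, and point-cloud layout are assumptions introduced for illustration and are not given in the patent text.

# Illustrative sketch of claims 2-4: build a preset conversion matrix by
# calibration and project TOF points into the camera pixel coordinate system.
# R, t, K and the point layout are assumed, not taken from the patent.
import numpy as np

def build_conversion_matrix(R, t):
    # Claim 4: the calibration result is expressed here as a 4x4 homogeneous
    # transform from the TOF module coordinate system to the camera frame,
    # built from an extrinsic rotation R (3x3) and translation t (3,).
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def point_cloud_to_pixels(points_xyz, T_tof_to_cam, K):
    # Claim 3: convert points from the TOF module coordinate system into the
    # camera pixel coordinate system so that the conversion data and the
    # event data are expressed in the same coordinate system.
    n = points_xyz.shape[0]
    homogeneous = np.hstack([points_xyz, np.ones((n, 1))])   # (N, 4)
    cam = (T_tof_to_cam @ homogeneous.T).T[:, :3]            # camera frame
    cam = cam[cam[:, 2] > 0]                                 # keep points in front of the camera
    uvw = (K @ cam.T).T                                      # pinhole projection
    return uvw[:, :2] / uvw[:, 2:3]                          # pixel coordinates (u, v)

How the projected samples are then packaged as event data (for example, whether polarity or timestamps are attached) is not specified in the claims, so the sketch stops at pixel coordinates.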
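Claims 1 and 5 to 7 together describe the selection logic: when enough points in the TOF point cloud have light intensity values inside the preset range, the event data is derived from the point cloud; otherwise the event camera module's own output is used. The sketch below illustrates that flow; the module interfaces (read_point_cloud, read_events), the point-cloud layout, and the numeric thresholds are hypothetical placeholders rather than values from the specification.

# Illustrative sketch of claims 1 and 5-7. All interfaces, field layouts and
# numeric values are assumptions made for illustration only.
import numpy as np

LIGHT_INTENSITY_RANGE = (0.0, 0.05)   # preset light intensity range (placeholder)
MIN_TARGET_POINTS = 500               # preset number threshold of claim 6 (placeholder)

def select_event_data(tof_module, event_module, convert_fn):
    # convert_fn maps (N, 3) TOF points to event-like data, for example the
    # point_cloud_to_pixels sketch above with a fixed conversion matrix and K.
    cloud = tof_module.read_point_cloud()   # assumed shape (N, 4): x, y, z, intensity
    intensity = cloud[:, 3]

    # Claim 5: point cloud data whose light intensity value lies in the preset range.
    in_range = (intensity >= LIGHT_INTENSITY_RANGE[0]) & (intensity <= LIGHT_INTENSITY_RANGE[1])
    target = cloud[in_range, :3]

    # Claims 1 and 6: take the TOF path only when enough such points exist.
    if target.shape[0] > MIN_TARGET_POINTS:
        return convert_fn(target)

    # Claim 7: otherwise fall back to the event camera module's own output.
    return event_module.read_events()

The placement of LIGHT_INTENSITY_RANGE as a low (dark) band is an arbitrary choice for the example; the claims only require that the light intensity value lie within some preset range.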
CN202210119627.3A 2022-02-08 2022-02-08 Event camera control method, device, storage medium and event camera Pending CN116614711A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210119627.3A CN116614711A (en) 2022-02-08 2022-02-08 Event camera control method, device, storage medium and event camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210119627.3A CN116614711A (en) 2022-02-08 2022-02-08 Event camera control method, device, storage medium and event camera

Publications (1)

Publication Number Publication Date
CN116614711A true CN116614711A (en) 2023-08-18

Family

ID=87682351

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210119627.3A Pending CN116614711A (en) 2022-02-08 2022-02-08 Event camera control method, device, storage medium and event camera

Country Status (1)

Country Link
CN (1) CN116614711A (en)

Similar Documents

Publication Publication Date Title
EP3367661B1 (en) Method and system for using light emissions by a depth-sensing camera to capture video images under low-light conditions
EP2924992B1 (en) Method and apparatus for obtaining 3d image
CN105264401B (en) Interference reduction for TOF system
US10652513B2 (en) Display device, display system and three-dimension display method
WO2021120402A1 (en) Fused depth measurement apparatus and measurement method
EP3764345A1 (en) Ambient light collecting method, terminal and storage medium
US20190361259A1 (en) Semi-dense depth estimation from a dynamic vision sensor (dvs) stereo pair and a pulsed speckle pattern projector
WO2019221944A1 (en) Reduced power operation of time-of-flight camera
US20200389583A1 (en) Sensor auto-configuration
CN112596069A (en) Distance measuring method and system, computer readable medium and electronic device
CN112950694A (en) Image fusion method, single camera module, shooting device and storage medium
CN111965626A (en) Echo detection and correction method and device for laser radar and environment sensing system
US9851245B2 (en) Accumulating charge from multiple imaging exposure periods
Katz et al. Live demonstration: Behavioural emulation of event-based vision sensors
US10628951B2 (en) Distance measurement system applicable to different reflecting surfaces and computer system
CN116614711A (en) Event camera control method, device, storage medium and event camera
KR20190035358A (en) An electronic device controlling a camera based on an external light and control method
CN113296114B (en) DTOF depth image acquisition method and device, electronic equipment and medium
CN110545390A (en) Time-of-flight sensor and method
JP6106969B2 (en) Projection device, pointer device, and projection system
WO2022177707A1 (en) Image acquisition techniques with reduced noise using single photon avalanche diodes
CN113688900A (en) Radar and visual data fusion processing method, road side equipment and intelligent traffic system
JP2022188990A (en) Information processing device, information processing method, and program
CN113052884A (en) Information processing method, information processing apparatus, storage medium, and electronic device
CN108540726B (en) Method and device for processing continuous shooting image, storage medium and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination