CN116862967A - Depth image acquisition device and method and multi-sensor fusion system - Google Patents


Info

Publication number: CN116862967A
Application number: CN202210302778.2A
Authority: CN (China)
Legal status: Pending
Original language: Chinese (zh)
Inventor: 包鼎华
Current and original assignee: Beijing Xiaomi Mobile Software Co Ltd
Prior art keywords: target, image data, depth image, local time, RGB
Application filed by Beijing Xiaomi Mobile Software Co Ltd; priority to CN202210302778.2A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/50: Depth or shape recovery
    • G06T 7/55: Depth or shape recovery from multiple images
    • G01: MEASURING; TESTING
    • G01D: Measuring not specially adapted for a specific variable; arrangements for measuring two or more variables not covered in a single other subclass; tariff metering apparatus; measuring or testing not otherwise provided for
    • G01D 21/00: Measuring or testing not otherwise provided for
    • G01D 21/02: Measuring two or more variables by means not covered by a single other subclass
    • G04: HOROLOGY
    • G04G: ELECTRONIC TIME-PIECES
    • G04G 7/00: Synchronisation


Abstract

The depth image acquisition method times the depth image acquisition device using the UTC time corresponding to a designated rising edge in a target pulse signal, so that the local time of the depth image acquisition device is synchronized with UTC time. This guarantees the accuracy of the device's local time and enables the generation of image data with accurate timestamps, which in turn facilitates fusing the image data generated by the depth image acquisition device with data collected by other sensors and widens the application range of the depth image acquisition device.

Description

Depth image acquisition device and method and multi-sensor fusion system
Technical Field
The disclosure relates to the technical field of sensors, in particular to a depth image acquisition device, a depth image acquisition method and a multi-sensor fusion system.
Background
A current depth image acquisition device (such as an RGB-D camera) usually has only the CPU's local time available, and the image data it generates generally carries no timestamp. Even when timestamped image data is generated, the accuracy of the CPU's local time cannot be guaranteed, so the device cannot be aligned in time with other sensors. As a result, its image data cannot be fused with the data collected by other sensors, which hinders expanding the application range of the depth image acquisition device.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a depth image acquisition device, a method, and a multi-sensor fusion system.
According to a first aspect of embodiments of the present disclosure, there is provided a depth image capturing apparatus, comprising a controller,
the controller is configured to receive a target pulse signal and UTC time corresponding to a designated rising edge in the target pulse signal, acquire a first local time for receiving the target rising edge in the target pulse signal, acquire a target UTC time corresponding to the target rising edge, acquire a current second local time, determine the current UTC time according to the first local time, the target UTC time and the second local time, and use the current UTC time as the current local time.
Optionally, the apparatus further comprises a positioning module coupled to the controller, the target pulse signal is a PPS signal, the designated rising edge is each rising edge in the PPS signal,
the positioning module is configured to provide the PPS signal and UTC time corresponding to each rising edge of the PPS signal to the controller.
Optionally, the positioning module includes a PPS signal output end and a UART signal output end, and the controller includes a first interrupt trigger end and a serial port input end; the PPS signal output end is connected with the first interrupt trigger end, and the UART signal output end is connected with the serial port input end;
The UART signal output end is configured to output positioning information corresponding to each rising edge in the PPS signal, wherein the positioning information comprises UTC time;
the controller is configured to obtain the target UTC time corresponding to the target rising edge by analyzing the target positioning information corresponding to the target rising edge output by the UART signal output end.
Optionally, the controller is configured to generate a first interrupt signal when the first interrupt trigger receives the target rising edge output by the PPS signal output end, and acquire a local time corresponding to the first interrupt signal, so as to obtain the first local time.
Optionally, the apparatus further comprises a signal generation module, an RGB camera and a depth camera, each of which is connected with the controller;
the signal generation module is configured to generate a synchronous frequency multiplication pulse signal corresponding to the PPS signal;
the RGB camera is configured to execute a first exposure operation when the synchronous frequency multiplication pulse signal is received;
the depth camera is configured to perform a second exposure operation when the synchronous frequency multiplication pulse signal is received;
the controller is further configured to record a current target local time in response to receiving the synchronous frequency multiplication pulse signal, generate target RGB image data with a timestamp corresponding to the target local time when the RGB camera is determined to generate RGB image data corresponding to the first exposure operation, and generate target depth image data with the timestamp corresponding to the target local time when the depth camera is determined to generate depth image data corresponding to the second exposure operation.
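As a non-authoritative sketch, the timestamping step described above, i.e., recording the target local time at the synchronous pulse and attaching it to both images, could look like the following (all names are hypothetical; the disclosure does not prescribe an implementation):

```python
from dataclasses import dataclass

@dataclass
class TimestampedImage:
    timestamp: float  # target local time recorded at the sync pulse
    kind: str         # 'rgb' or 'depth'
    data: bytes

def tag_with_timestamp(target_local_time, rgb_data, depth_data):
    """Attach the same pulse-aligned timestamp to both images so that a
    downstream fusion stage can align them with other sensors."""
    return (TimestampedImage(target_local_time, 'rgb', rgb_data),
            TimestampedImage(target_local_time, 'depth', depth_data))
```

Because both records carry the timestamp recorded at the same pulse edge, the RGB and depth frames of one exposure cycle share a single time reference.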
Optionally, the controller further includes a second interrupt trigger, the signal generating module includes a PPS signal input end and a synchronous frequency doubling pulse signal output end, the RGB camera includes a first exposure trigger and an RGB image data output end, and the depth camera includes a second exposure trigger and a depth image data output end;
the PPS signal output end is also connected with the PPS signal input end, and the synchronous frequency multiplication pulse signal output end is connected with the second interrupt trigger end, the first exposure trigger end and the second exposure trigger end, respectively.
Optionally, the controller is configured to:
when the synchronous frequency doubling pulse signal output by the synchronous frequency doubling pulse signal output end is received through the second interrupt trigger end, start timing; determine that the RGB camera has generated the RGB image data corresponding to the first exposure operation when the timed duration reaches a first preset duration, and determine that the depth camera has generated the depth image data corresponding to the second exposure operation when the timed duration reaches a second preset duration.
Optionally, the RGB camera further comprises an RGB image data output, the depth camera further comprises a depth image data output, and the controller further comprises an RGB image data input and a depth image data input;
The RGB image data output end is connected with the RGB image data input end, and the depth image data output end is connected with the depth image data input end.
Optionally, the controller is configured to:
when it is determined that the RGB camera has generated the RGB image data corresponding to the first exposure operation, receive, through the RGB image data input end, the RGB image data corresponding to the first exposure operation output by the RGB image data output end of the RGB camera, receive, through the depth image data input end, the depth image data corresponding to the second exposure operation output by the depth image data output end of the depth camera, and generate target data according to the target local time, wherein the target data comprises the target RGB image data and the target depth image data.
Optionally, the controller is configured to:
and generating target data comprising the target local time, the RGB image data and the depth image data according to a preset data frame format.
Optionally, the controller is configured to obtain a target difference value between the second local time and the first local time, obtain a sum value of the target UTC time and the target difference value, and use the sum value as the current UTC time.
According to a second aspect of embodiments of the present disclosure, there is provided a depth image acquisition method, the method comprising:
acquiring a target pulse signal and UTC time corresponding to a specified rising edge in the target pulse signal;
acquiring a first local time at which a target rising edge in the target pulse signal is received, and acquiring a current second local time;
determining target UTC time corresponding to the target rising edge;
and determining current UTC time according to the first local time, the target UTC time and the second local time, and taking the current UTC time as the current local time.
Optionally, the method is applied to a depth image acquisition device, the depth image acquisition device includes a positioning module, the target pulse signal is a PPS signal, the designated rising edge is each rising edge of the PPS signal, and the positioning module is configured to output the PPS signal and UTC time corresponding to each rising edge of the PPS signal;
the acquiring the target pulse signal and specifying UTC time corresponding to a rising edge in the target pulse signal includes:
and receiving the PPS signal output by the positioning module and UTC time corresponding to each rising edge in the PPS signal.
Optionally, the determining the target UTC time corresponding to the target rising edge includes:
and acquiring and analyzing target positioning information corresponding to the target rising edge to obtain the target UTC time corresponding to the target rising edge.
Optionally, acquiring the first local time when the target rising edge in the target pulse signal is received includes:
generating a first interrupt signal in response to receiving the target rising edge in the PPS signal;
and acquiring the local time corresponding to the first interrupt signal to obtain the first local time.
Optionally, the depth image acquisition device further comprises a signal generation module, an RGB camera and a depth camera; the signal generating module is configured to generate a synchronous frequency multiplication pulse signal corresponding to the PPS signal, the RGB camera is configured to perform a first exposure operation when the synchronous frequency multiplication pulse signal is received, and the depth camera is configured to perform a second exposure operation when the synchronous frequency multiplication pulse signal is received, where the method further includes:
recording the current target local time in response to receiving the synchronous frequency multiplication pulse signal;
generating target RGB image data with a timestamp corresponding to the target local time when the RGB camera generates RGB image data corresponding to the first exposure operation;
And generating target depth image data with a timestamp corresponding to the target local time when the depth camera generates the depth image data corresponding to the second exposure operation.
Optionally, the determining that the RGB camera generates the RGB image data corresponding to the first exposure operation includes:
when the synchronous frequency doubling pulse signal is received, starting timing, and determining that the RGB camera generates RGB image data corresponding to the first exposure operation under the condition that the timing time length is determined to be a first preset time length;
accordingly, the determining that the depth camera generates the depth image data corresponding to the second exposure operation includes:
and under the condition that the timing duration is determined to be a second preset duration, determining that the depth camera generates depth image data corresponding to the second exposure operation.
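A minimal sketch of this timer-based readiness check, with hypothetical names and preset durations, might look like:

```python
class ExposureTimer:
    """Track elapsed time since the synchronous frequency multiplication
    pulse arrived and report when each camera's image data can be assumed
    complete. The two preset durations are hypothetical values."""

    def __init__(self, rgb_ready_after=0.030, depth_ready_after=0.050):
        self.rgb_ready_after = rgb_ready_after      # first preset duration (s)
        self.depth_ready_after = depth_ready_after  # second preset duration (s)
        self.pulse_time = None

    def on_sync_pulse(self, now):
        # Start timing at the trigger edge of the sync pulse.
        self.pulse_time = now

    def rgb_image_ready(self, now):
        return (self.pulse_time is not None
                and now - self.pulse_time >= self.rgb_ready_after)

    def depth_image_ready(self, now):
        return (self.pulse_time is not None
                and now - self.pulse_time >= self.depth_ready_after)
```

In practice `now` would come from the controller's local timer; the two durations would be chosen to cover the worst-case exposure and readout time of each camera.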
Optionally, the method further comprises:
when the RGB camera is determined to generate RGB image data corresponding to the first exposure operation, acquiring the RGB image data corresponding to the first exposure operation, and acquiring depth image data corresponding to the second exposure operation;
and generating target data according to the target local time, wherein the target data comprises the target RGB image data and the target depth image data.
Optionally, the generating target data according to the target local time, the RGB image data and the depth image data includes:
and generating target data comprising the target local time, the RGB image data and the depth image data according to a preset data frame format.
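The disclosure does not define the preset data frame format; one hypothetical layout, a magic header, the timestamp, then length-prefixed RGB and depth payloads, could be packed as follows:

```python
import struct

FRAME_MAGIC = 0xD5AA  # hypothetical frame header marker

def pack_target_data(utc_timestamp, rgb_bytes, depth_bytes):
    """Pack one frame: magic, timestamp, then length-prefixed payloads."""
    header = struct.pack('<Hd', FRAME_MAGIC, utc_timestamp)
    rgb = struct.pack('<I', len(rgb_bytes)) + rgb_bytes
    depth = struct.pack('<I', len(depth_bytes)) + depth_bytes
    return header + rgb + depth

def unpack_target_data(frame):
    """Inverse of pack_target_data: recover timestamp and both payloads."""
    magic, ts = struct.unpack_from('<Hd', frame, 0)
    assert magic == FRAME_MAGIC
    off = struct.calcsize('<Hd')
    (rgb_len,) = struct.unpack_from('<I', frame, off)
    off += 4
    rgb = frame[off:off + rgb_len]
    off += rgb_len
    (depth_len,) = struct.unpack_from('<I', frame, off)
    off += 4
    depth = frame[off:off + depth_len]
    return ts, rgb, depth
```

Any fixed layout with the same three elements would serve; the point is that the timestamp travels in the same frame as both images.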
Optionally, the determining the current UTC time according to the first local time, the target UTC time, and the second local time includes:
acquiring a target difference value of the second local time and the first local time;
and obtaining the sum value of the target UTC time and the target difference value, and taking the sum value as the current UTC time.
According to a third aspect of embodiments of the present disclosure, a multi-sensor fusion system is provided, including the depth image capturing device provided in the first aspect above.
The technical scheme provided by the embodiments of the disclosure can have the following beneficial effects: the controller can acquire a target pulse signal and the UTC time corresponding to a designated rising edge in the target pulse signal, acquire a first local time at which a target rising edge in the target pulse signal is received, acquire a target UTC time corresponding to the target rising edge, acquire a current second local time, determine the current UTC time according to the first local time, the target UTC time and the second local time, and take the current UTC time as the current local time. In this way, the depth image acquisition device can be timed through the UTC time corresponding to the designated rising edge, so that the local time of the device is synchronized with UTC time. This guarantees the accuracy of the device's local time, facilitates generating image data with accurate timestamps, facilitates fusing the image data generated by the device with data collected by other sensors, and helps widen the application range of the depth image acquisition device.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a block diagram of a depth image capture device according to an exemplary embodiment;
FIG. 2 is a block diagram of a depth image capture device shown in another exemplary embodiment of the present disclosure;
FIG. 3 is a flow chart of a depth image acquisition method according to an exemplary embodiment of the present disclosure;
FIG. 4 is a flow chart of a depth image acquisition method according to the embodiment of FIG. 3 of the present disclosure;
FIG. 5 is a flow chart of another depth image acquisition method shown in accordance with the embodiment of FIG. 3 of the present disclosure;
fig. 6 is a block diagram illustrating another depth image capturing device according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
In an exemplary embodiment of the present disclosure, there is provided a depth image capturing apparatus, including a controller,
the controller is configured to receive a target pulse signal and the UTC (Coordinated Universal Time) time corresponding to a designated rising edge in the target pulse signal, acquire a first local time at which a target rising edge in the target pulse signal is received, acquire a target UTC time corresponding to the target rising edge, acquire a current second local time, determine the current UTC time according to the first local time, the target UTC time and the second local time, and use the current UTC time as the current local time.
The target pulse signal may be a PPS (Pulse Per Second) signal or another pulse signal known in the art. The designated rising edge may be each rising edge in the target pulse signal, or a rising edge selected every preset number of rising edges, where the preset number is chosen according to the frequency of the target pulse signal.
As shown in fig. 1, fig. 1 is a block diagram of a depth image capturing apparatus according to an exemplary embodiment; the depth image acquisition device comprises a controller 101 and a positioning module 102 connected with the controller 101,
The positioning module 102 is configured to provide PPS signals and UTC time corresponding to each rising edge of the PPS signals to the controller 101;
the controller 101 is configured to obtain a first local time of a target rising edge in the PPS signal output by the positioning module 102, obtain a target UTC time corresponding to the target rising edge output by the positioning module 102, obtain a current second local time, determine a current UTC time according to the first local time, the target UTC time and the second local time, and use the current UTC time as the current local time.
The positioning module 102 may be a GPS (Global Positioning System) module or another positioning device capable of providing UTC time. Where the positioning module 102 is a GPS module, the controller 101 may obtain GPRMC (recommended minimum positioning information) data or GPGGA (global positioning fix data) data output by the GPS module. Since the GPS module generally outputs one GPRMC (or GPGGA) sentence at each rising edge of the PPS signal, the target UTC time corresponding to the target rising edge can be obtained by parsing the GPRMC or GPGGA data. The target rising edge may be each rising edge in the PPS signal, or a rising edge designated at preset intervals, for example, every 3 seconds.
It should be noted that GPGGA is a GPS data output sentence format that generally contains 17 comma-separated fields: sentence header, UTC time, latitude, latitude hemisphere, longitude, longitude hemisphere, fix quality indicator, number of satellites, horizontal dilution of precision, altitude, altitude unit, geoidal height anomaly, its unit, age of differential GPS data, differential reference station ID, and the checksum/end marker.
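Since the UTC time is the second comma-separated field in both GPRMC and GPGGA sentences, extracting it can be sketched as follows (a simplified illustration; field handling beyond the time field is omitted):

```python
def parse_nmea_utc(sentence):
    """Extract the UTC time field from a GPRMC or GPGGA sentence.

    In both sentence types the UTC time (hhmmss.sss) is the field
    immediately after the sentence header, so one parser covers both.
    """
    fields = sentence.split(',')
    if fields[0] not in ('$GPRMC', '$GPGGA'):
        raise ValueError('unsupported sentence: ' + fields[0])
    raw = fields[1]  # e.g. '123519.00' means 12:35:19.00 UTC
    hours = int(raw[0:2])
    minutes = int(raw[2:4])
    seconds = float(raw[4:])
    return hours, minutes, seconds
```

A production parser would also validate the checksum and the fix-status field before trusting the time.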
In addition, the specific implementation manner of determining the current UTC time according to the first local time, the target UTC time and the second local time is as follows:
the controller 101 is configured to obtain a target difference value between the second local time and the first local time, and obtain a sum value of the target UTC time and the target difference value, and take the sum value as the current UTC time.
For example, the GPS module outputs a PPS signal (usually at a frequency of 1 Hz, i.e., one pulse per second) to the main control CPU (i.e., the controller) of the RGB-D camera. When the main control CPU receives a rising edge of the PPS signal, an interrupt is triggered, and the first local time T0 is obtained by recording the CPU timer time corresponding to the interrupt signal. The UART output signal of the GPS module is connected to a serial port input end of the main control CPU, which parses the GPRMC or GPGGA data in the UART output signal to obtain the UTC time T1 (i.e., the target UTC time) corresponding to the rising edge of the PPS signal associated with the interrupt. The main control CPU then reads the local time corresponding to the current CPU timer to obtain the second local time T2, computes T1 + (T2 - T0) to obtain the current UTC time T3, and uses T3 as the current local time. Further, before taking T3 as the current local time, the difference between T3 and the current local time may first be compared with a preset threshold: when the difference is less than the threshold, the local time is already sufficiently accurate and need not be rewritten, and T3 is written as the current local time only when the difference is not less than the threshold.
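The T0/T1/T2/T3 arithmetic in the example above can be sketched as follows (the function name and the threshold guard are hypothetical, reflecting one reading of the text):

```python
def current_utc(t0_local, t1_utc, t2_local, threshold=0.5):
    """Compute the current UTC time T3 = T1 + (T2 - T0), in seconds.

    t0_local: local timer value when the PPS rising-edge interrupt fired
    t1_utc:   UTC time parsed from the GPRMC/GPGGA sentence for that edge
    t2_local: current local timer value
    threshold: skip the clock rewrite when the correction is already
               smaller than this many seconds (hypothetical guard)
    """
    t3 = t1_utc + (t2_local - t0_local)
    # Only rewrite the local clock when the correction is significant.
    if abs(t3 - t2_local) < threshold:
        return t2_local
    return t3
```

For instance, with T0 = 10.0 s on the local timer, T1 = 1000.0 s UTC and T2 = 12.5 s, the elapsed time since the pulse is 2.5 s, so the current UTC time is 1002.5 s.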
According to the technical scheme, the depth image acquisition device can be timed through the UTC time corresponding to each rising edge of the PPS signal provided by the positioning module, so that the local time of the device is synchronized with UTC time. This guarantees the accuracy of the device's local time, facilitates generating image data with accurate timestamps, facilitates fusing the image data generated by the device with data collected by other sensors, and helps widen the application range of the depth image acquisition device.
Optionally, the positioning module 102 may include a PPS signal output end and a UART (Universal Asynchronous Receiver/Transmitter) signal output end, and the controller 101 includes a first interrupt trigger end and a serial port input end; the PPS signal output end is connected with the first interrupt trigger end, and the UART signal output end is connected with the serial port input end;
the UART signal output end is configured to output positioning information corresponding to each rising edge in the PPS signal, wherein the positioning information comprises UTC time;
the controller 101 is configured to obtain the target UTC time corresponding to the target rising edge by analyzing the target positioning information corresponding to the target rising edge output by the UART signal output terminal.
The positioning information may be GPRMC data or GPGGA data. Because the formats of GPRMC and GPGGA data are fixed, the UTC time field can be located and parsed directly, so the UTC time corresponding to each rising edge in the PPS signal can be obtained conveniently and quickly, which helps improve the time synchronization efficiency of the depth image acquisition device.
Optionally, the controller 101 is configured to generate a first interrupt signal when the first interrupt trigger receives the target rising edge output by the PPS signal output end, and acquire a local time corresponding to the first interrupt signal, so as to obtain the first local time.
It should be noted that, when the first interrupt trigger end receives the target rising edge output by the PPS signal output end, the first interrupt signal is generated, so that the controller 101 obtains the accurate local time when the target rising edge occurs by recording the time of the first interrupt signal, that is, obtains the first local time, and can effectively ensure the reliability of the first local time.
Optionally, fig. 2 is a block diagram of a depth image capturing device according to another exemplary embodiment of the present disclosure; as shown in fig. 2, the depth image capturing apparatus further includes a signal generating module 103, an RGB camera 104 and a depth camera 105, each connected to the controller 101;
The signal generating module 103 is configured to generate a synchronous frequency multiplication pulse signal corresponding to the PPS signal;
the RGB camera 104 configured to perform a first exposure operation upon receiving the synchronous double frequency pulse signal;
the depth camera 105 is configured to perform a second exposure operation upon receiving the synchronous frequency multiplication pulse signal;
the controller 101 is further configured to record a current target local time in response to receiving the synchronous frequency multiplication pulse signal, generate target RGB image data with a timestamp corresponding to the target local time when it is determined that the RGB camera 104 has generated the RGB image data corresponding to the first exposure operation, and generate target depth image data with a timestamp corresponding to the target local time when it is determined that the depth camera 105 has generated the depth image data corresponding to the second exposure operation.
The signal generating module 103 may be an FPGA (Field Programmable Gate Array) module or another circuit module capable of generating a synchronous frequency multiplication pulse signal corresponding to the PPS signal; many such circuit modules exist in the prior art, and the present disclosure does not limit which is used.
In addition, the synchronous frequency multiplication pulse signal contains a plurality of pulses per second, one of which coincides with the PPS pulse, and "receiving the synchronous frequency multiplication pulse signal" may refer to receiving a rising edge or a falling edge of that signal. The first exposure operation may be an operation (e.g., a trigger signal, instruction, or command) that triggers the RGB camera 104 to start generating RGB image data, and the second exposure operation may be an operation that triggers the depth camera 105 to start generating depth image data.
According to the technical scheme, the image data with the time stamp can be generated according to the accurate exposure time, and data fusion with the acquired data of other sensors is facilitated.
Optionally, the controller 101 further includes a second interrupt trigger, the signal generating module includes a PPS signal input and a synchronous double frequency pulse signal output, the RGB camera 104 includes a first exposure trigger and an RGB image data output, and the depth camera 105 includes a second exposure trigger and a depth image data output;
the PPS signal output end is also connected with the PPS signal input end, and the synchronous frequency multiplication pulse signal output end is connected with the second interrupt trigger end, the first exposure trigger end and the second exposure trigger end, respectively.
When the second interrupt trigger end receives a rising edge (or, alternatively, a falling edge) of the synchronous frequency multiplication pulse signal output by the synchronous frequency multiplication pulse signal output end, the controller 101 may generate a second interrupt signal and obtain the accurate time of that edge by recording the local time corresponding to the second interrupt signal.
In addition, the controller 101 may trigger the first exposure operation by sending a first preset trigger instruction (may be a high-low level or a code instruction) to the first exposure trigger end, and trigger the second exposure operation by sending a second preset trigger instruction (may be a high-low level or a code instruction) to the second exposure trigger end.
Optionally, the controller 101 is configured to:
when the synchronous frequency multiplication pulse signal output by the synchronous frequency multiplication pulse signal output end is received through the second interrupt trigger end, start timing; determine that the RGB camera 104 has generated the RGB image data corresponding to the first exposure operation when the timed duration reaches a first preset duration, and determine that the depth camera 105 has generated the depth image data corresponding to the second exposure operation when the timed duration reaches a second preset duration.
The first preset duration may be the same as or different from the second preset duration. "Receiving the synchronous frequency multiplication pulse signal through the second interrupt trigger end" may mean determining whether a trigger edge (a rising or falling edge) of the signal output by the synchronous frequency multiplication pulse signal output end is received; when such a trigger edge is received, the signal is determined to have been received through the second interrupt trigger end.
According to the technical scheme, the RGB camera 104 is determined to have generated the RGB image data corresponding to the first exposure operation once the first preset duration has elapsed since the time at which the first exposure operation was triggered (namely, the target local time), and the depth camera 105 is determined to have generated the depth image data corresponding to the second exposure operation once the second preset duration has elapsed since that same target local time. In this way, the target RGB image data carrying the timestamp corresponding to the target local time is generated only after the RGB image data is guaranteed to be complete, and the target depth image data carrying that timestamp is generated only after the depth image data is guaranteed to be complete, which effectively ensures the reliability of generating the target RGB image data and the target depth image data.
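The readout scheduling described above can be sketched as follows. This is a minimal illustration only: the function name, the use of integer milliseconds, and returning "ready" times instead of running a hardware timer are all assumptions not found in the disclosure.

```python
# Hedged sketch of the preset-duration readout scheme: on each trigger edge
# the controller records the target local time, then treats the RGB / depth
# frames as complete after the first / second preset durations.

def readout_schedule(target_local_time_ms, first_preset_ms, second_preset_ms):
    """Given the target local time recorded at the trigger edge, return the
    local times (ms) at which the RGB frame and the depth frame are deemed
    complete; the two presets may be equal or different."""
    rgb_ready = target_local_time_ms + first_preset_ms     # first preset duration
    depth_ready = target_local_time_ms + second_preset_ms  # second preset duration
    return rgb_ready, depth_ready

# E.g. trigger at t = 100 000 ms, RGB deemed ready 30 ms later, depth 45 ms later.
rgb_t, depth_t = readout_schedule(100_000, 30, 45)
```

Both frames inherit the single target local time as their timestamp, which is what keeps the RGB and depth data mutually aligned.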
Optionally, the RGB camera 104 further comprises an RGB image data output, the depth camera 105 further comprises a depth image data output, the controller 101 further comprises an RGB image data input and a depth image data input;
the RGB image data output end is connected with the RGB image data input end, and the depth image data output end is connected with the depth image data input end.
Wherein the RGB camera 104 may transmit the generated RGB image data to the RGB image data input end of the controller 101 through the RGB image data output end, so that the controller 101 receives the RGB image data generated by the RGB camera 104; similarly, the depth camera 105 may transmit the generated depth image data through the depth image data output end, and the controller 101 receives the depth image data generated by the depth camera 105 through the depth image data input end.
Optionally, the controller 101 is configured to:
when it is determined that the RGB camera 104 has generated the RGB image data corresponding to the first exposure operation, the controller receives, through the RGB image data input end, the RGB image data output by the RGB image data output end of the RGB camera 104, and receives, through the depth image data input end, the depth image data corresponding to the second exposure operation output by the depth image data output end of the depth camera 105; it then generates, from the RGB image data and the depth image data according to the target local time, target data including the target RGB image data and the target depth image data.
An implementation of generating the target data according to the target local time, the RGB image data, and the depth image data may be:
the controller 101 generates target data including the target local time, the RGB image data, and the depth image data in a preset data frame format.
It should be noted that the target data may be a single piece of data satisfying the preset data frame format, which includes the RGB image data, the depth image data, and the corresponding timestamp; alternatively, the target data may be two pieces of data satisfying the preset data frame format, where one piece is the target RGB image data and the other is the target depth image data.
In addition, it should be noted that the preset data frame format may include a frame header and data information, and may be any data frame format known in the art; since many such formats exist, they are not enumerated here.
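As one illustration only — the disclosure deliberately leaves the frame format open — a frame with a header (magic marker, timestamp, payload lengths) followed by the RGB and depth payloads could be packed as below. The magic value, field widths, and little-endian byte order are all hypothetical:

```python
# Hypothetical frame layout for "a frame header and data information".
import struct

FRAME_MAGIC = 0xD5AA   # hypothetical header marker, not from the disclosure
HEADER_FMT = "<HQII"   # magic, timestamp (us), RGB length, depth length

def pack_target_frame(target_local_time_us, rgb_bytes, depth_bytes):
    """Pack one target-data frame: header followed by RGB and depth payloads."""
    header = struct.pack(HEADER_FMT, FRAME_MAGIC, target_local_time_us,
                         len(rgb_bytes), len(depth_bytes))
    return header + rgb_bytes + depth_bytes

def unpack_target_frame(frame):
    """Inverse of pack_target_frame; returns (timestamp, rgb, depth)."""
    magic, ts, n_rgb, n_depth = struct.unpack_from(HEADER_FMT, frame)
    if magic != FRAME_MAGIC:
        raise ValueError("bad frame header")
    body = frame[struct.calcsize(HEADER_FMT):]
    return ts, body[:n_rgb], body[n_rgb:n_rgb + n_depth]
```

Carrying both payloads and one shared timestamp in a single frame corresponds to the "one piece of data" variant above; the "two pieces" variant would simply pack each payload with the same timestamp separately.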
The PPS signal output by the GPS module is sent to the FPGA module, so that the FPGA module outputs a high-frequency signal synchronized with the PPS signal (for example, at 20 Hz or 30 Hz), i.e., the synchronous frequency multiplication pulse signal. Within each second, one trigger edge (rising edge or falling edge) of this high-frequency signal coincides with the trigger edge (rising edge or falling edge) of the PPS signal. The synchronous frequency multiplication pulse signal is connected to the second interrupt trigger end of the main control CPU (i.e., the controller); when a trigger edge of the signal is received, a second interrupt signal is generated and the target local time T4 corresponding to the second interrupt signal is read. The synchronous frequency multiplication pulse signal is also used to trigger the RGB camera and the depth camera to expose simultaneously. After the exposure ends, the main control CPU reads the RGB image data generated by the RGB camera and the depth image data generated by the depth camera and stamps them with T4, thereby generating target data including the RGB image data and the depth image data.
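The FPGA's frequency multiplication can be modeled in software as follows. This is an illustrative sketch only: the real device implements this in FPGA logic, and the function name and edge-list representation are assumptions.

```python
# Software model of frequency multiplication: one trigger edge of the
# multiplied signal coincides with each PPS rising edge, and rate_hz edges
# are spread evenly across the PPS interval.

def multiplied_edges(pps_edge_time, next_pps_edge_time, rate_hz):
    """Return the local times of the rate_hz trigger edges in one PPS
    interval; edge 0 is aligned with the PPS edge itself."""
    period = (next_pps_edge_time - pps_edge_time) / rate_hz
    return [pps_edge_time + k * period for k in range(rate_hz)]

# A 20 Hz multiplied signal places an edge every 50 ms, starting on the PPS edge.
edges = multiplied_edges(0.0, 1.0, 20)
```

Aligning edge 0 with the PPS edge is what lets every T4 timestamp be traced back to an absolute UTC second.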
According to the technical scheme, the depth image acquisition device can generate target data with accurate time stamps, and reliable data basis can be provided for data fusion of multiple sensors.
FIG. 3 is a flow chart of a depth image acquisition method according to an exemplary embodiment of the present disclosure; as shown in fig. 3, the method may include the steps of:
step 301, acquiring a target pulse signal and UTC time corresponding to a specified rising edge in the target pulse signal.
The depth image acquisition method can be applied to a depth image acquisition device that includes a positioning module; the target pulse signal is a PPS signal, the designated rising edge is each rising edge in the PPS signal, and the positioning module is configured to output the PPS signal and the UTC time corresponding to each rising edge in the PPS signal.
In this step, the PPS signal output by the positioning module and UTC time corresponding to each rising edge in the PPS signal may be received.
Step 302, obtaining a first local time when a target rising edge in the target pulse signal is received and a current second local time.
In this step, an embodiment of obtaining the first local time when the target rising edge in the target pulse signal is received may be: generating a first interrupt signal in response to receiving the target rising edge in the PPS signal; and acquiring the local time corresponding to the first interrupt signal to obtain the first local time.
Step 303, determining a target UTC time corresponding to the target rising edge.
In this step, the target positioning information corresponding to the target rising edge may be obtained and parsed, so as to obtain the target UTC time corresponding to the target rising edge.
Step 304, determining a current UTC time according to the first local time, the target UTC time and the second local time, and taking the current UTC time as the current local time.
In this step, a target difference value between the second local time and the first local time may be obtained; and obtaining the sum value of the target UTC time and the target difference value, and taking the sum value as the current UTC time.
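The arithmetic of steps 302 to 304 can be sketched as follows (function and variable names are illustrative, and times are plain seconds for simplicity; the disclosure does not fix a representation):

```python
# Hedged sketch of the time-service computation: current UTC = target UTC
# plus the local time elapsed since the target PPS rising edge.

def current_utc_time(first_local, target_utc, second_local):
    """first_local:  local clock reading captured when the target PPS rising
                     edge arrived (i.e., via the first interrupt signal)
    target_utc:   UTC time reported for that rising edge (e.g. parsed from
                  the positioning information on the UART)
    second_local: current local clock reading
    Returns target_utc + (second_local - first_local)."""
    target_difference = second_local - first_local  # elapsed local time
    return target_utc + target_difference           # sum taken as current UTC

# PPS edge seen at local t = 12.000 s, GPS reports that edge as UTC
# 1 700 000 000.000 s; a quarter second later, the current UTC is the sum.
now_utc = current_utc_time(12.000, 1_700_000_000.000, 12.250)
```

The accuracy of the result therefore depends only on the stability of the local clock over the fraction of a second since the last PPS edge, not on its absolute offset.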
According to the technical scheme, the depth image acquisition device is time-serviced through the UTC time corresponding to each rising edge in the PPS signal from the positioning module, so that the local time of the depth image acquisition device is synchronized with UTC time. This guarantees the accuracy of the local time of the depth image acquisition device, facilitates generating image data with accurate timestamps, facilitates data fusion between the image data generated by the depth image acquisition device and data acquired by other sensors, and helps enlarge the application range of the depth image acquisition device.
Optionally, the depth image acquisition device further comprises a signal generation module, an RGB camera, and a depth camera. The signal generation module is configured to generate a synchronous frequency multiplication pulse signal corresponding to the PPS signal, the RGB camera is configured to perform a first exposure operation when receiving the synchronous frequency multiplication pulse signal, and the depth camera is configured to perform a second exposure operation when receiving the synchronous frequency multiplication pulse signal. FIG. 4 is a flowchart of a depth image acquisition method according to the embodiment shown in FIG. 3 of the present disclosure; as shown in FIG. 4, the method may further include:
In step 305, in response to receiving the synchronous frequency multiplication pulse signal, the current target local time is recorded.
Step 306, generating target RGB image data with the target local time corresponding time stamp when determining that the RGB camera generates RGB image data corresponding to the first exposure operation.
The above embodiment of determining that the RGB camera generates the RGB image data corresponding to the first exposure operation may be that, when the synchronous frequency multiplication pulse signal is received, timing is started, and when the timing duration is determined to be a first preset duration, it is determined that the RGB camera generates the RGB image data corresponding to the first exposure operation.
In step 307, when it is determined that the depth camera generates depth image data corresponding to the second exposure operation, target depth image data with a timestamp corresponding to the target local time is generated.
In this step, the implementation manner of determining that the depth camera generates the depth image data corresponding to the second exposure operation may be that, when it is determined that the timing duration is the second preset duration, it is determined that the depth camera generates the depth image data corresponding to the second exposure operation.
According to the technical scheme, the target RGB image data with the target local time corresponding time stamp can be generated under the condition that the RGB image data is ensured to be completed, and the target depth image data with the target local time corresponding time stamp can be generated under the condition that the depth image data is ensured to be completed, so that the reliability of generating the target RGB image data and the target depth image data can be effectively ensured.
FIG. 5 is a flow chart of another depth image acquisition method according to the embodiment shown in FIG. 3 of the present disclosure; the method may further comprise:
step 308, when it is determined that the RGB camera generates RGB image data corresponding to the first exposure operation, acquiring RGB image data corresponding to the first exposure operation, and acquiring depth image data corresponding to the second exposure operation.
Step 309, generating target data according to the target local time, the RGB image data and the depth image data.
Wherein the target data includes the target RGB image data and the target depth image data.
In this step, target data including the target local time, the RGB image data, and the depth image data may be generated in a preset data frame format.
According to the technical scheme, the depth image acquisition device can generate target data with accurate time stamps, and reliable data basis can be provided for data fusion of multiple sensors.
The detailed description of the steps in the above embodiments has been described in detail in the embodiments of the apparatus, and will not be explained in detail here.
In a multi-sensor fusion system according to another exemplary embodiment of the present disclosure, the system includes the depth image acquisition device (RGB-D camera) provided in FIG. 1 or FIG. 2, a plurality of multi-line lidars, and a host. The multi-sensor fusion system may further include an additional FPGA module and a GPS module configured to output high-frequency control signals to the multi-line lidars and the plurality of RGB-D cameras, thereby controlling their operation. The host is configured to fuse the radar signals provided by the multi-line lidars with the image data acquired by the depth image acquisition device, so that the acquired radar signals and image data are aligned in time; the alignment effect can be evaluated through the RTK results provided by the GPS, which helps improve the fusion effect.
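The host-side temporal alignment can be illustrated with a simple nearest-timestamp match. The matching strategy, the tolerance value, and all names below are assumptions; the disclosure states only that the radar signals and image data are aligned in time.

```python
# Hedged sketch: for each lidar sweep timestamp, pick the camera target-data
# frame whose timestamp is nearest, dropping sweeps with no close-enough frame.

def align_by_timestamp(lidar_ts, camera_ts, max_skew=0.025):
    """Return (lidar index, camera index) pairs whose timestamps differ by at
    most max_skew seconds; unmatched lidar sweeps are dropped."""
    pairs = []
    for i, lt in enumerate(lidar_ts):
        j = min(range(len(camera_ts)), key=lambda k: abs(camera_ts[k] - lt))
        if abs(camera_ts[j] - lt) <= max_skew:
            pairs.append((i, j))
    return pairs

matches = align_by_timestamp([0.00, 0.10, 0.20], [0.01, 0.12, 0.35])
```

Such matching is only meaningful because every device stamps its data against the same PPS-disciplined UTC timeline, which is the point of the time-service scheme above.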
Fig. 6 is a block diagram illustrating another depth image capturing device according to an exemplary embodiment. For example, apparatus 700 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 6, an apparatus 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls overall operation of the apparatus 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 702 may include one or more processors 720 to execute instructions to perform all or part of the steps of the depth image acquisition method described above. Further, the processing component 702 can include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operations at the apparatus 700. Examples of such data include instructions for any application or method operating on the apparatus 700, contact data, phonebook data, messages, pictures, videos, and the like. The memory 704 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 706 provides power to the various components of the device 700. Power component 706 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 700.
The multimedia component 708 includes a screen that provides an output interface between the apparatus 700 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 708 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the apparatus 700 is in an operational mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a Microphone (MIC) configured to receive external audio signals when the device 700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 704 or transmitted via the communication component 716. In some embodiments, the audio component 710 further includes a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 714 includes one or more sensors for providing status assessment of various aspects of the apparatus 700. For example, the sensor assembly 714 may detect an on/off state of the device 700, a relative positioning of the assemblies, such as a display and keypad of the device 700, a change in position of the device 700 or a component of the device 700, the presence or absence of user contact with the device 700, an orientation or acceleration/deceleration of the device 700, and a change in temperature of the device 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate communication between the apparatus 700 and other devices in a wired or wireless manner. The apparatus 700 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 716 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 700 may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the above-described depth image acquisition methods.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 704 including instructions executable by processor 720 of apparatus 700 to perform the depth image acquisition method described above. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (21)

1. A depth image acquisition device is characterized by comprising a controller,
the controller is configured to receive a target pulse signal and UTC time corresponding to a designated rising edge in the target pulse signal, acquire a first local time for receiving the target rising edge in the target pulse signal, acquire a target UTC time corresponding to the target rising edge, acquire a current second local time, determine the current UTC time according to the first local time, the target UTC time and the second local time, and use the current UTC time as the current local time.
2. The depth image capture device of claim 1, further comprising a positioning module coupled to the controller, the target pulse signal being a PPS signal, the designated rising edge being each rising edge of the PPS signal;
the positioning module is configured to provide the PPS signal and UTC time corresponding to each rising edge of the PPS signal to the controller.
3. The depth image capture device of claim 2, wherein the positioning module comprises a PPS signal output and a UART signal output, the controller comprising a first interrupt trigger and a serial input; the PPS signal output end is connected with the first interrupt trigger end, and the UART signal output end is connected with the serial port input end;
the UART signal output end is configured to output positioning information corresponding to each rising edge in the PPS signal, wherein the positioning information comprises UTC time;
the controller is configured to obtain the target UTC time corresponding to the target rising edge by analyzing the target positioning information corresponding to the target rising edge output by the UART signal output end.
4. The depth image capturing device according to claim 3, wherein the controller is configured to generate a first interrupt signal when the first interrupt trigger receives the target rising edge output by the PPS signal output end, and acquire a local time corresponding to the first interrupt signal, so as to obtain the first local time.
5. A depth image capture device as recited in claim 3, further comprising a signal generation module, RGB camera and depth camera coupled to the controller;
the signal generation module is configured to generate a synchronous frequency multiplication pulse signal corresponding to the PPS signal;
the RGB camera is configured to execute a first exposure operation when the synchronous frequency multiplication pulse signal is received;
the depth camera is configured to perform a second exposure operation when the synchronous frequency multiplication pulse signal is received;
the controller is further configured to record a current target local time in response to receiving the synchronous frequency multiplication pulse signal, generate target RGB image data with a timestamp corresponding to the target local time when the RGB camera is determined to generate RGB image data corresponding to the first exposure operation, and generate target depth image data with the timestamp corresponding to the target local time when the depth camera is determined to generate depth image data corresponding to the second exposure operation.
6. The depth image capture device of claim 5, wherein the controller further comprises a second interrupt trigger, the signal generation module comprises a PPS signal input and a synchronous double frequency pulse signal output, the RGB camera comprises a first exposure trigger and an RGB image data output, and the depth camera comprises a second exposure trigger and a depth image data output;
the PPS signal output end is further connected with the PPS signal input end, and the synchronous frequency multiplication pulse signal output end is connected with the second interrupt trigger end, the first exposure trigger end, and the second exposure trigger end, respectively.
7. The depth image capture device of claim 6, wherein the controller is configured to:
when the synchronous frequency multiplication pulse signal output by the synchronous frequency multiplication pulse signal output end is received through the second interrupt trigger end, starting timing, determining that the RGB camera generates RGB image data corresponding to the first exposure operation under the condition that the timing duration is determined to be a first preset duration, and determining that the depth camera generates depth image data corresponding to the second exposure operation under the condition that the timing duration is determined to be a second preset duration.
8. The depth image capture device of claim 7 wherein the RGB camera further comprises an RGB image data output, the depth camera further comprises a depth image data output, the controller further comprises an RGB image data input and a depth image data input;
the RGB image data output end is connected with the RGB image data input end, and the depth image data output end is connected with the depth image data input end.
9. The depth image capture device of claim 8, wherein the controller is configured to:
when the RGB camera is determined to generate the RGB image data corresponding to the first exposure operation, receiving the RGB image data corresponding to the first exposure operation output by the RGB image data output end in the RGB camera through the RGB image data input end, and receiving the depth image data corresponding to the second exposure operation output by the depth image data output end in the depth camera through the depth image data input end, and generating target data according to the target local time, wherein the target data comprises the target RGB image data and the target depth image data.
10. The depth image capture device of claim 9, wherein the controller is configured to:
and generating target data comprising the target local time, the RGB image data and the depth image data according to a preset data frame format.
11. The depth image capture device of any one of claims 1-10, wherein the controller is configured to obtain a target difference value for the second local time and the first local time, and to obtain a sum of the target UTC time and the target difference value, and to take the sum as the current UTC time.
12. A depth image acquisition method, the method comprising:
acquiring a target pulse signal and UTC time corresponding to a specified rising edge in the target pulse signal;
acquiring a first local time at which a target rising edge in the target pulse signal is received, and a current second local time;
determining target UTC time corresponding to the target rising edge;
and determining current UTC time according to the first local time, the target UTC time and the second local time, and taking the current UTC time as the current local time.
13. The method of claim 12, wherein the method is applied to a depth image acquisition device comprising a positioning module, the target pulse signal being a PPS signal, the designated rising edge being each rising edge of the PPS signal, the positioning module configured to output the PPS signal and UTC time corresponding to each rising edge of the PPS signal;
the acquiring the target pulse signal and specifying UTC time corresponding to a rising edge in the target pulse signal includes:
and receiving the PPS signal output by the positioning module and UTC time corresponding to each rising edge in the PPS signal.
14. The method of claim 13, wherein the determining the target UTC time for the target rising edge comprises:
and acquiring and analyzing target positioning information corresponding to the target rising edge to obtain the target UTC time corresponding to the target rising edge.
15. The method of claim 13, wherein obtaining a first local time at which a target rising edge in the target pulse signal is received comprises:
generating a first interrupt signal in response to receiving the target rising edge in the PPS signal;
and acquiring the local time corresponding to the first interrupt signal to obtain the first local time.
16. The method of claim 13, wherein the depth image capture device further comprises a signal generation module, an RGB camera, and a depth camera; the signal generating module is configured to generate a synchronous frequency multiplication pulse signal corresponding to the PPS signal, the RGB camera is configured to perform a first exposure operation when the synchronous frequency multiplication pulse signal is received, and the depth camera is configured to perform a second exposure operation when the synchronous frequency multiplication pulse signal is received, where the method further includes:
recording the current target local time in response to receiving the synchronous frequency multiplication pulse signal;
generating target RGB image data with a timestamp corresponding to the target local time when the RGB camera generates RGB image data corresponding to the first exposure operation;
and generating target depth image data with a timestamp corresponding to the target local time when the depth camera generates the depth image data corresponding to the second exposure operation.
17. The method of claim 16, wherein the determining that the RGB camera generated RGB image data corresponding to the first exposure operation comprises:
when the synchronous frequency multiplication pulse signal is received, starting timing, and determining that the RGB camera generates RGB image data corresponding to the first exposure operation under the condition that the timing duration is determined to be a first preset duration;
accordingly, the determining that the depth camera generates the depth image data corresponding to the second exposure operation includes:
and under the condition that the timing duration is determined to be a second preset duration, determining that the depth camera generates depth image data corresponding to the second exposure operation.
18. The method of claim 16, wherein the method further comprises:
when the RGB camera is determined to generate RGB image data corresponding to the first exposure operation, acquiring the RGB image data corresponding to the first exposure operation, and acquiring depth image data corresponding to the second exposure operation;
and generating target data according to the target local time, the RGB image data, and the depth image data, wherein the target data comprises the target RGB image data and the target depth image data.
19. The method of claim 18, wherein generating target data from the RGB image data and the depth image data according to the target local time comprises:
and generating target data comprising the target local time, the RGB image data, and the depth image data according to a preset data frame format.
20. The method of any of claims 12-19, wherein said determining a current UTC time from the first local time, the target UTC time, and the second local time comprises:
acquiring a target difference value of the second local time and the first local time;
and obtaining the sum value of the target UTC time and the target difference value, and taking the sum value as the current UTC time.
21. A multi-sensor fusion system comprising the depth image capture device of any one of claims 1 to 11.
CN202210302778.2A 2022-03-24 2022-03-24 Depth image acquisition device and method and multi-sensor fusion system Pending CN116862967A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210302778.2A CN116862967A (en) 2022-03-24 2022-03-24 Depth image acquisition device and method and multi-sensor fusion system


Publications (1)

Publication Number Publication Date
CN116862967A 2023-10-10

Family

ID=88220330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210302778.2A Pending CN116862967A (en) 2022-03-24 2022-03-24 Depth image acquisition device and method and multi-sensor fusion system

Country Status (1)

Country Link
CN (1) CN116862967A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination