CN117394938A - Camera, multi-sensor fusion system and autonomous mobile device - Google Patents


Info

Publication number
CN117394938A
Authority
CN
China
Prior art keywords
time
camera
time service
signal
output end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210760513.7A
Other languages
Chinese (zh)
Inventor
包鼎华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202210760513.7A priority Critical patent/CN117394938A/en
Publication of CN117394938A publication Critical patent/CN117394938A/en
Pending legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04JMULTIPLEX COMMUNICATION
    • H04J3/00Time-division multiplex systems
    • H04J3/02Details
    • H04J3/06Synchronising arrangements
    • H04J3/0635Clock or time synchronisation in a network
    • H04J3/0638Clock or time synchronisation among nodes; Internode synchronisation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The disclosure relates to a camera, a multi-sensor fusion system, and an autonomous mobile device. The multi-sensor fusion system comprises an image acquisition device including a plurality of cameras, among them at least one master camera; the master cameras are electrically connected to each other. Each master camera comprises a time service module with a time service signal output end for outputting a time service signal. At least one slave camera is electrically connected with each master camera. The time service module of one master camera outputs the time service signal to the other master cameras and to each slave camera, so that, according to the time service signal, the other master cameras and the at least one slave camera update their local time to be consistent with the local time of the master camera that output the signal.

Description

Camera, multi-sensor fusion system and autonomous mobile device
Technical Field
The disclosure relates to the technical field of terminals, and in particular to a camera, a multi-sensor fusion system, and an autonomous mobile device.
Background
Currently, unmanned devices are typically configured with a plurality of sensors of the same type, or of different types, to detect information, and the travel information required by the device is obtained by fusing the detected information. The accuracy of the travel information depends closely on how precisely the detection information from the plurality of sensors is fused; keeping the local times of the plurality of sensors consistent is therefore a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The present disclosure provides a camera, a multi-sensor fusion system, and an autonomous mobile device that address deficiencies in the related art.
According to a first aspect of embodiments of the present disclosure, there is provided a multi-sensor fusion system comprising:
an image acquisition device, the image acquisition device comprising a plurality of cameras, the plurality of cameras comprising:
at least one master camera, the at least one master camera being electrically connected to each other; the master camera comprises a time service module, the time service module comprising a time service signal output end for outputting a time service signal;
at least one slave camera electrically connected with each master camera;
wherein the time service module of one master camera outputs the time service signal to the other master cameras and to each slave camera, so that the other master cameras and the at least one slave camera update their local time, according to the time service signal, to be consistent with the local time of the master camera outputting the time service signal.
Optionally, there is one master camera; the time service module of the master camera outputs the time service signal to each slave camera so that the local time of the at least one slave camera is updated, according to the time service signal, to be consistent with the local time of the master camera.
Optionally, the slave camera includes a time service signal input end, and the time service signal input end is connected with the time service signal output end and is used for receiving the time service signal.
Optionally, the main camera includes a first processor, the first processor includes the time service module, the time service signal output end includes a pulse signal output end and a serial port signal output end, the pulse signal output end is used for outputting a pulse signal, and the serial port signal output end is used for outputting a serial port signal; the time service signal comprises the pulse signal and the serial port signal;
the slave camera comprises a second processor, the second processor comprises a time service signal input end, the time service signal input end comprises a pulse signal input end and a serial port signal input end, and the serial port signal input end is connected with the serial port signal output end and is used for receiving the serial port signal; the pulse signal input end is connected with the pulse signal output end and is used for receiving the pulse signal.
Optionally, the second processor is configured to perform the following operations:
recording a first local moment of the slave camera when the pulse signal is received;
parsing the serial port signal to obtain a first moment, the first moment being the moment of the main camera when the pulse signal output end outputs the pulse signal;
recording a second local time of the slave camera when the first time is acquired;
determining, according to the first local moment, the second local moment and the first moment, a second moment of the main camera corresponding to the second local moment;
and according to the second moment, updating the current local moment of the slave camera to be consistent with the second moment.
Optionally, the second processor includes a computing module, where the computing module is configured to perform the following operations:
calculating to obtain a difference value between the second local time and the first local time;
and calculating the sum of the difference and the first moment to obtain the second moment.
Optionally, the second processor is configured to update the current local time of the slave camera when the difference between the second local time and the second time is greater than a preset threshold; or
The second processor is used for updating the current local time of the slave camera according to the set time period.
Optionally, the device further comprises a trigger module arranged in the image acquisition device, and the trigger module is electrically connected with the cameras;
the trigger module comprises a trigger signal generation module and a trigger signal output end, and the trigger signal generation module is used for generating a first trigger signal; the trigger signal output end is connected with the trigger signal generation module and is used for outputting the first trigger signal;
and the triggering module is used for outputting the first triggering signal to at least one camera to execute exposure operation when the local time updates of the cameras are consistent, so that the image acquisition device executes image acquisition operation to acquire image information.
Optionally, the trigger signal generating module includes a first input end, and the first input end is connected with the time service signal output end;
the trigger signal output terminal includes:
the first output end is connected with the main camera and is used for outputting the first trigger signal to the main camera;
the second output end is connected with the slave camera and is used for outputting the first trigger signal to the slave camera.
Optionally, the method further comprises:
the host is electrically connected with the image acquisition device, and is used for receiving the image information acquired by the image acquisition device and fusing and processing the image information according to the time stamp recorded when the image acquisition device performs the image acquisition operation.
Optionally, the method further comprises:
the laser radar device is electrically connected with the image acquisition device and the host; the time service module of the image acquisition device outputs the time service signal and a second trigger signal to the laser radar device, so that the laser radar device updates its local time, according to the time service signal, to be consistent with the local time of the image acquisition device, and performs a ranging operation according to the second trigger signal to obtain distance information;
and the host machine fuses and processes the distance information and the image information according to the time stamp after the local time is updated by the laser radar device.
Optionally, the ranging operation performed by the laser radar device and the image acquisition operation performed by the image acquisition device are performed at the same time.
According to a second aspect of embodiments of the present disclosure, there is provided an autonomous mobile device comprising:
the multi-sensor fusion system of any of the above embodiments.
According to a third aspect of embodiments of the present disclosure, there is provided a camera, comprising: the time service module comprises a time service signal output end, and the time service signal output end is used for outputting a time service signal to external equipment so that the external equipment can update the local time according to the time service signal.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
according to the embodiment, through the technical scheme, one master camera can generate time signals to other master cameras or each slave camera, and the cameras receiving the time signals update the local time to be consistent with the local time of the master camera outputting the time signals, so that the local time of a plurality of cameras can be kept consistent, the fusion error of information among different cameras caused by the time difference of the local time of the plurality of cameras is reduced or avoided, and the fusion of the image information of the same moment before the plurality of cameras is facilitated, and the fusion precision is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a block diagram illustrating a multi-sensor fusion system according to an exemplary embodiment;
FIG. 2 is a block diagram illustrating yet another image acquisition device according to an exemplary embodiment;
FIG. 3 is a schematic diagram illustrating a connection framework between a master camera and a slave camera according to an exemplary embodiment;
fig. 4 is a block diagram illustrating a multi-sensor fusion system including a host and a lidar device according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this disclosure to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, without departing from the scope of the present disclosure, first information may also be referred to as second information, and similarly, second information may also be referred to as first information. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
Fig. 1 is a block diagram illustrating a multi-sensor fusion system according to an exemplary embodiment. The multi-sensor fusion system includes an image acquisition device 10. The image acquisition device 10 comprises a plurality of cameras including at least one master camera 11, the at least one master camera 11 being electrically connected to each other; in the embodiment shown in fig. 1, the image acquisition device 10 comprises two master cameras 11, which are electrically connected to each other. Here, an electrical connection includes a communication connection, a wired connection, a wireless connection, and the like. The master camera 11 includes a time service module 111, and the time service module includes a time service signal output 1111, where the time service signal output 1111 is configured to output a time service signal.
The plurality of cameras comprises at least one slave camera 12, the slave camera 12 being electrically connected to each of said master cameras 11. In the embodiment shown in fig. 1, the image acquisition device 10 comprises two slave cameras 12, each slave camera 12 being electrically connected to two master cameras 11.
The time service module of one master camera 11 outputs the time service signal to the other master cameras 11 and to each slave camera 12, so that the local time of the other master cameras 11 and the at least one slave camera 12 is updated, according to the time service signal, to be consistent with the local time of the master camera 11 that output the signal. That is, one of the master cameras 11 may output the time service signal to the remaining master cameras 11, and the same master camera 11 may also output the time service signal to one or more of the slave cameras 12.
As can be seen from the above embodiments, one master camera 11 may output a time service signal to the other master cameras 11 or to each slave camera 12, and the camera receiving the signal updates its local time to be consistent with the local time of the master camera 11 that output it. The local times of the multiple cameras can thus be kept consistent, reducing or avoiding fusion errors between cameras caused by differences in their local times, which facilitates fusing image information captured at the same moment by the multiple cameras and improves fusion precision.
Fig. 2 shows a schematic block diagram of one master camera 11 and a plurality of slave cameras 12 electrically connected. As shown in fig. 2, there are two slave cameras 12, and a time service signal is output to each slave camera 12 by the time service module 111 of the master camera 11 so that the local time of at least one slave camera 12 is updated to be identical to the local time of the master camera 11 according to the time service signal. In this way, the local time of the cameras in the image acquisition device 10 is kept consistent, and time signals are output to the slave cameras 12 through one master camera 11, so that the connection is simple and the operation is convenient.
Referring to fig. 2 and 3 in combination, the slave camera 12 includes a timing signal input 123, and the timing signal input 123 is connected to the timing signal output 1111 for receiving the timing signal. Thus, the time service signal is convenient to transmit.
The main camera 11 comprises a first processor 112, the first processor 112 comprises a time service module 111, a time service signal output end 1111 comprises a pulse signal output end 1112 and a serial port signal output end 1113, the pulse signal output end 1112 is used for outputting a pulse signal, and the serial port signal output end 1113 is used for outputting a serial port signal; the time service signal comprises a pulse signal and a serial port signal. In some embodiments, the pulse signal output 1112 and the serial signal output 1113 may be provided at ports of the first processor 112.
The pulse signal output 1112 may include a PPS (Pulse Per Second) output, and accordingly the pulse signal may include a PPS pulse. The serial signal may include GPRMC data or GPGGA data output by the time service module 111; the time service module 111 may output one GPRMC or GPGGA sentence after each pulse. GPGGA data is a GPS (Global Positioning System) data-output sentence, typically comprising 17 comma-separated fields: the sentence header, UTC time, latitude, latitude hemisphere, longitude, longitude hemisphere, positioning quality indicator, number of satellites, horizontal dilution of precision, altitude, altitude unit, geoidal separation, separation unit, age of the differential GPS data, differential reference station ID, checksum, and end mark; fields for which no data exists may be left blank.
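As an illustration of how the UTC time can be recovered from such a serial sentence, the following minimal Python sketch parses a GPRMC sentence. The field layout follows the NMEA 0183 convention; the sample sentence and function name are illustrative, not taken from the patent.

```python
# Minimal sketch: extract the UTC time field from a GPRMC sentence.
# Field 1 of a $GPRMC sentence is the UTC time in hhmmss.sss form.
def parse_gprmc_time(sentence: str) -> tuple[int, int, float]:
    """Return (hours, minutes, seconds) from a $GPRMC sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GPRMC"):
        raise ValueError("not a GPRMC sentence")
    utc = fields[1]  # hhmmss.sss
    return int(utc[0:2]), int(utc[2:4]), float(utc[4:])

h, m, s = parse_gprmc_time(
    "$GPRMC,123519.00,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A"
)
print(h, m, s)  # 12 35 19.0
```

In a real receiver the checksum after `*` would also be verified before the time field is trusted.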
The slave camera 12 comprises a second processor 122, the second processor 122 comprises a time service signal input end 123, the time service signal input end 123 comprises a pulse signal input end 1231 and a serial port signal input end 1232, and the serial port signal input end 1232 is connected with a serial port signal output end 1113 and is used for receiving serial port signals; the pulse signal input terminal 1231 is connected to the pulse signal output terminal 1112 for receiving the pulse signal.
The second processor 122 performs the following operations:
the first local time from the camera 12 when the pulse signal is received is recorded. When the pulse signal output terminal 1112 outputs the pulse signal according to the serial signal analysis, the main camera 11 corresponds to the first moment. The second local time from the camera 12 when the first time is acquired is recorded. The second time corresponding to the second local time is determined by the master camera 11 according to the first local time, the second local time, and the first time. According to the second time instant, the current local time instant from the camera 12 is updated to coincide with the second time instant. In this way, the local times of the slave camera 12 and the master camera 11 are kept consistent.
Alternatively, the local time of the slave camera 12 and the master camera 11 may be kept consistent by the target edge of the pulse signal. The target edge may comprise a rising or falling edge of the pulse signal. The second processor 122 may generate a first interrupt signal at the target edge of the received pulse signal through the pulse signal input terminal 1231, and the second processor 122 may obtain the accurate local time when the target edge occurs by recording the time of the first interrupt signal, that is, obtain the first local time, so as to effectively ensure the reliability of the first local time. The second processor 122 may obtain the first time of the target edge by parsing the GPRMC data or GPGGA data. The second processor 122 may obtain the first time corresponding to the rising edge by parsing the GPRMC data or the GPGGA data when the target edge is the rising edge, and may obtain the first time corresponding to the falling edge by parsing the GPRMC data or the GPGGA data when the target edge is the falling edge.
In the above embodiment, the second processor 122 may further include a calculation module configured to calculate the difference between the second local time and the first local time, and to add this difference to the first time to obtain the second time. Assuming that the first local time is T1, the first time is T2, and the second local time is T3, the second processor 122 needs to determine the second time T4 corresponding to the second local time T3. In some embodiments, the difference between the first local time T1 and the second local time T3, both recorded against the local clock of the second processor 122 before the update, may be taken as the difference between the first time T2 and the second time T4. The calculation module may thus obtain the second time T4 as the sum of the first time T2 and the difference between the second local time T3 and the first local time T1, i.e. T4 = T2 + (T3 - T1). In other embodiments, since there may be a certain error between the pre-update local clock of the second processor 122 and the clock of the master camera 11, the difference (T3 - T1) may first be calibrated and then combined with the first time T2 to calculate the second time T4. The calibration may multiply the difference by a weight, or add or subtract a calibration value obtained from testing; the disclosure is not limited in this respect.
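The T4 = T2 + (T3 - T1) correction above can be sketched in a few lines of Python. The function and variable names are hypothetical; this is a sketch of the stated arithmetic, not the patent's implementation.

```python
def corrected_master_time(t1_local: float, t2_master: float, t3_local: float) -> float:
    """T4 = T2 + (T3 - T1): the master-clock time corresponding to local time T3.

    t1_local:  slave's local time when the pulse edge arrived (T1)
    t2_master: master time of that edge, parsed from the serial sentence (T2)
    t3_local:  slave's local time when parsing of T2 finished (T3)
    """
    return t2_master + (t3_local - t1_local)

# Edge seen at local 100.000 s; serial data says the edge occurred at
# master time 250.000 s; parsing finished at local 100.040 s.
t4 = corrected_master_time(100.000, 250.000, 100.040)
print(round(t4, 3))  # 250.04
```

The slave's clock rate is assumed equal to the master's over the short T1-to-T3 interval, which is why the local elapsed time (T3 - T1) can be added directly to the master time T2.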
Further, since the timing module 111 continuously sends the pulse signal to the second processor 122 at a certain frequency, in fact, in some cases, when the error of the local time of the second processor 122 is within the allowable range, the local time update may not be performed, so that the resource waste of the second processor 122 may be reduced. Therefore, the second processor 122 may further consider that the error of the local time currently used by the second processor 122 exceeds the allowable range when the difference between the second local time T3 and the second time T4 is greater than the preset threshold, and thus update the local time according to the second time T4. In some embodiments, the second processor 122 may also be configured to update the current local time from the camera 12 based on the set time period. For example, the time period may be 1min,3min,5min, etc., thereby reducing the resource waste of the second processor 122.
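The two update policies described above, correcting only when the drift exceeds a threshold or correcting on a fixed period, can be combined in a small decision function. This is a sketch under assumed names; the threshold and period values are illustrative, not from the patent.

```python
def should_update_clock(t3_local: float, t4_master: float,
                        seconds_since_update: float,
                        drift_threshold: float = 0.005,
                        period: float = 60.0) -> bool:
    """Return True when the slave's local clock should be rewritten to T4.

    Updates when |T3 - T4| exceeds the allowed drift, or when the
    periodic deadline has passed, whichever comes first.
    """
    drift = abs(t3_local - t4_master)
    return drift > drift_threshold or seconds_since_update >= period

print(should_update_clock(100.040, 100.041, 10.0))  # False: 1 ms drift, period not due
print(should_update_clock(100.040, 100.050, 10.0))  # True: 10 ms drift exceeds 5 ms
print(should_update_clock(100.040, 100.041, 60.0))  # True: periodic update due
```

Skipping the write when neither condition holds is what saves processor resources, since the PPS pulses keep arriving regardless.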
Still referring to fig. 3, the multi-sensor fusion system further includes a trigger module 13 disposed in the image capturing device 10, where the trigger module 13 is electrically connected to the plurality of cameras. The trigger module 13 may be electrically connected to both the master camera 11 and the slave camera 12, and may be provided at the master camera 11, at the slave camera 12, or separately.
The trigger module 13 includes a trigger signal generating module 131 and a trigger signal output end 132, where the trigger signal generating module 131 is configured to generate a first trigger signal, and the first trigger signal may be used to trigger the multiple cameras to perform an exposure operation. The trigger signal output end 132 is connected to the trigger signal generating module 131, and is configured to output a first trigger signal. The triggering module 13 is configured to output a first triggering signal to at least one of the cameras to perform an exposure operation when local times of the plurality of cameras are updated to be consistent, so that the image capturing device performs an image capturing operation to obtain image information. In some embodiments, the triggering module 13 may expose multiple cameras or part of cameras at the same time, and may be specifically selected according to requirements.
The first trigger signal output from the trigger signal output end 132 may be output to the master camera 11, which, upon receiving it, performs the exposure operation to obtain corresponding image information; similarly, when the first trigger signal generated by the trigger signal generating module 131 is sent to the slave camera 12, it triggers the slave camera 12 to perform the exposure operation and obtain corresponding image information. In this way, the master camera 11 and the slave camera 12 can be triggered simultaneously by the first trigger signal output from the trigger signal output end 132, which reduces errors between the trigger moments of the plurality of cameras, facilitates fusing image information captured at the same moment, and improves fusion precision.
In the above embodiments, the trigger signal generation module 131 further includes a first input 1311, where the first input 1311 is connected to the timing signal output 1111. The first input end 1311 may be specifically connected to the pulse signal output end 1112 of the time service signal output end 1111, and configured to receive the pulse signal, where the trigger signal generating module 131 generates the first trigger signal according to the pulse signal, and the pulse signal output by the pulse signal output end 1112 is described above and will not be described herein.
The trigger signal output terminal 132 may include a first output terminal 1321 and a second output terminal 1322, where the first output terminal 1321 is connected to the main camera 11, and the first output terminal 1321 is configured to output a first trigger signal to the main camera 11. The second output terminal 1322 is connected to the slave camera 12, and the second output terminal 1322 is configured to output the first trigger signal to the slave camera 12. In some embodiments, the master camera 11 and the slave camera 12 may each include one or more of a depth lens 113, an RGB lens 114, a wide angle lens, a tele lens, and the like, which the present disclosure is not limited to.
The main camera 11 is described as including the depth lens 113 and the RGB lens 114. The first trigger signal generated by the trigger signal generating module 131 is output to the depth lens 113 and the RGB lens 114 of the main camera 11, and the first processor 112 receives the image information acquired by the depth lens 113 and the image information acquired by the RGB lens 114. Therefore, the error of the trigger time among a plurality of lenses included in the same camera is reduced, and meanwhile, the error of the trigger time among a plurality of lenses included in different cameras can be reduced. The trigger signal output 132 and the trigger signal generating module 131 may be integrated into one integrated module, or may be separate two modules, which is not limited by the present disclosure.
Alternatively, the first trigger signal may include a synchronous frequency-multiplied pulse signal; specifically, the trigger signal generating module 131 may generate, based on the received pulse signal, a synchronous frequency-multiplied pulse signal corresponding to it, for example a synchronous high-frequency signal of 20 Hz or 30 Hz. The trigger signal generating module 131 may be an FPGA (Field Programmable Gate Array) module, or any other circuit module capable of generating such a signal; the disclosure is not limited in this respect. Compared with schemes in which a free-running camera takes the first rising edge as its trigger moment, triggering the depth lens 113 and the RGB lens 114 with the synchronous frequency-multiplied pulse signal allows both to be triggered on a chosen rising or falling edge of that signal, thereby realizing phase control.
The first processor 112 records a time stamp as follows: when the first trigger signal is the synchronous frequency-multiplied pulse signal, that signal is connected to the second input end 115 of the first processor 112; when a trigger edge (rising or falling) of the signal is received, a second interrupt signal is generated, the local time corresponding to the second interrupt signal is read, and that local time is recorded as the time stamp of the image information. The manner in which the second processor 122 records the time stamp of its image information is similar and is not repeated here.
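The edge-latched timestamping described above can be sketched as follows. The class and method names are hypothetical, and the interrupt flow is simulated in plain Python rather than driver code.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    seq: int
    timestamp: float  # local time latched at the trigger edge

class TriggerTimestamper:
    """Latch the local clock in the trigger-edge interrupt handler, then
    attach the latched time to the frame the triggered exposure produces."""

    def __init__(self) -> None:
        self._seq = 0
        self._latched: float | None = None

    def on_trigger_edge(self, local_time: float) -> None:
        # Corresponds to the second interrupt signal: read and store
        # the local time at the moment the edge is seen.
        self._latched = local_time

    def on_frame_ready(self) -> Frame:
        if self._latched is None:
            raise RuntimeError("frame arrived without a trigger edge")
        self._seq += 1
        frame = Frame(self._seq, self._latched)
        self._latched = None
        return frame

ts = TriggerTimestamper()
ts.on_trigger_edge(12.345)   # interrupt fires on the trigger edge
frame = ts.on_frame_ready()  # exposure completes, stamp is attached
print(frame.seq, frame.timestamp)  # 1 12.345
```

Latching the time in the interrupt, rather than when the frame data arrives, is what keeps the stamp tied to the exposure moment rather than to readout latency.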
As shown in fig. 4, the multi-sensor fusion system may further include a host 20, where the host 20 may be electrically connected to the image capturing device 10, and the host 20 is configured to receive the image information obtained by the image capturing device 10 and fuse the image information according to a time stamp when the image capturing device 10 performs the image capturing operation. The host 20 may be electrically connected to each of the plurality of cameras in the image capturing apparatus 10, and in particular may be connected to the first processor 112 of the master camera 11 and the second processor 122 of the slave camera 12 to receive image information captured by the first processor 112 and the second processor 122.
The host 20 may be communicatively connected with the master camera 11 and the slave camera 12 respectively; for example, in the embodiment provided in the present disclosure, the connection may be made through a USB data line, while in other embodiments the communication between the host 20 and the master camera 11 and slave camera 12 may also be implemented wirelessly. The embodiments in which the master camera 11 and the slave camera 12 record the time stamps of the image information are detailed in the foregoing examples. In this way, once the local times of the master camera 11 and the slave camera 12 are aligned and the cameras are triggered with the same time stamp, image information bearing the same time stamp can be fused, improving fusion precision.
In some alternative embodiments, still referring to fig. 4, the multi-sensor fusion system may include a lidar device 15, where the lidar device 15 is electrically connected to the image capturing device 10 and the host 20. The time service module 111 of the image capturing device 10 outputs the time service signal and a second trigger signal to the lidar device 15, so that the lidar device 15 updates its local time to be consistent with the local time of the image capturing device 10 according to the time service signal, and performs a ranging operation according to the second trigger signal to obtain distance information. Specifically, the lidar device 15 may be connected to the pulse signal output terminal 1112 and the serial signal output terminal 1113 of the time service module 111, so that the time service module 111 can be used to update the local time of the lidar device 15; the updating manner may refer to the embodiment in which the time service module 111 updates the local time of the slave camera 12 and is not described here again. The second trigger signal may be the pulse signal output by the pulse signal output terminal 1112, and the lidar device 15 performs the ranging operation to obtain the distance information upon receiving that pulse signal.
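The pulse-plus-serial update that the specification describes for the slave camera, and that the lidar device reuses here, can be sketched as follows. The class names and the settable-clock abstraction are hypothetical; the arithmetic (second time = first time + local time elapsed since the pulse) follows the scheme described in the specification.

```python
# Hypothetical sketch of the time service update on the receiving side
# (slave camera or lidar device).

class SettableClock:
    """A minimal local clock that can be advanced and set."""
    def __init__(self, start):
        self._t = start
    def now(self):
        return self._t
    def advance(self, dt):
        self._t += dt
    def set(self, t):
        self._t = t

class TimeServiceClient:
    def __init__(self, clock):
        self.clock = clock
        self._pulse_local = None  # first local time, captured at the pulse edge

    def on_pulse_edge(self):
        # Record the first local time the instant the pulse signal arrives.
        self._pulse_local = self.clock.now()

    def on_serial_message(self, master_time_at_pulse):
        # Second local time: captured when the serial message is parsed.
        local2 = self.clock.now()
        if self._pulse_local is None:
            return
        # Master time corresponding to local2 = master time at the pulse
        # plus the local time elapsed since the pulse.
        self.clock.set(master_time_at_pulse + (local2 - self._pulse_local))
```

For example, if the local clock reads 100.0 at the pulse, the serial message arrives 0.2 s later stating that the master's time at the pulse was 9.5, the local clock is set to 9.7, consistent with the master.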
The host 20 fuses the distance information and the image information according to their time stamps after the lidar device 15 has updated its local time. This improves the content richness of the fused image.
In some embodiments, the ranging operation performed by the lidar device 15 takes place at the same time as the image acquisition operation performed by the image acquisition device 10. That is, the moment at which the trigger signal output terminal 132 outputs the first trigger signal to expose the plurality of cameras is the same as the moment at which the time service module 111 outputs the second trigger signal to the lidar device to trigger the ranging operation. Because the time service module 111 updates the time of the lidar device 15, and the multiple cameras and the lidar device 15 are triggered simultaneously, the host 20 can fuse the image information and the distance information based on the same time stamp, improving the fusion accuracy.
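Under this simultaneous-trigger scheme, image and distance records share exact time stamps, so host-side fusion reduces to a key join. A minimal sketch (function name hypothetical):

```python
# Hypothetical sketch: cameras and lidar fire on the same trigger
# instant, so their records can be joined on identical time stamps.

def fuse_by_stamp(images, distances):
    """images, distances: dicts mapping timestamp -> payload."""
    return {
        t: {"image": images[t], "distance": distances[t]}
        for t in images.keys() & distances.keys()
    }
```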
It should be noted that the embodiment shown in fig. 4 is merely illustrative, and in other embodiments, the multi-sensor fusion system may further include other sensors, such as a microphone module or an IMU sensor, which is not limited in this disclosure. Each type of sensor may include one or more, and this disclosure is not limited in this regard.
Based on the technical solution of the present disclosure, there is further provided an autonomous mobile device, which may include the multi-sensor fusion system described in any of the foregoing embodiments, and the autonomous mobile device may include an autonomous vehicle or an unmanned aerial vehicle, which is not limited in this disclosure.
Based on the technical solution of the present disclosure, there is further provided a camera. The camera includes a time service module 111, the time service module 111 includes a time service signal output end 1111, and the time service signal output end 1111 is configured to output a time service signal to an external device, so that the external device updates its local time according to the time service signal.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. A multi-sensor fusion system, comprising: an image acquisition device, the image acquisition device comprising a plurality of cameras, the plurality of cameras comprising:
at least one master camera, the at least one master camera being electrically connected to each other; the master camera comprises a time service module, the time service module comprises a time service signal output end, and the time service signal output end is used for outputting a time service signal;
at least one slave camera electrically connected with each master camera;
wherein the time service module of one master camera outputs the time service signal to the other master cameras and each slave camera, so that the other master cameras and the at least one slave camera update their local time to be consistent with the local time of the master camera outputting the time service signal.
2. The multi-sensor fusion system of claim 1, wherein there is one master camera; the time service module of the master camera outputs the time service signal to each slave camera, so that the at least one slave camera updates its local time to be consistent with the local time of the master camera according to the time service signal.
3. The multi-sensor fusion system of claim 2, wherein the slave camera comprises a time service signal input end connected to the time service signal output end for receiving the time service signal.
4. The multi-sensor fusion system of claim 3, wherein the master camera comprises a first processor, the first processor comprises the time service module, the time service signal output end comprises a pulse signal output end and a serial port signal output end, the pulse signal output end is used for outputting a pulse signal, and the serial port signal output end is used for outputting a serial port signal; the time service signal comprises the pulse signal and the serial port signal;
the slave camera comprises a second processor, the second processor comprises a time service signal input end, the time service signal input end comprises a pulse signal input end and a serial port signal input end, and the serial port signal input end is connected with the serial port signal output end and is used for receiving the serial port signal; the pulse signal input end is connected with the pulse signal output end and is used for receiving the pulse signal.
5. The multi-sensor fusion system of claim 4, wherein the second processor is configured to:
record a first local time of the slave camera when the pulse signal is received;
parse the serial port signal to obtain a first time of the master camera at which the pulse signal output end outputs the pulse signal;
record a second local time of the slave camera when the first time is obtained;
determine, according to the first local time, the second local time, and the first time, a second time of the master camera corresponding to the second local time; and
update the current local time of the slave camera to be consistent with the second time.
6. The multi-sensor fusion system of claim 5, wherein the second processor comprises a computing module configured to:
calculate a difference between the second local time and the first local time; and
calculate the sum of the difference and the first time to obtain the second time.
7. The multi-sensor fusion system of claim 5, wherein the second processor is configured to update the current local time of the slave camera when a difference between the second local time and the second time is greater than a preset threshold; or
the second processor is configured to update the current local time of the slave camera according to a set time period.
8. The multi-sensor fusion system of claim 1, further comprising a trigger module disposed within the image acquisition device, the trigger module electrically connected to the plurality of cameras;
the trigger module comprises a trigger signal generation module and a trigger signal output end, and the trigger signal generation module is used for generating a first trigger signal; the trigger signal output end is connected with the trigger signal generation module and is used for outputting the first trigger signal;
wherein the trigger module is configured to output the first trigger signal to at least one camera to perform an exposure operation when the local times of the plurality of cameras are consistently updated, so that the image acquisition device performs an image acquisition operation to obtain image information.
9. The multi-sensor fusion system of claim 8, wherein the trigger signal generation module comprises a first input end connected to the time service signal output end;
the trigger signal output terminal includes:
the first output end is connected with the master camera and is used for outputting the first trigger signal to the master camera;
the second output end is connected with the slave camera and is used for outputting the first trigger signal to the slave camera.
10. The multi-sensor fusion system of claim 8, further comprising:
the host is electrically connected with the image acquisition device and is used for receiving the image information acquired by the image acquisition device and processing the image information according to the time stamp fusion when the image acquisition device executes the image acquisition operation.
11. The multi-sensor fusion system of claim 10, further comprising:
the laser radar device is electrically connected with the image acquisition device and the host, and outputs the time service signal and a second trigger signal to the laser radar device through a time service module of the image acquisition device so that the laser radar device updates the local time to be consistent with the local time of the image acquisition device according to the time service signal, and performs ranging operation according to the second trigger signal to obtain distance information;
and the host machine fuses and processes the distance information and the image information according to the time stamp after the local time is updated by the laser radar device.
12. The multi-sensor fusion system of claim 11, wherein the lidar device performs ranging operations at the same time as the image acquisition device performs image acquisition operations.
13. An autonomous mobile device, comprising:
the multi-sensor fusion system of any one of claims 1-12.
14. A camera, comprising: a time service module, wherein the time service module comprises a time service signal output end, and the time service signal output end is used for outputting a time service signal to an external device so that the external device updates its local time according to the time service signal.
CN202210760513.7A 2022-06-29 2022-06-29 Camera, multi-sensor fusion system and autonomous mobile device Pending CN117394938A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210760513.7A CN117394938A (en) 2022-06-29 2022-06-29 Camera, multi-sensor fusion system and autonomous mobile device

Publications (1)

Publication Number Publication Date
CN117394938A true CN117394938A (en) 2024-01-12

Family

ID=89436025

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210760513.7A Pending CN117394938A (en) 2022-06-29 2022-06-29 Camera, multi-sensor fusion system and autonomous mobile device

Country Status (1)

Country Link
CN (1) CN117394938A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination