CN113132551A - Synchronous control method and device of multi-camera system and electronic equipment - Google Patents


Info

Publication number
CN113132551A
Authority
CN
China
Prior art keywords
camera module
coordinate system
light spot
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911398576.7A
Other languages
Chinese (zh)
Other versions
CN113132551B (en)
Inventor
王正
章炳刚
周劲蕾
田新蕾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Original Assignee
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Sunny Optical Intelligent Technology Co Ltd filed Critical Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority to CN201911398576.7A
Publication of CN113132551A
Application granted
Publication of CN113132551B
Active legal status
Anticipated expiration legal status

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to a synchronous control method and a synchronous control device of a multi-camera system and an electronic device. The synchronous control method realizes self-adaptive synchronous regulation through closed-loop feedback of the video stream synchronization result so as to obtain higher synchronization precision.

Description

Synchronous control method and device of multi-camera system and electronic equipment
Technical Field
The application relates to the field of camera modules, in particular to a synchronous control method, a synchronous control device and electronic equipment of a multi-camera system, which can enable a plurality of camera modules in the multi-camera system to be exposed simultaneously so as to ensure that video streams collected by the plurality of camera modules are synchronous.
Background
With the development of computer vision technology, optical three-dimensional measurement technology has gradually matured and now appears in application scenarios such as gesture control, 3D modeling, automotive radar, and robot vision systems. In these application scenarios, different cameras are used to acquire depth images and/or color images and/or infrared images. For example, multi-camera systems composed of cameras of the same or different types, such as a binocular camera (RGB camera + RGB camera), a TOF camera + RGB camera, an IR camera + RGB camera, and the like, are used to acquire corresponding image data and fuse the different image data.
Multi-camera systems typically require synchronization of image data acquired by the various cameras for processing by back-end applications. Currently, a multi-camera system uses a hardware synchronization method to realize synchronization control. However, this hardware synchronization method can perform only rough synchronization and is not accurate enough.
Accordingly, there is a need for an improved synchronization control scheme for a multi-camera system.
Disclosure of Invention
The present application provides a synchronous control method, a synchronous control device and an electronic device of a multi-camera system, which can enable a plurality of camera modules to be exposed simultaneously to ensure that video streams collected by the plurality of camera modules are synchronous.
Another object of the present application is to provide a synchronous control method, a synchronous control device and an electronic apparatus for a multi-camera system, wherein a plurality of camera modules are calibrated synchronously by the synchronous control method, so as to obtain high synchronization precision.
Another object of the present application is to provide a synchronization control method, a synchronization control apparatus and an electronic device for a multi-camera system, wherein the synchronization control method implements adaptive synchronization adjustment through closed-loop feedback of video stream synchronization results to obtain higher synchronization accuracy.
In order to achieve at least one of the above objects, the present application provides a synchronous control method for a multi-camera system, wherein the multi-camera system includes a plurality of camera modules, and the method includes:
sending a synchronization signal to the plurality of camera modules to start the plurality of camera modules simultaneously, wherein the plurality of camera modules are configured to acquire, at the same frame rate, images of a light spot located on a line parallel to the line connecting the center points of the plurality of camera modules, and the light spot moves along the parallel line at a specific speed;
respectively extracting light spot images respectively acquired by the plurality of camera modules under the same frame number;
obtaining, based on the correspondence among the spatial coordinate systems set by the plurality of camera modules, a series of displacement differences between the light spots in the light spot images acquired by the plurality of camera modules under the same frame number;
acquiring an average value of the series of displacement differences, and obtaining a corresponding time difference based on the moving speed of the light spot and the average value of the displacement differences; and
in response to the time difference meeting a preset time precision, saving time sequence configuration parameters of the plurality of camera modules, wherein the time sequence configuration parameters are used for controlling the starting and exposure of the camera modules.
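The last two steps above are simple arithmetic: average the per-frame displacement differences and divide by the spot's known speed to recover the time difference. A minimal sketch, where the function name and the millimetre/second units are illustrative assumptions rather than the patent's notation:

```python
# Illustrative sketch of the displacement-to-time-difference step; the
# function name and the mm / mm-per-second units are assumptions, not
# the patent's notation.

def time_difference(displacement_diffs_mm, spot_speed_mm_per_s):
    """Average the per-frame displacement differences between same-frame
    light spots and convert to a time difference via the spot's speed."""
    avg_mm = sum(displacement_diffs_mm) / len(displacement_diffs_mm)
    return avg_mm / spot_speed_mm_per_s

# spots lag by ~2 mm per frame while the spot moves at 100 mm/s
dt = time_difference([1.9, 2.0, 2.1, 2.0], 100.0)  # ~0.02 s
```

If the resulting time difference meets the preset precision, the current timing configuration would be saved; otherwise it drives another adjustment round.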
In one or more embodiments of the present application, the method further comprises:
in response to the time difference being larger than the preset time precision, adjusting the time sequence configuration parameters of the plurality of camera modules based on the time difference;
starting a new round of the synchronous control process until the time difference in the new round meets the preset time precision; and
in response to the time difference meeting the preset time precision, saving the time sequence configuration parameters of the plurality of camera modules.
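The adjust-and-retry steps above amount to a small closed-loop controller. The sketch below is illustrative only: the names, the 80% correction factor, and the toy measurement model are assumptions, not the patent's implementation.

```python
# Hedged sketch of the closed-loop feedback: measure the time difference,
# and while it exceeds the preset precision, adjust the timing
# configuration and run a new round. On success the time sequence
# configuration parameters would be saved.

def synchronize(measure_dt, adjust_timing, precision_s, max_rounds=10):
    """Run measurement rounds until |dt| meets the preset precision."""
    dt = measure_dt()
    for _ in range(max_rounds):
        if abs(dt) <= precision_s:
            return True, dt   # point at which the parameters are saved
        adjust_timing(dt)     # shift the lagging module's start timing
        dt = measure_dt()
    return False, dt

# toy model: each adjustment removes 80% of the residual offset
state = {"offset_s": 0.005}
ok, final_dt = synchronize(
    lambda: state["offset_s"],
    lambda dt: state.update(offset_s=state["offset_s"] - 0.8 * dt),
    precision_s=0.0005,
)
```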
In one or more embodiments of the present application, the plurality of camera modules include a first camera module and a second camera module, wherein the first camera module is selected from any one of an infrared camera module, an RGB camera module, and a TOF camera module, and the second camera module is selected from any one of an infrared camera module, an RGB camera module, and a TOF camera module.
In one or more embodiments of the present application, obtaining a series of displacement differences between light spots in the light spot images acquired by the plurality of camera modules under the same frame number based on a correspondence between spatial coordinate systems set by the plurality of camera modules includes:
converting the coordinates, under the second spatial coordinate system, of the light spots in the light spot images acquired by the second camera module under the same frame number into corresponding coordinates under the first spatial coordinate system, based on the correspondence between the first spatial coordinate system set by the first camera module and the second spatial coordinate system set by the second camera module; and
obtaining a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number, based on the coordinates, under the first spatial coordinate system, of the light spots in the light spot images acquired by the first camera module and the corresponding coordinates, under the first spatial coordinate system, of the light spots in the light spot images acquired by the second camera module.
In one or more embodiments of the present application, obtaining a series of displacement differences between light spots in the light spot images acquired by the plurality of camera modules under the same frame number based on a correspondence between spatial coordinate systems set by the plurality of camera modules includes:
converting the coordinates, under the first spatial coordinate system, of the light spots in the light spot images acquired by the first camera module under the same frame number into corresponding coordinates under the second spatial coordinate system, based on the correspondence between the first spatial coordinate system set by the first camera module and the second spatial coordinate system set by the second camera module; and
obtaining a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number, based on the coordinates, under the second spatial coordinate system, of the light spots in the light spot images acquired by the second camera module and the corresponding coordinates, under the second spatial coordinate system, of the light spots in the light spot images acquired by the first camera module.
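One common way to realize the "corresponding relationship" between the two spatial coordinate systems is a rigid transform (R, t) obtained from stereo calibration. The patent does not fix the form of the correspondence, so the transform below is our assumption, as are the function names:

```python
import numpy as np

# Sketch of the coordinate conversion, assuming the correspondence
# between the two modules' spatial coordinate systems is a rigid
# transform (R, t) from calibration (an assumption; the patent only
# speaks of a "corresponding relationship").

def to_first_frame(p_second, R, t):
    """Map a 3D spot coordinate from the second module's spatial
    coordinate system into the first module's."""
    return R @ np.asarray(p_second, dtype=float) + t

def displacement_differences(spots_first, spots_second, R, t):
    """Per-frame displacement differences between same-frame spots,
    both expressed in the first module's coordinate system."""
    return [float(np.linalg.norm(np.asarray(p1, dtype=float)
                                 - to_first_frame(p2, R, t)))
            for p1, p2 in zip(spots_first, spots_second)]
```

With perfect synchronization every per-frame difference would be near 0; a consistent nonzero value reflects the residual exposure offset.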
In one or more embodiments of the present application, the method further comprises:
reading pixel coordinates of the light spots in the light spot images acquired by the second camera module under a second pixel coordinate system set by the second camera module; and
obtaining the coordinates, under the second spatial coordinate system, of the light spots in the light spot images acquired by the second camera module, based on the correspondence between the second pixel coordinate system set by the second camera module and the second spatial coordinate system.
In one or more embodiments of the present application, the method further comprises:
reading pixel coordinates of the light spots in the light spot images acquired by the first camera module under a first pixel coordinate system set by the first camera module; and
obtaining the coordinates, under the first spatial coordinate system, of the light spots in the light spot images acquired by the first camera module, based on the correspondence between the first pixel coordinate system set by the first camera module and the first spatial coordinate system.
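The pixel-to-spatial-coordinate correspondence can be realized with a pinhole camera model, given intrinsics (fx, fy, cx, cy) and a known spot depth. The patent only requires that some such correspondence exist; the pinhole model and the names below are our assumptions:

```python
import numpy as np

# Sketch of the pixel-to-spatial-coordinate step, assuming a pinhole
# model with intrinsics (fx, fy, cx, cy) and a known spot depth; the
# model choice is an assumption, not the patent's specification.

def pixel_to_space(u, v, depth, fx, fy, cx, cy):
    """Back-project the spot's pixel coordinates (u, v), at the given
    depth, into the camera module's spatial coordinate system."""
    return np.array([(u - cx) * depth / fx,
                     (v - cy) * depth / fy,
                     depth])
```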
According to another aspect of the present application, there is also provided a device for synchronously controlling a plurality of camera modules, which comprises:
a synchronization unit, configured to send a synchronization signal to the plurality of camera modules so as to start the camera modules at the same time, wherein the camera modules are arranged to acquire, at the same frame rate, images of a light spot located on a line parallel to the line connecting the center points of the camera modules, and the light spot moves along the parallel line at a specific speed;
the extraction unit is used for respectively extracting the light spot images respectively collected by the plurality of camera modules under the same frame number;
a displacement difference obtaining unit, configured to obtain a series of displacement differences between the light spots in the light spot images acquired by the plurality of camera modules under the same frame number, based on a correspondence between spatial coordinate systems set by the plurality of camera modules;
the time difference acquisition unit is used for acquiring the average value of the series of displacement differences and acquiring the corresponding time difference based on the moving speed of the light spot and the average value of the displacement differences; and
and the synchronization determining unit is used for responding to the time difference meeting the preset time precision and storing the time sequence configuration parameters of the plurality of camera modules, wherein the time sequence configuration parameters are used for controlling the starting and the exposure of the camera modules.
In one or more embodiments of the present application, the synchronization determining unit is further configured to:
in response to the time difference being larger than the preset time precision, adjusting the time sequence configuration parameters of the plurality of camera modules based on the time difference;
starting a new round of the synchronous control process until the time difference in the new round meets the preset time precision; and
in response to the time difference meeting the preset time precision, saving the time sequence configuration parameters of the plurality of camera modules.
In one or more embodiments of the present application, the plurality of camera modules include a first camera module and a second camera module, wherein the first camera module is selected from any one of an infrared camera module, an RGB camera module, and a TOF camera module, and the second camera module is selected from any one of an infrared camera module, an RGB camera module, and a TOF camera module.
In one or more embodiments of the present application, the displacement difference obtaining unit is further configured to:
converting the coordinates, under the second spatial coordinate system, of the light spots in the light spot images acquired by the second camera module under the same frame number into corresponding coordinates under the first spatial coordinate system, based on the correspondence between the first spatial coordinate system set by the first camera module and the second spatial coordinate system set by the second camera module; and
obtaining a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number, based on the coordinates, under the first spatial coordinate system, of the light spots in the light spot images acquired by the first camera module and the corresponding coordinates, under the first spatial coordinate system, of the light spots in the light spot images acquired by the second camera module.
In one or more embodiments of the present application, the displacement difference obtaining unit is further configured to:
converting the coordinates, under the first spatial coordinate system, of the light spots in the light spot images acquired by the first camera module under the same frame number into corresponding coordinates under the second spatial coordinate system, based on the correspondence between the first spatial coordinate system set by the first camera module and the second spatial coordinate system set by the second camera module; and
obtaining a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number, based on the coordinates, under the second spatial coordinate system, of the light spots in the light spot images acquired by the second camera module and the corresponding coordinates, under the second spatial coordinate system, of the light spots in the light spot images acquired by the first camera module.
In one or more embodiments of the present application, the displacement difference obtaining unit is further configured to:
reading pixel coordinates of the light spots in the light spot images acquired by the second camera module under a second pixel coordinate system set by the second camera module; and
obtaining the coordinates, under the second spatial coordinate system, of the light spots in the light spot images acquired by the second camera module, based on the correspondence between the second pixel coordinate system set by the second camera module and the second spatial coordinate system.
In one or more embodiments of the present application, the displacement difference obtaining unit is further configured to:
reading pixel coordinates of the light spots in the light spot images acquired by the first camera module under a first pixel coordinate system set by the first camera module; and
obtaining the coordinates, under the first spatial coordinate system, of the light spots in the light spot images acquired by the first camera module, based on the correspondence between the first pixel coordinate system set by the first camera module and the first spatial coordinate system.
According to another aspect of the present application, there is also provided an electronic device comprising a processor and a memory, wherein computer program instructions are stored in the memory, which, when executed by the processor, cause the processor to perform the synchronization control method as described above.
Further objects and advantages of the present application will become apparent from an understanding of the ensuing description and drawings.
These and other objects, features and advantages of the present application will become more fully apparent from the following detailed description, the accompanying drawings and the claims.
Drawings
Fig. 1 illustrates an effect diagram of controlling synchronization of a multi-camera system by a hardware synchronization method in the prior art.
Fig. 2 illustrates a flow diagram of a synchronization control method of a multi-camera system according to an embodiment of the present application.
Fig. 3 illustrates a schematic diagram of hardware synchronization of a plurality of camera modules by a main control chip according to an embodiment of the present application.
FIG. 4 illustrates a schematic diagram of a synchronous control system according to an embodiment of the present application.
Fig. 5 illustrates a timing diagram of the synchronous control of the plurality of camera modules after being processed by the synchronous control method according to the embodiment of the present application.
Fig. 6 illustrates another flow chart of the synchronization control method according to the embodiment of the application.
Fig. 7 illustrates a block diagram schematic diagram of a synchronization control apparatus for a plurality of camera modules according to an embodiment of the present application.
FIG. 8 illustrates a schematic diagram of an electronic device according to an embodiment of the application.
Detailed Description
The following description is presented to disclose the application and to enable any person skilled in the art to practice the application. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The underlying principles of the application, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the application.
It will be understood by those skilled in the art that in the present disclosure, the terms "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced devices or components must be in a particular orientation, constructed and operated in a particular orientation, and thus the above terms are not to be considered limiting of the present application.
It is understood that the term "a" or "an" should be interpreted as "at least one" or "one or more"; that is, in one embodiment the number of an element may be one, while in another embodiment the number of that element may be plural, and the term "a" or "an" should not be interpreted as limiting the number.
Summary of the application
As mentioned above, multi-camera vision systems typically require synchronization of image data acquired by the various cameras to facilitate processing by backend applications. Currently, a multi-camera system uses a hardware synchronization method to realize synchronization control. However, this hardware synchronization method can perform only rough synchronization and is not accurate enough.
Specifically, the reason why the hardware synchronization method can achieve only coarse synchronization is that: the preparation time from the reception of the synchronization signal to the start of exposure varies from camera to camera. That is, even if the times of the synchronization signals received by the different cameras are identical, the image data acquired by the different cameras are not synchronized due to the inconsistency of the preparation times of the cameras. Fig. 1 illustrates an effect diagram of controlling synchronization of a multi-camera system by a hardware synchronization method in the prior art. As shown in fig. 1, the multi-camera system includes a first camera module and a second camera module, and a time difference Δ t exists between image data collected by the first camera module and the second camera module due to the fact that the first camera module and the second camera module are inconsistent in waiting time from receiving a synchronization signal to starting exposure.
In view of the above technical problems, a basic idea of the present application is to provide a synchronization control scheme for a multi-camera system, which achieves adaptive synchronization adjustment by closed-loop feedback of video stream synchronization results acquired by a plurality of camera modules to obtain higher synchronization accuracy.
Based on this, the present application provides a synchronous control method for a multi-camera system, which first sends a synchronization signal to the plurality of camera modules to start them simultaneously, wherein the plurality of camera modules are configured to acquire, at the same frame rate, images of a light spot located on a line parallel to the line connecting the center points of the plurality of camera modules, the light spot moving along the parallel line at a specific speed; further, light spot images respectively acquired by the plurality of camera modules under the same frame number are respectively extracted; then, based on the correspondence among the spatial coordinate systems set by the plurality of camera modules, a series of displacement differences between the light spots in the light spot images acquired by the plurality of camera modules under the same frame number are obtained; then, an average value of the series of displacement differences is obtained, and a corresponding time difference is obtained based on the moving speed of the light spot and the average value of the displacement differences; and then, in response to the time difference meeting a preset time precision, the time sequence configuration parameters of the plurality of camera modules are saved, wherein the time sequence configuration parameters are used for controlling the starting and exposure of the camera modules. Therefore, adaptive synchronization adjustment is realized through closed-loop feedback of the synchronization result of the video streams acquired by the plurality of camera modules so as to obtain higher synchronization accuracy.
Having described the general principles of the present application, various non-limiting embodiments of the present application will now be described with reference to the accompanying drawings.
Exemplary synchronization control method
Fig. 2 illustrates a flow diagram of a synchronization control method of a multi-camera system according to an embodiment of the present application. As shown in fig. 2, the synchronization control method includes: S210, sending a synchronization signal to the plurality of camera modules 100 to start the plurality of camera modules 100 at the same time, wherein the plurality of camera modules 100 are configured to acquire, at the same frame rate, images of a light spot located on a line parallel to the line connecting the center points of the plurality of camera modules 100, and the light spot 20 moves along the parallel line at a specific speed; S220, respectively extracting the light spot images respectively acquired by the plurality of camera modules 100 under the same frame number; S230, obtaining a series of displacement differences between the light spots 20 in the light spot images acquired by the plurality of camera modules 100 under the same frame number, based on the correspondence between the spatial coordinate systems set by the plurality of camera modules 100; S240, obtaining an average value of the series of displacement differences, and obtaining a corresponding time difference based on the moving speed of the light spot 20 and the average value of the displacement differences; and S250, in response to the time difference meeting a preset time precision, saving time sequence configuration parameters of the plurality of camera modules 100, wherein the time sequence configuration parameters are used for controlling the starting and exposure of the camera modules 100.
In step S210, a synchronization signal is sent to the plurality of camera modules 100 to start the plurality of camera modules 100 at the same time, wherein the plurality of camera modules 100 are configured to acquire images of light spots located on a parallel line of a center-point connecting line set by the plurality of camera modules 100 at the same frame rate, and the light spots 20 move along the parallel line at a specific speed. Here, in the embodiment of the present application, the type of the camera module 100 and the number of the camera modules 100 included in the multi-camera system 10 are not limited by the present application. For example, in a possible implementation manner, the multi-camera system 10 includes 2 camera modules 100 (a first camera module 100 and a second camera module 100), where the first camera module 100 is selected from any one of the infrared camera module 100, the RGB camera module 100 and the TOF camera module 100, and the second camera module 100 is selected from any one of the infrared camera module 100, the RGB camera module 100 and the TOF camera module 100. It should be understood by those skilled in the art that the first camera module 100 and the second camera module 100 can also be implemented as other types of camera modules 100, which is not limited by the present application. Meanwhile, it should be understood by those skilled in the art that the multi-camera system 10 may include a greater number of camera modules 100, for example, 3 camera modules 100 (for example, 2 RGB camera modules 100+1 TOF camera module 100, etc.), which is not limited by the present disclosure.
In step S210, the multi-camera system 10 is synchronized by the master control chip 30. Specifically, fig. 3 illustrates a schematic diagram of hardware synchronization of a plurality of camera modules by a main control chip 30 according to an embodiment of the present disclosure. As shown in fig. 3, the main control chip 30 is connected to a synchronization pin of each of the camera modules 100 through an I/O pin, and sends a synchronization signal to the corresponding camera modules 100 through the I/O pin, so that the plurality of camera modules 100 synchronously acquire image data (here, in the embodiment of the present disclosure, the image data includes images and videos). That is, in step S210, the multi-camera system 10 is synchronized by means of hardware synchronization. As described above, the preparation time from the reception of the synchronization signal to the start of exposure differs among camera modules 100, so hardware synchronization alone often does not yield high synchronization accuracy; therefore, after the hardware synchronization, detection and feedback adjustment are performed based on the effect of the hardware synchronization to improve the synchronization accuracy.
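The fan-out of a single trigger to every module's synchronization pin can be modeled in software. The class and method names below are illustrative, and this is a conceptual stand-in rather than the chip's actual interface:

```python
import time

# Software model of the hardware synchronization step (illustrative):
# the main control chip drives one I/O line, so one trigger reaches
# every registered module's synchronization input at the same nominal
# instant.

class SyncController:
    def __init__(self):
        self._start_callbacks = []

    def register(self, start_callback):
        """Stand-in for wiring a module's synchronization pin."""
        self._start_callbacks.append(start_callback)

    def send_sync(self):
        """Stand-in for the I/O-pin pulse: one timestamped trigger,
        delivered to every registered module."""
        t0 = time.monotonic()
        for start in self._start_callbacks:
            start(t0)
        return t0
```

Even though every module receives the same nominal trigger, each module's own preparation time still delays its first exposure, which is exactly the residual error the feedback loop measures.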
Specifically, in order to detect the synchronization effect of the multi-camera system 10 after the hardware synchronization, in the embodiment of the present application, the plurality of camera modules 100 are controlled to acquire, at the same frame rate, images of a light spot located on a line parallel to the line connecting the center points of the plurality of camera modules 100, wherein the light spot 20 moves along the parallel line at a specific speed. FIG. 4 illustrates a schematic diagram of a synchronous control system according to an embodiment of the present application. As shown in fig. 4, the line connecting the center points of the plurality of camera modules 100 is L, and the line parallel to it is L1. When the light spot 20 moves along the parallel line L1, the plurality of camera modules 100 can acquire images of the light spot 20. Preferably, to facilitate subsequent calculations, the parallel line L1 directly faces the center-point connecting line, so that the plane defined by the parallel line L1 and the center-point connecting line is perpendicular to the imaging planes of the camera modules 100.
It should be noted that if the light spot 20 and the plurality of camera modules 100 are arranged in the manner shown in fig. 4, when the synchronization accuracy of the plurality of camera modules 100 is high enough, the displacement difference between the light spots 20 in the light spot images captured by the plurality of camera modules 100 under the same frame number should be close to 0. In other words, in the embodiment of the present application, the hardware synchronization effect can be measured by the displacement difference between the light spots 20 in the light spot images acquired by the plurality of camera modules 100 under the same frame number.
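A toy numeric check of this measurement principle (all numbers illustrative): a residual start offset dt between modules shows up in the same-frame spot positions as a constant displacement of roughly v·dt, and vanishes when the modules are truly synchronous.

```python
# Toy model: the spot moves at constant speed v along the parallel
# line; a module that starts dt late samples the spot v * dt further
# along at every frame, while a perfectly synchronized module sees
# zero displacement difference.

def spot_positions(speed, frame_rate, n_frames, start_offset_s=0.0):
    """Spot position along the parallel line at each exposure instant."""
    return [speed * (k / frame_rate + start_offset_s) for k in range(n_frames)]

first = spot_positions(100.0, 30.0, 5)               # starts on time
second = spot_positions(100.0, 30.0, 5, 0.02)        # starts 20 ms late
diffs = [b - a for a, b in zip(first, second)]       # each ~ 100 * 0.02 = 2.0
synced = spot_positions(100.0, 30.0, 5)
zero_diffs = [b - a for a, b in zip(first, synced)]  # all 0 when in sync
```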
In step S220 and step S230, light spot images respectively acquired by the plurality of camera modules 100 under the same frame number are respectively extracted; and obtaining a series of displacement differences between the light spots 20 in the light spot images acquired by the plurality of camera modules 100 under the same frame number based on the corresponding relationship between the spatial coordinate systems set by the plurality of camera modules 100.
For convenience of explanation, the principle and process of solving the displacement difference will be described by taking the multi-camera system 10 including 2 camera modules 100 (the first camera module 100 and the second camera module 100) as an example.
Accordingly, in the embodiment of the present application, the process of solving the displacement difference includes: first, converting coordinates of the light spots in the light spot images acquired by the second camera module 100 under the same frame number, under the second spatial coordinate system, into corresponding coordinates under the first spatial coordinate system, based on a corresponding relationship between the first spatial coordinate system set by the first camera module 100 and the second spatial coordinate system set by the second camera module 100; next, based on the coordinates of the light spot 20 in the first spatial coordinate system in the light spot image acquired by the first camera module 100 and the corresponding coordinates of the light spot 20 in the first spatial coordinate system in the light spot image acquired by the second camera module 100, obtaining a series of displacement differences between the light spots 20 in the light spot images acquired by the first and second camera modules 100 under the same frame number.
That is, in the above embodiment, the coordinates of the light spot in the light spot image acquired by the second camera module 100 in the second spatial coordinate system are mapped into the first spatial coordinate system. It should be understood that, if the hardware synchronization of the multi-camera module 100 is good, the coordinates after mapping should be consistent with the coordinates of the light spot 20 in the first spatial coordinate system in the light spot image acquired by the first camera module 100, that is, the displacement difference is 0. That is, the synchronization effect of the multi-camera module 100 can be measured by the displacement difference.
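As an illustration only, the mapping and displacement-difference computation described above can be sketched as follows; the rotation R, the translation t, and the spot coordinates are hypothetical values standing in for the device calibration data, not values taken from the patent:

```python
import numpy as np

# Hypothetical extrinsic calibration between the two modules:
# R, t map a point from the second module's spatial frame into the first's.
R = np.eye(3)                      # rotation (identity here for illustration)
t = np.array([0.05, 0.0, 0.0])     # 5 cm baseline along x

def map_to_first_frame(p2):
    """Map a 3-D spot coordinate from the second camera's frame to the first's."""
    return R @ np.asarray(p2) + t

# Spot coordinates observed under the same frame number (hypothetical):
p_first  = np.array([0.10, 0.0, 1.0])   # seen by the first module, in its own frame
p_second = np.array([0.06, 0.0, 1.0])   # seen by the second module, in its own frame

mapped = map_to_first_frame(p_second)
displacement_diff = np.linalg.norm(p_first - mapped)
```

With perfect synchronization the mapped coordinate would coincide with `p_first` and `displacement_diff` would be 0; here the 1 cm residual plays the role of one of the Δx values accumulated over the frames.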
Alternatively, in another example of the embodiment of the present application, the process of solving the displacement difference includes: converting coordinates of the light spots in the light spot images acquired by the first camera module 100 under the same frame number, under the first spatial coordinate system, into corresponding coordinates under the second spatial coordinate system, based on a corresponding relationship between the first spatial coordinate system set by the first camera module 100 and the second spatial coordinate system set by the second camera module 100; further, based on the coordinates of the light spot 20 in the second spatial coordinate system in the light spot image acquired by the second camera module 100 and the corresponding coordinates of the light spot 20 in the second spatial coordinate system in the light spot image acquired by the first camera module 100, a series of displacement differences between the light spots 20 in the light spot images acquired by the first and second camera modules 100 under the same frame number are obtained.
That is, in this example, the coordinates of the light spot in the light spot image acquired by the first camera module 100 in the first spatial coordinate system are mapped into the second spatial coordinate system. It should be understood that, if the hardware synchronization of the multi-camera module 100 is good, the coordinates after mapping should be consistent with the coordinates of the light spot 20 in the second spatial coordinate system in the light spot image acquired by the second camera module 100, that is, the displacement difference is 0. That is, the synchronization effect of the multi-camera module 100 can be measured by the displacement difference.
In an implementation, the coordinates of the light spot in the light spot image acquired by the first camera module 100 in the first spatial coordinate system and the coordinates of the light spot 20 in the light spot image acquired by the second camera module 100 in the second spatial coordinate system may be obtained as follows. Correspondingly, in this embodiment of the application, the process of obtaining the coordinates of the light spot in the light spot image collected by the first camera module 100 under the first spatial coordinate system includes: reading pixel coordinates of a light spot in the light spot image acquired by the first camera module 100 in a first pixel coordinate system set by the first camera module 100; then, based on the corresponding relationship between the first pixel coordinate system set by the first camera module 100 and the first spatial coordinate system, the coordinates of the light spot in the light spot image acquired by the first camera module 100 in the first spatial coordinate system are obtained. Accordingly, in this embodiment of the application, the process of acquiring the coordinates of the light spot 20 in the light spot image acquired by the second camera module 100 under the second spatial coordinate system includes: firstly, reading pixel coordinates of light spots in the light spot image acquired by the second camera module 100 under a second pixel coordinate system set by the second camera module 100; then, based on the corresponding relationship between the second pixel coordinate system set by the second camera module 100 and the second spatial coordinate system, the coordinates of the light spot in the light spot image acquired by the second camera module 100 in the second spatial coordinate system are obtained.
It should be understood that, in the above solving process, the key point is to use the corresponding relationship between the pixel coordinate system and the space coordinate system, and the coordinates of the light spot in the image under the pixel coordinate system can be directly read, so that the coordinates of the light spot under the space coordinate system can be solved through the coordinates of the light spot under the pixel coordinate system.
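For illustration, the pixel-to-spatial conversion can be sketched as a standard pinhole back-projection; the intrinsic matrix K, the pixel coordinate, and the depth below are hypothetical values standing in for a module's calibration data:

```python
import numpy as np

# Hypothetical pinhole intrinsics for one camera module; fx, fy, cx, cy
# would come from the device calibration data mentioned in the text.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def pixel_to_spatial(u, v, depth, K):
    """Back-project a spot's pixel coordinate into the module's spatial frame."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # normalized viewing ray
    return depth * ray                               # scale by the spot's depth

# Spot read at pixel (400, 240) with an assumed depth of 2 m:
spot_xyz = pixel_to_spatial(400.0, 240.0, 2.0, K)
```

A TOF module supplies the depth directly; for an RGB module it would have to come from the known scene geometry.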
Of course, those skilled in the art should understand that in the embodiment of the present application, the coordinates of the light spot 20 in the corresponding spatial coordinate system may also be obtained by other ways, which is not limited by the present application.
In step S240, an average value of the series of displacement differences is obtained, and a corresponding time difference is obtained based on the moving speed of the light spot 20 and the average value of the displacement differences. Accordingly, a series of displacement differences under the same frame number can be obtained through steps S220 and S230: Δx1, Δx2, …, Δxn (where n denotes the number of camera modules 100 minus 1; that is, if the multi-camera system 10 comprises 3 camera modules 100, 2 displacement differences, Δx1 and Δx2, can be obtained under the same frame number). Because the light spot 20 moves along the parallel line at a specific speed v, a time difference Δt2 for measuring the synchronization effect can be obtained based on the specific speed and the average value of the displacement differences.
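As a numerical illustration of step S240 (the displacement differences and spot speed below are hypothetical):

```python
# Hypothetical displacement differences (metres) measured under the same
# frame numbers, and the known spot speed v (metres per second).
diffs = [0.0021, 0.0019, 0.0020]
v = 1.0

dx_avg = sum(diffs) / len(diffs)   # average displacement difference
dt2 = dx_avg / v                   # time difference measuring the sync error
```

The time difference follows directly from the displacement–speed–time relation Δt2 = Δx_avg / v.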
In step S250, in response to the time difference satisfying a preset time precision, the time sequence configuration parameters of the plurality of camera modules 100 are saved, where the time sequence configuration parameters are used to control the start and exposure of the camera modules 100. Correspondingly, the time difference is compared with the preset time precision, and if the time difference satisfies the preset time precision, the time sequence configuration parameters of the plurality of camera modules 100 are saved; that is, the synchronization effect of the plurality of camera modules 100 meets the preset requirement. For example, the time difference Δt2 is compared with one fifth of the required time precision, and if Δt2 is within one fifth of the required time precision, the time sequence configuration parameters of the plurality of camera modules 100 are saved.
If the time difference does not meet the preset time precision, the time sequence configuration parameters of the plurality of camera modules 100 are adjusted according to the time difference, and a new round of the synchronization control process is started, until the time difference in the new round satisfies the preset time precision. After the time sequence configuration parameters of the camera modules 100 are adjusted, the pin operation time intervals of the camera modules 100 are fine-tuned to optimize the synchronization effect.
Fig. 5 illustrates a timing diagram of the synchronous control of the plurality of camera modules 100 after being processed by the synchronous control method according to the embodiment of the present application. As shown in fig. 5, after the precise synchronous calibration is completed, the main control chip reads the calibrated time fine tuning parameter Δ t, and sends a synchronization signal to the second camera module 100 after delaying the synchronization signal sent by the first camera module 100 by the time Δ t, so that the first camera module 100 and the second camera module 100 are exposed at the same time, and synchronous exposure of each frame of picture is achieved.
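The closed-loop fine-tuning described above can be sketched as follows; `synchronize` and `measure_dt` are hypothetical names, and the simulated 3 ms offset is illustrative only:

```python
def synchronize(measure_dt, required_precision, max_rounds=10):
    """Closed-loop second-stage synchronization sketch (hypothetical API).

    measure_dt(delay) runs one spot-capture round with the second module's
    sync signal delayed by `delay` and returns the measured time difference.
    Returns the delay to store as the time sequence configuration parameter.
    """
    delay = 0.0
    for _ in range(max_rounds):
        dt2 = measure_dt(delay)
        if abs(dt2) <= required_precision / 5:   # compare with one fifth of precision
            return delay                         # save the calibrated fine-tune delay
        delay += dt2                             # adjust the pin operation interval
    raise RuntimeError("synchronization did not converge")

# Simulated rig with a true 3 ms offset between the two modules:
calibrated = synchronize(lambda d: 0.003 - d, required_precision=0.001)
```

The returned delay corresponds to the calibrated time fine-tuning parameter Δt that the main control chip applies before sending the second module's synchronization signal.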
In summary, the synchronization control method according to the embodiment of the present application has been illustrated, in which adaptive synchronization adjustment is realized through closed-loop feedback on the synchronization result of the video streams acquired by the plurality of camera modules 100, so as to obtain higher synchronization precision.
Fig. 6 illustrates another flow chart of the synchronization control method according to the embodiment of the application. As shown in fig. 6, the synchronization control method includes: firstly, reading the device calibration data, and parsing from the calibration data the spatial-point mapping relationship between the coordinate systems of the plurality of cameras; further, simultaneously starting the plurality of camera modules 100 to capture at the same high frame rate; then, generating a light spot located on a line parallel to the center-point connecting line of the plurality of cameras, moving the light spot 20 along the parallel line at a constant speed v, and respectively acquiring and storing the respective video frame data; then, the processor extracts images of the same frame number, and calculates the displacement differences Δx1, Δx2, …, Δxn of the light spot among the plurality of camera modules 100 according to the spatial-point mapping relationship; next, the average xavg of the n sets of displacement differences is calculated, and a time difference Δt2 is calculated according to the relationship among speed, time and displacement; then, Δt2 is compared with one fifth of the required time precision, and if one fifth of the required time precision is not met, the pin operation time intervals of the TOF and RGB modules are fine-tuned according to Δt2 and the second-stage synchronization is performed again; finally, the pin time sequence configuration parameter B is saved into the storage device of the equipment to complete the second-stage synchronization.
Exemplary synchronization control device
According to another aspect of the present application, the present application further provides a synchronization control apparatus 700.
Fig. 7 illustrates a block diagram schematic of a synchronization control apparatus for a multi-camera system according to an embodiment of the present application. As shown in fig. 7, the synchronization control apparatus 700 includes: a synchronization unit 710, configured to send a synchronization signal to the multiple camera modules to start the multiple camera modules simultaneously, where the multiple camera modules are configured to acquire images of light spots located on a parallel line of a center-point connecting line set by the multiple camera modules at a same frame rate, where the light spots move along the parallel line at a specific speed; the extracting unit 720 is configured to extract the light spot images respectively acquired by the plurality of camera modules under the same frame number; a displacement difference obtaining unit 730, configured to obtain a series of displacement differences between the light spots in the light spot images acquired by the plurality of camera modules under the same frame number, based on a correspondence relationship between the spatial coordinate systems set by the plurality of camera modules; a time difference obtaining unit 740, configured to obtain an average value of the series of displacement differences, and obtain a corresponding time difference based on the moving speed of the light spot and the average value of the displacement differences; and a synchronization determining unit 750, configured to save a time sequence configuration parameter of the plurality of camera modules in response to that the time difference satisfies a preset time precision, where the time sequence configuration parameter is used to control the start and exposure of the camera modules.
In the synchronization control apparatus 700, in an embodiment of the present application, the synchronization determining unit 750 is further configured to: responding to the time difference being larger than the preset time precision, and adjusting the time sequence configuration parameters of the plurality of camera modules based on the time difference; starting a new round of synchronous control process until the time difference in the new round of synchronous control process meets the preset time precision; and responding to the time difference meeting the preset time precision, and saving the time sequence configuration parameters of the plurality of camera modules.
In the synchronous control device 700, in an embodiment of the present application, the plurality of camera modules include a first camera module and a second camera module, wherein the first camera module is selected from any one of an infrared camera module, an RGB camera module, and a TOF camera module, and the second camera module is selected from any one of an infrared camera module, an RGB camera module, and a TOF camera module.
In the above synchronous control device 700, in an embodiment of the present application, the displacement difference obtaining unit 730 is further configured to: convert coordinates of light spots in the light spot images acquired by the second camera module under the same frame number, under the second space coordinate system, into corresponding coordinates under the first space coordinate system, based on a corresponding relationship between the first space coordinate system set by the first camera module and the second space coordinate system set by the second camera module; and obtain a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number, based on the coordinates of the light spots in the light spot image acquired by the first camera module under the first space coordinate system and the corresponding coordinates of the light spots in the light spot image acquired by the second camera module under the first space coordinate system.
In the above synchronous control device 700, in an embodiment of the present application, the displacement difference obtaining unit 730 is further configured to: convert coordinates of light spots in the light spot images collected by the first camera module under the same frame number, under the first space coordinate system, into corresponding coordinates under the second space coordinate system, based on a corresponding relationship between the first space coordinate system set by the first camera module and the second space coordinate system set by the second camera module; and obtain a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number, based on the coordinates of the light spots in the light spot images acquired by the second camera module under the second space coordinate system and the corresponding coordinates of the light spots in the light spot images acquired by the first camera module under the second space coordinate system.
In the above synchronous control device 700, in an embodiment of the present application, the displacement difference obtaining unit 730 is further configured to: reading pixel coordinates of light spots in the light spot image acquired by the second camera module under a second pixel coordinate system set by the second camera module; and obtaining the coordinates of the light spot in the light spot image acquired by the second camera module under the second space coordinate system based on the corresponding relation between the second pixel coordinate system set by the second camera module and the second space coordinate system.
In the above synchronous control device 700, in an embodiment of the present application, the displacement difference obtaining unit 730 is further configured to: reading pixel coordinates of light spots in the light spot image collected by the first camera module under a first pixel coordinate system set by the first camera module; and obtaining the coordinates of the light spot in the light spot image acquired by the first camera module under the first space coordinate system based on the corresponding relation between the first pixel coordinate system set by the first camera module and the first space coordinate system.
Here, it can be understood by those skilled in the art that the specific functions and operations of the respective units and modules in the above-described synchronization control apparatus 700 have been described in detail in the synchronization control method described above with reference to fig. 2 to 6, and thus, a repetitive description thereof will be omitted.
Illustrative electronic device
Next, an electronic apparatus according to an embodiment of the present application is described with reference to fig. 8.
FIG. 8 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
As shown in fig. 8, the electronic device 10 includes one or more processors 11 and memory 12.
The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 10 to perform desired functions.
Memory 12 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 11 to implement the synchronization control methods of the various embodiments of the present application described above and/or other desired functions. Various content such as calibration parameters may also be stored in the computer readable storage medium.
In one example, the electronic device 10 may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The input device 13 may be, for example, a keyboard, a mouse, or the like.
The output device 14 can output various information including timing charts to the outside. The output devices 14 may include, for example, a display, speakers, a printer, and a communication network and its connected remote output devices, among others.
Of course, for simplicity, only some of the components of the electronic device 10 relevant to the present application are shown in fig. 8, and components such as buses, input/output interfaces, and the like are omitted. In addition, the electronic device 10 may include any other suitable components depending on the particular application.
Illustrative computer program product
In addition to the above-described methods and apparatus, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the synchronization control method according to various embodiments of the present application described in the "exemplary methods" section of this specification, supra.
The computer program product may write program code for carrying out operations for embodiments of the present application in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, as well as conventional procedural programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform steps in a synchronization control method according to various embodiments of the present application described in the "exemplary methods" section above of this specification.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present application in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present application are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the foregoing disclosure is not intended to be exhaustive or to limit the disclosure to the precise details disclosed.
The block diagrams of devices, apparatuses, systems referred to in this application are only given as illustrative examples and are not intended to require or imply that the connections, arrangements, configurations, etc. must be made in the manner shown in the block diagrams. These devices, apparatuses, systems may be connected, arranged, configured in any manner, as will be appreciated by those skilled in the art. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The word "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to."
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (15)

1. A synchronous control method for a multi-camera system, wherein the multi-camera system includes a plurality of camera modules, comprising:
sending a synchronization signal to the plurality of camera modules to start the plurality of camera modules simultaneously, wherein the plurality of camera modules are set to acquire images of light spots on a parallel line of a central point connecting line set by the plurality of camera modules at the same frame rate, and the light spots move along the parallel line at a specific speed;
respectively extracting light spot images respectively collected by the plurality of camera modules under the same frame number;
based on the corresponding relation among the space coordinate systems set by the plurality of camera modules, obtaining a series of displacement differences among the light spots in the light spot images acquired by the plurality of camera modules under the same frame number;
acquiring an average value of the series of displacement differences, and acquiring a corresponding time difference based on the moving speed of the light spot and the average value of the displacement differences; and
and responding to the time difference meeting the preset time precision, and saving time sequence configuration parameters of the plurality of camera modules, wherein the time sequence configuration parameters are used for controlling the starting and exposure of the camera modules.
2. The synchronization control method according to claim 1, further comprising:
responding to the time difference being larger than the preset time precision, and adjusting the time sequence configuration parameters of the plurality of camera modules based on the time difference;
starting a new round of synchronous control process until the time difference in the new round of synchronous control process meets the preset time precision; and
and responding to the time difference meeting the preset time precision, and saving the time sequence configuration parameters of the plurality of camera modules.
3. The synchronous control method according to claim 1 or 2, wherein the plurality of camera modules comprise a first camera module and a second camera module, wherein the first camera module is selected from any one of an infrared camera module, an RGB camera module, and a TOF camera module, and the second camera module is selected from any one of an infrared camera module, an RGB camera module, and a TOF camera module.
4. The synchronous control method according to claim 3, wherein obtaining a series of displacement differences between the light spots in the light spot images acquired by the plurality of camera modules under the same frame number based on the correspondence between the spatial coordinate systems set by the plurality of camera modules comprises:
converting coordinates of light spots in the light spot images acquired by the second camera module under the same frame number under the second space coordinate system into corresponding coordinates under the first space coordinate system based on a corresponding relationship between the first space coordinate system set by the first camera module and the second space coordinate system set by the second camera module; and
and obtaining a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number based on the coordinates of the light spots in the light spot images acquired by the first camera module under a first spatial coordinate system and the corresponding coordinates of the light spots in the light spot images acquired by the second camera module under the first spatial coordinate system.
5. The synchronous control method according to claim 3, wherein obtaining a series of displacement differences between the light spots in the light spot images acquired by the plurality of camera modules under the same frame number based on the correspondence between the spatial coordinate systems set by the plurality of camera modules comprises:
converting the coordinates of the light spots in the light spot images collected by the first camera module under the same frame number under the first space coordinate system into corresponding coordinates under a second space coordinate system based on the corresponding relationship between the first space coordinate system set by the first camera module and the second space coordinate system set by the second camera module; and
and obtaining a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number based on the coordinates of the light spots in the light spot images acquired by the second camera module under a second spatial coordinate system and the corresponding coordinates of the light spots in the light spot images acquired by the first camera module under the second spatial coordinate system.
6. The synchronization control method according to claim 4 or 5, further comprising:
reading pixel coordinates of light spots in the light spot image acquired by the second camera module under a second pixel coordinate system set by the second camera module; and
and obtaining the coordinates of the light spot in the light spot image acquired by the second camera module under the second space coordinate system based on the corresponding relation between the second pixel coordinate system set by the second camera module and the second space coordinate system.
7. The synchronization control method according to claim 4 or 5, further comprising:
reading pixel coordinates of light spots in the light spot image collected by the first camera module under a first pixel coordinate system set by the first camera module; and
and obtaining the coordinates of the light spot in the light spot image acquired by the first camera module under the first space coordinate system based on the corresponding relation between the first pixel coordinate system set by the first camera module and the first space coordinate system.
8. A synchronous control device for a plurality of camera modules, characterized by comprising:
the system comprises a synchronization unit, a control unit and a control unit, wherein the synchronization unit is used for sending a synchronization signal to a plurality of camera modules so as to start the camera modules at the same time, the camera modules are arranged to acquire images of light spots on parallel lines of central point connecting lines set by the camera modules at the same frame rate, and the light spots move along the parallel lines at a specific speed;
the extraction unit is used for respectively extracting the light spot images respectively collected by the plurality of camera modules under the same frame number;
a displacement difference obtaining unit, configured to obtain a series of displacement differences between the light spots in the light spot images acquired by the plurality of camera modules under the same frame number, based on a correspondence between spatial coordinate systems set by the plurality of camera modules;
the time difference acquisition unit is used for acquiring the average value of the series of displacement differences and acquiring the corresponding time difference based on the moving speed of the light spot and the average value of the displacement differences; and
and the synchronization determining unit is used for responding to the time difference meeting the preset time precision and storing the time sequence configuration parameters of the plurality of camera modules, wherein the time sequence configuration parameters are used for controlling the starting and the exposure of the camera modules.
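The chain from the displacement difference acquisition unit to the time difference acquisition unit in claim 8 reduces to averaging the per-frame displacement differences and dividing by the known spot speed. A minimal sketch with illustrative values (the function name and the millimeter units are assumptions, not from the patent):

```python
def time_difference(displacement_diffs, spot_speed):
    # Average the per-frame displacement differences between modules,
    # then convert displacement to time via the known spot speed.
    mean_disp = sum(displacement_diffs) / len(displacement_diffs)
    return mean_disp / spot_speed

# Illustrative: spot moves at 100 mm/s; per-frame differences in mm.
dt = time_difference([1.5, 2.5, 2.0, 2.0], 100.0)
print(dt)  # → 0.02, i.e. a 20 ms offset between the modules
```

Averaging over a series of frames, as the claim specifies, suppresses per-frame spot-detection noise before the displacement is converted to a time offset.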
9. The synchronization control apparatus according to claim 8, wherein the synchronization determination unit is further configured to:
in response to the time difference being greater than the preset time precision, adjust the timing configuration parameters of the plurality of camera modules based on the time difference;
start a new round of the synchronization control process until the time difference in the new round meets the preset time precision; and
in response to the time difference meeting the preset time precision, save the timing configuration parameters of the plurality of camera modules.
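Claim 9's loop can be sketched as: measure the time difference, adjust the timing configuration, and repeat until the difference meets the preset precision. `measure_time_difference` and the single delay-offset parameter below are hypothetical stand-ins for the patent's timing configuration parameters:

```python
def synchronize(measure_time_difference, precision=0.001, max_rounds=10):
    # Iteratively adjust a delay offset until the measured time
    # difference between modules is within the preset precision.
    delay_offset = 0.0
    for _ in range(max_rounds):
        dt = measure_time_difference(delay_offset)
        if abs(dt) <= precision:
            return delay_offset  # offset worth saving in the timing config
        delay_offset -= dt       # compensate the measured lag
    raise RuntimeError("did not converge within max_rounds")

# Toy measurement model: the second module lags by a fixed 20 ms.
offset = synchronize(lambda off: 0.020 + off)
print(round(offset, 6))  # → -0.02
```

Each iteration corresponds to one "round of the synchronization control process" in the claim; the saved return value plays the role of the stored timing configuration.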
10. The synchronization control apparatus according to claim 8 or 9, wherein the plurality of camera modules comprise a first camera module and a second camera module, the first camera module being any one of an infrared camera module, an RGB camera module, and a TOF camera module, and the second camera module being any one of an infrared camera module, an RGB camera module, and a TOF camera module.
11. The synchronization control apparatus according to claim 10, wherein the displacement difference acquisition unit is further configured to:
convert the coordinates, under the second spatial coordinate system, of the light spots in the light spot images acquired by the second camera module under the same frame number into corresponding coordinates under the first spatial coordinate system, based on the correspondence between the first spatial coordinate system set by the first camera module and the second spatial coordinate system set by the second camera module; and
obtain a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number, based on the coordinates of the light spots in the light spot images acquired by the first camera module under the first spatial coordinate system and the corresponding coordinates of the light spots in the light spot images acquired by the second camera module under the first spatial coordinate system.
12. The synchronization control apparatus according to claim 10, wherein the displacement difference acquisition unit is further configured to:
convert the coordinates, under the first spatial coordinate system, of the light spots in the light spot images acquired by the first camera module under the same frame number into corresponding coordinates under the second spatial coordinate system, based on the correspondence between the first spatial coordinate system set by the first camera module and the second spatial coordinate system set by the second camera module; and
obtain a series of displacement differences between the light spots in the light spot images acquired by the first and second camera modules under the same frame number, based on the coordinates of the light spots in the light spot images acquired by the second camera module under the second spatial coordinate system and the corresponding coordinates of the light spots in the light spot images acquired by the first camera module under the second spatial coordinate system.
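The cross-module conversion in claims 11 and 12 amounts to a rigid transform between the two spatial coordinate systems. A sketch under the assumption that the extrinsic rotation `R` and translation `t` between the modules are known from calibration (the values below are illustrative):

```python
def transform_point(point, rotation, translation):
    # p' = R @ p + t: map a spot coordinate from one module's spatial
    # coordinate system into the other module's spatial coordinate system.
    return tuple(
        sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Illustrative extrinsics: identity rotation, 0.5 m baseline along x.
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = (0.5, 0.0, 0.0)
print(transform_point((1.0, 2.0, 3.0), R, t))  # → (1.5, 2.0, 3.0)
```

Once both modules' spot coordinates live in one spatial coordinate system, the per-frame displacement difference is simply the distance between the two transformed points.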
13. The synchronization control apparatus according to claim 11 or 12, wherein the displacement difference acquisition unit is further configured to:
read pixel coordinates of the light spots in the light spot image acquired by the second camera module under the second pixel coordinate system set by the second camera module; and
obtain the coordinates of the light spot in the light spot image acquired by the second camera module under the second spatial coordinate system, based on the correspondence between the second pixel coordinate system set by the second camera module and the second spatial coordinate system.
14. The synchronization control apparatus according to claim 11 or 12, wherein the displacement difference acquisition unit is further configured to:
read pixel coordinates of the light spots in the light spot image acquired by the first camera module under the first pixel coordinate system set by the first camera module; and
obtain the coordinates of the light spot in the light spot image acquired by the first camera module under the first spatial coordinate system, based on the correspondence between the first pixel coordinate system set by the first camera module and the first spatial coordinate system.
15. An electronic device, comprising:
a processor; and
a memory, wherein computer program instructions are stored in the memory which, when executed by the processor, cause the processor to carry out the synchronization control method according to any one of claims 1-7.
CN201911398576.7A 2019-12-30 2019-12-30 Synchronous control method and synchronous control device for multi-camera system and electronic equipment Active CN113132551B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911398576.7A CN113132551B (en) 2019-12-30 2019-12-30 Synchronous control method and synchronous control device for multi-camera system and electronic equipment

Publications (2)

Publication Number Publication Date
CN113132551A true CN113132551A (en) 2021-07-16
CN113132551B CN113132551B (en) 2023-08-08

Family

ID=76768092

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911398576.7A Active CN113132551B (en) 2019-12-30 2019-12-30 Synchronous control method and synchronous control device for multi-camera system and electronic equipment

Country Status (1)

Country Link
CN (1) CN113132551B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107087120A (en) * 2017-06-22 2017-08-22 Institute of Computing Technology, Chinese Academy of Sciences A method and system for synchronizing multiple CCD video cameras
US20170302826A1 (en) * 2016-04-15 2017-10-19 General Electric Company Synchronous sampling methods for infrared cameras
CN107948463A (en) * 2017-11-30 2018-04-20 Beijing TuSimple Future Technology Co., Ltd. A camera synchronization method, apparatus, and system
CN110248111A (en) * 2018-04-28 2019-09-17 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Shooting control method and apparatus, electronic device, and computer-readable storage medium

Also Published As

Publication number Publication date
CN113132551B (en) 2023-08-08

Similar Documents

Publication Publication Date Title
US10334151B2 (en) Phase detection autofocus using subaperture images
US10122998B2 (en) Real time sensor and method for synchronizing real time sensor data streams
CN108028887B (en) Photographing focusing method, device and equipment for terminal
WO2018153313A1 (en) Stereoscopic camera and height acquisition method therefor and height acquisition system
US8441518B2 (en) Imaging apparatus, imaging control method, and recording medium
EP3771198B1 (en) Target tracking method and device, movable platform and storage medium
US11769266B2 (en) Depth image engine and depth image calculation method
US11238273B2 (en) Data processing method and apparatus, electronic device and storage medium
WO2022183685A1 (en) Target detection method, electronic medium and computer storage medium
US11019325B2 (en) Image processing method, computer device and readable storage medium
JP2019190974A (en) Calibration device, calibration method and program
JP2006226965A (en) Image processing system, computer program and image processing method
CN113281780B (en) Method and device for marking image data and electronic equipment
WO2022147655A1 (en) Positioning method and apparatus, spatial information acquisition method and apparatus, and photographing device
CN113159161A (en) Target matching method and device, equipment and storage medium
CN113132551A (en) Synchronous control method and device of multi-camera system and electronic equipment
CN111089579B (en) Heterogeneous binocular SLAM method and device and electronic equipment
JP6483661B2 (en) Imaging control apparatus, imaging control method, and program
CN111179331A (en) Depth estimation method, depth estimation device, electronic equipment and computer-readable storage medium
CN114627174A (en) Depth map generation system and method and autonomous mobile device
CN111212239B (en) Exposure time length adjusting method and device, electronic equipment and storage medium
JPH09145368A (en) Moving and tracing method for object by stereoscopic image
CN114445591A (en) Map construction method, system, device and computer storage medium
CN109587303B (en) Electronic equipment and mobile platform
CN115690469A (en) Binocular image matching method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant