CN113177440A - Image synchronization method and device, electronic equipment and computer storage medium - Google Patents


Info

Publication number
CN113177440A
CN113177440A (application CN202110382446.5A; granted publication CN113177440B)
Authority
CN
China
Prior art keywords: images, same moving, image, moving object, determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110382446.5A
Other languages
Chinese (zh)
Other versions
CN113177440B (en)
Inventor
解德鹏
李若岱
朱旭荣
王杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yuanluobu Intelligent Technology Co ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd
Priority to CN202110382446.5A (CN113177440B)
Publication of CN113177440A
Priority to PCT/CN2022/083308 (WO2022213833A1)
Priority to TW111112467A (TW202240462A)
Application granted
Publication of CN113177440B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/46: Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; salient regional features
    • G06V10/462: Salient features, e.g. scale invariant feature transforms [SIFT]

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Embodiments of the disclosure provide an image synchronization method and apparatus, an electronic device, and a computer storage medium. The method includes: acquiring at least two images captured by at least two image acquisition devices, where each of the at least two images corresponds to a different image acquisition device; performing target recognition on the at least two images and determining the positions of a same moving target in the at least two images; determining the delay time between the at least two images according to the positions of the same moving target in the at least two images and predetermined motion state information of the same moving target; and performing time synchronization processing on the at least two image acquisition devices according to the delay time.

Description

Image synchronization method and device, electronic equipment and computer storage medium
Technical Field
The present disclosure relates to computer vision processing technologies, and in particular, to an image synchronization method and apparatus, an electronic device, and a computer storage medium.
Background
At present, for an image acquisition system that integrates multiple image acquisition devices, measuring the image delay between the different devices is a key problem of system integration. To determine the image delay time between different image acquisition devices, the related art generally records or screen-captures video from the different devices and then determines the image delay by counting video frames or by human-eye observation; such methods are time-consuming, labor-intensive, inefficient, and inaccurate.
Disclosure of Invention
Embodiments of the present disclosure are intended to provide a technical solution for image synchronization.
The embodiment of the disclosure provides an image synchronization method, which includes:
acquiring at least two images captured by at least two image acquisition devices, wherein the image acquisition devices corresponding to the images in the at least two images are different;
Carrying out target identification on the at least two images, and determining the positions of the same moving target in the at least two images;
determining the delay time between the at least two images according to the positions of the same moving target in the at least two images and the predetermined motion state information of the same moving target;
and according to the delay time, performing time synchronization processing on the at least two image acquisition devices.
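The four steps above can be organized in code as follows. This is a minimal sketch: the device API (`capture`, `apply_time_offset`) and the motion-state helper (`delay_between`) are hypothetical placeholders, since the disclosure does not define concrete interfaces.

```python
def synchronize_cameras(cameras, detect_target, motion_state):
    """Sketch of the four-step method; all device/state APIs are hypothetical.

    cameras       -- at least two image acquisition devices
    detect_target -- returns the same moving target's position in an image
    motion_state  -- predetermined motion state info of the moving target
    """
    # Step 101: acquire one image from each image acquisition device
    images = [cam.capture() for cam in cameras]

    # Step 102: target identification -> positions of the same moving target
    positions = [detect_target(img) for img in images]

    # Step 103: delay of each image relative to the first one, derived from
    # the positions and the predetermined motion state information
    delays = [motion_state.delay_between(positions[0], p)
              for p in positions[1:]]

    # Step 104: time-synchronize the remaining devices to the first one
    for cam, delay in zip(cameras[1:], delays):
        cam.apply_time_offset(delay)
    return delays
```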
In some embodiments, the method further comprises:
and acquiring the motion state information of the same motion target in uniform motion.
In some embodiments, the obtaining motion state information of the same moving object in uniform motion includes:
and acquiring first speed information of the same moving target in uniform circular motion, and taking the first speed information as the motion state information.
In some embodiments, the determining a delay time between the at least two images according to the position of the same moving object in the at least two images and the predetermined motion state information of the same moving object includes:
determining a polar angle of the same moving object in each image of the at least two images according to the position of the same moving object in the at least two images, wherein the polar angle of the same moving object in each image represents: an included angle between a connecting line of motion center points of uniform circular motion from the same moving object in each image and a reference direction;
determining a polar angle difference of two polar angles according to the two polar angles of any two images of the same moving target in the at least two images;
determining a delay time between any two images of the at least two images according to the polar angle difference and the first rate information.
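The three steps above can be sketched as follows. Assumptions not stated in the text: the motion center point is known in each image, the reference direction is the polar axis, and the first rate information is expressed as an angular speed in degrees per second; the function names are illustrative.

```python
import math

def polar_angle_deg(position, center, reference_deg=0.0):
    """Polar angle of the target in one image: the angle (degrees) between
    the line from the circular-motion center to the target and the
    reference direction (the polar axis)."""
    dx = position[0] - center[0]
    dy = position[1] - center[1]
    return (math.degrees(math.atan2(dy, dx)) - reference_deg) % 360.0

def delay_from_angles(angle_a_deg, angle_b_deg, degrees_per_second):
    """Delay between two images: polar angle difference divided by the
    angular speed of the uniform circular motion (first rate information)."""
    return (angle_b_deg - angle_a_deg) / degrees_per_second
```

For example, a target seen at 0 degrees in one image and 90 degrees in the other, rotating at 45 degrees per second, yields a delay of 2 seconds.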
In some embodiments, the obtaining motion state information of the same moving object in uniform motion includes:
and acquiring second speed information of the same moving object in uniform linear motion, and taking the second speed information as the motion state information.
In some embodiments, the determining a delay time between the at least two images according to the position of the same moving object in the at least two images and the predetermined motion state information of the same moving object includes:
determining the relative distance between any two images of the same moving target in the at least two images according to the positions of the same moving target in the at least two images;
and determining the delay time between any two images of the at least two images according to the relative distance and the second speed information.
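The two steps above (relative distance, then distance divided by speed) can be sketched as follows. One assumption beyond the text: the two positions are 2-D coordinates expressed in a common metric frame, so that the Euclidean distance between them is meaningful.

```python
def linear_delay(pos_a, pos_b, speed):
    """Delay between two images of a target in uniform linear motion:
    the relative distance between its two detected positions divided by
    the second speed information (the target's constant speed)."""
    dx = pos_b[0] - pos_a[0]
    dy = pos_b[1] - pos_a[1]
    return (dx * dx + dy * dy) ** 0.5 / speed
```

For example, positions (0, 0) and (3, 4) with a speed of 2.5 units per second give a relative distance of 5 units and hence a delay of 2 seconds.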
In some embodiments, the method further comprises:
and obtaining the movement speed change information of the same moving target, and taking the movement speed change information as the movement state information.
In some embodiments, the at least two image capturing devices comprise thermal imagers, and the moving object comprises a hollowed-out portion.
An embodiment of the present disclosure further provides an image synchronization apparatus, including:
an acquisition module, configured to acquire at least two images captured by at least two image acquisition devices, where the image acquisition devices corresponding to the images in the at least two images are different;
the first processing module is used for carrying out target identification on the at least two images and determining the positions of the same moving target in the at least two images; determining the delay time between the at least two images according to the positions of the same moving target in the at least two images and the predetermined motion state information of the same moving target;
and the second processing module is used for carrying out time synchronization processing on the at least two image acquisition devices according to the delay time.
The disclosed embodiments also provide an electronic device comprising a processor and a memory for storing a computer program capable of running on the processor; wherein,
the processor is configured to run the computer program to perform any one of the image synchronization methods described above.
The disclosed embodiments also provide a computer storage medium having a computer program stored thereon, which when executed by a processor implements any of the image synchronization methods described above.
In the image synchronization method, the image synchronization device, the electronic device and the computer storage medium provided by the embodiment of the disclosure, at least two images acquired by at least two image acquisition devices are acquired, wherein the image acquisition devices corresponding to the images in the at least two images are different; carrying out target identification on the at least two images, and determining the positions of the same moving target in the at least two images; determining the delay time between the at least two images according to the positions of the same moving target in the at least two images and the predetermined motion state information of the same moving target; and according to the delay time, performing time synchronization processing on the at least two image acquisition devices.
It can be seen that, according to the embodiments of the present disclosure, the delay time between at least two images can be determined from the position difference of the same moving target in the at least two images and the predetermined motion state information of that target, thereby achieving time synchronization of the at least two image acquisition devices. Because the image delay between different image acquisition devices no longer needs to be determined by counting video frames or by human-eye observation, the delay time between the at least two images can be determined quickly and accurately, which improves the efficiency of measuring the image delay between different image acquisition devices. Compared with the related-art schemes based on frame counting or human-eye observation, the method is accurate and easy to implement.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart of an image synchronization method of an embodiment of the present disclosure;
FIG. 2A is a diagram of a turntable image acquired by an image acquisition device according to an embodiment of the present disclosure;
FIG. 2B is a turntable image captured using another infrared camera in an embodiment of the present disclosure;
FIG. 3A is a turntable image captured with an RGB camera in an embodiment of the disclosure;
FIG. 3B is a turntable image captured by an infrared camera in an embodiment of the present disclosure;
FIG. 3C is a turntable image acquired by a thermal infrared imager in an embodiment of the disclosure;
FIG. 4 is a schematic diagram illustrating a structure of an image synchronization apparatus according to an embodiment of the disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
The present disclosure will be described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the examples provided herein are merely illustrative of the present disclosure and are not intended to limit the present disclosure. In addition, the embodiments provided below are some embodiments for implementing the disclosure, not all embodiments for implementing the disclosure, and the technical solutions described in the embodiments of the disclosure may be implemented in any combination without conflict.
It should be noted that, in the embodiments of the present disclosure, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, so that a method or apparatus including a series of elements includes not only the explicitly recited elements but also other elements not explicitly listed or inherent to the method or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other related elements (e.g., steps in a method or units in an apparatus, such as parts of circuits, processors, programs, or software) in the method or apparatus that includes that element.
For example, the image synchronization method provided by the embodiment of the present disclosure includes a series of steps, but the image synchronization method provided by the embodiment of the present disclosure is not limited to the described steps, and similarly, the image synchronization apparatus provided by the embodiment of the present disclosure includes a series of modules, but the apparatus provided by the embodiment of the present disclosure is not limited to include the explicitly described modules, and may also include modules that are required to be configured to acquire related information or perform processing based on the information.
The term "and/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean that A exists alone, that A and B exist simultaneously, or that B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality; for example, including at least one of A, B, and C may mean including any one or more elements selected from the set consisting of A, B, and C.
The disclosed embodiments may be implemented in computer systems comprising terminals and/or servers and may be operational with numerous other general purpose or special purpose computing system environments or configurations. Here, the terminal may be a thin client, a thick client, a hand-held or laptop device, a microprocessor-based system, a set-top box, a programmable consumer electronics, a network personal computer, a small computer system, etc., and the server may be a server computer system, a small computer system, a mainframe computer system, a distributed cloud computing environment including any of the above, etc.
The electronic devices of the terminal, server, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
In some embodiments of the present disclosure, an image synchronization method is provided, which may be applied to an electronic device integrated with a multi-view camera system, and the embodiments of the present disclosure may be applied to security monitoring, non-contact human body temperature measurement, unmanned aerial vehicle shooting, fast temperature measurement entrance guard passage and other scenes.
Fig. 1 is a flowchart of an image synchronization method according to an embodiment of the present disclosure, and as shown in fig. 1, the flowchart may include:
step 101: the method comprises the steps of obtaining at least two images collected by at least two image collecting devices, wherein the image collecting devices corresponding to the images in the at least two images are different.
In an embodiment of the present disclosure, the at least two image capturing devices may be devices in a multi-view camera, and the at least two image capturing devices may include at least two of the following devices: the device comprises equipment for collecting Red-Green-Blue (RGB) images, an infrared camera and a thermal imager.
In an embodiment of the present disclosure, the at least two images represent images captured by different image capturing devices; exemplarily, in a case that the at least two images are two images, the image capturing devices corresponding to the two images are any two devices among a device for capturing RGB images, an infrared camera, and a thermal imager, respectively; illustratively, in the case that at least two images are three images, the image capturing devices corresponding to the three images are a device for capturing RGB images, an infrared camera and a thermal imager, respectively.
In an embodiment of the present disclosure, the at least two image capturing devices are configured to capture images of a same scene, and the images captured by the at least two image capturing devices may include a same object, and the same object may be an object in a moving state or an object in a static state.
In some embodiments, in the electronic device integrated with the multi-view camera system, a shooting instruction may be issued to the at least two image acquisition devices at the same time, and the at least two image acquisition devices capture images according to the received instruction. It can be understood that, owing to factors such as jitter on the communication links between components in the electronic device, there may be a time delay between the at least two captured images; in this case, the delay time between images among the at least two images needs to be determined, so that the at least two image acquisition devices can be synchronized and images subsequently acquired from them remain time-synchronized.
In some embodiments, after the at least two images are acquired, the at least two images may be stored in a memory of the electronic device for later recall.
Step 102: and carrying out target identification on the at least two images, and determining the positions of the same moving target in the at least two images.
In the embodiment of the present disclosure, the same moving object belongs to the same object; in some embodiments, the same moving object may be a pre-specified object. For example, the moving object may be a pointer, a remote control car, or other objects that can move according to a preset rule.
In some embodiments, object recognition may be performed on each of the at least two images to identify the location of the same moving object in each image.
In some embodiments, features may be extracted from each image based on machine vision techniques, such that the same moving object is identified from the extracted features in each image, and the position of the same moving object in each image is determined.
For example, each image may be input to a pre-trained neural network, and based on the processing of each image by the neural network, a corresponding moving object may be identified, thereby determining the position of the moving object in each image.
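The disclosure leaves the detector open, mentioning machine-vision features and pre-trained neural networks. Purely as an illustration of the idea of recovering the target's position from an image, a toy background-difference detector (not part of the patent) might look like:

```python
def target_position(image, background, threshold=64):
    """Hypothetical detector sketch: locate the moving target as the centroid
    of pixels that differ from a static background by more than a threshold.
    image/background are 2-D lists of pixel intensities; returns (x, y),
    or None when no pixel exceeds the threshold."""
    xs, ys = [], []
    for y, (row, bg_row) in enumerate(zip(image, background)):
        for x, (p, b) in enumerate(zip(row, bg_row)):
            if abs(p - b) > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)
```

In practice a trained neural network or corner detector, as described above, would replace this toy function; the rest of the pipeline only needs the returned position.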
Step 103: and determining the delay time between the at least two images according to the positions of the same moving object in the at least two images and the predetermined motion state information of the same moving object.
In the embodiment of the present disclosure, the motion state information of the same moving object may be motion speed information, motion rate information, motion position information, and the like; in some embodiments, the manner of movement of the moving object may be specified so that the speed, velocity, or position of the moving object at each time instance may be determined.
In one example, the target may be specified to perform uniform motion at a preset rate, where the uniform motion means a constant-rate motion in which the motion direction may be consistent or changed; for example, the uniform motion may be uniform circular motion, uniform linear motion, or other uniform motion along a regular route.
In another example, the target may be specified to move at a preset rate of change.
For example, the target may be driven to move by a motor or the like, so that the motor may drive the target to move according to a specified movement manner by issuing a corresponding driving instruction to the motor.
In some embodiments, in the case that the number of the at least two images is 2, the delay time between the two images may be determined; in the case that the number is greater than 2, the delay time between the at least two images refers to the delay time between each pair of images among them.
Step 104: and according to the delay time, performing time synchronization processing on at least two image acquisition devices.
In the embodiment of the present disclosure, time synchronization processing may be performed between the at least two image capturing devices according to the delay time, so that time consistency of images subsequently captured by the at least two image capturing devices can be maintained, that is, time synchronization of images subsequently acquired from the at least two image capturing devices can be maintained.
Illustratively, the image captured by the image capturing device E lags behind the image captured by the image capturing device F, and the delay time of the image captured by the image capturing device E with respect to the image captured by the image capturing device F is 30 μ s, the image subsequently captured by the image capturing device F may be delayed by 30 μ s, so as to achieve time synchronization of the image capturing device E and the image capturing device F.
Illustratively, the image captured by the image capturing device P precedes the image captured by the image capturing device Q, and the delay time of the image captured by the image capturing device Q relative to the image captured by the image capturing device P is 40 μ s, then the image subsequently captured by the image capturing device P may be delayed by 40 μ s, so as to achieve time synchronization of the image capturing device P and the image capturing device Q.
In practical applications, the steps 101 to 104 may be implemented by a Processor in an electronic Device, where the Processor may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor.
It can be seen that, according to the embodiments of the present disclosure, the delay time between at least two images can be determined from the position difference of the same moving target in the at least two images and the predetermined motion state information of that target, thereby achieving time synchronization of the at least two image acquisition devices. Because the image delay between different image acquisition devices no longer needs to be determined by counting video frames or by human-eye observation, the delay time between the at least two images can be determined quickly and accurately, which improves the efficiency of measuring the image delay between different image acquisition devices. Compared with the related-art schemes based on frame counting or human-eye observation, the method is accurate and easy to implement.
In some embodiments, the same moving target may be in uniform motion, and the motion state information describes this uniform motion.
It can be understood that, because the motion law of the uniform motion is simple, the delay time between at least two images is easy to determine according to the motion state information of the object moving at the uniform motion.
In some embodiments, the obtaining motion state information of the same moving object in uniform motion may include: and acquiring first speed information of the same moving target in uniform circular motion, and taking the first speed information as the motion state information.
Illustratively, the first rate information is preset rate information, and it can be seen that, in the embodiment of the present disclosure, by controlling the moving object to perform uniform circular motion at the first rate, it is convenient to determine the moving state information of the moving object before performing time synchronization on the at least two image capturing devices.
In some embodiments, the determining the delay time between the at least two images according to the position of the same moving object in the at least two images and the predetermined motion state information of the same moving object may include:
determining the polar angle of the same moving object in each image of the at least two images according to the position of the same moving object in the at least two images, wherein the polar angle of the same moving object in each image represents: an included angle between a connecting line of motion center points of uniform circular motion from the same moving object in each image and a reference direction;
determining a polar angle difference of two polar angles according to the two polar angles of any two images of the same moving target in at least two images;
a delay time between at least two images is determined based on the polar angle difference and the first rate information.
In the embodiment of the present disclosure, the reference direction may represent a direction of a line passing through a motion center point of the preset uniform circular motion, and in a polar coordinate system, the reference direction may represent a direction of a polar axis.
Exemplarily, referring to fig. 2A, a connection line between the first moving object 201 and a moving center point of the uniform circular motion in the image acquired by the image acquisition device a1 is a first connection line 203, and an included angle between the first connection line 203 and the polar axis 202 is 36 degrees; referring to fig. 2B, a connection line between the first moving object 201 and the moving center point of the uniform circular motion in the image acquired by the image acquisition device a2 is a second connection line 204, and an included angle between the second connection line 204 and the polar axis 202 is 74 degrees, so that polar angles of the first moving object 201 in the two images are 36 degrees and 74 degrees, respectively, and thus, it can be determined that a polar angle difference of the first moving object 201 in the two images is 38 degrees.
In some embodiments, the delay time between any two images may be calculated according to the following equation:

Δt = θ / (360 × r)    (1)

where Δt represents the delay time between the two images, θ represents the polar angle difference (in degrees) of the positions of the moving target in the two images relative to the motion center point of the uniform circular motion, and r represents the rotation speed in revolutions per unit time; with r in revolutions per second, Δt is obtained in seconds.
It is understood that if the time of one rotation of the moving object is less than the time delay of any two images, the polar angle difference cannot accurately reflect the time delay of the two images.
To calculate the polar angle difference accurately and reliably, the time taken by the moving target to complete one rotation needs to be greater than or equal to the delay between any two images. For example, the range of the delay between any two images can be estimated from historical data, and the rotation speed of the uniform circular motion can then be set so that one rotation takes at least as long as that delay.
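The constraint above, that one rotation period must be at least as long as the largest expected delay, can be sketched as follows (assuming the rotation speed is expressed in revolutions per second; the function names are illustrative):

```python
def rotation_period(revs_per_second):
    """Time (seconds) for one full rotation of the moving target."""
    return 1.0 / revs_per_second

def speed_is_valid(revs_per_second, estimated_max_delay):
    """True when one rotation takes at least as long as the largest delay
    estimated from historical data, so the polar angle difference does
    not wrap around and under-report the true delay."""
    return rotation_period(revs_per_second) >= estimated_max_delay
```

For instance, at 1000 revolutions per second one rotation takes 1 millisecond, which is valid for delays up to 1 millisecond but not for a 2 millisecond delay.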
It can be seen that the embodiment of the disclosure can determine the delay time between at least two images according to the polar angle difference and the first rate information, and has the characteristics of accuracy and high efficiency compared with a scheme of determining the image delay between different image acquisition devices by a video image frame counting or human eye observation method in the related art.
In some embodiments, the at least two image acquisition devices include an RGB camera, an infrared camera, and a thermal infrared imager. Before image synchronization, a background heat source may be turned on, and a turntable capable of uniform circular motion may be placed in front of the background heat source, so that the RGB camera, the infrared camera, and the thermal infrared imager can directly capture the turntable. In the embodiment of the present disclosure, referring to fig. 3A to 3C, a pointer 302 indicating the rotation angle is disposed on the turntable 301, and the turntable 301 may further bear scales to facilitate reading the rotation angle of the pointer 302; here, the pointer 302 is the same moving target.
While the turntable 301 is controlled to rotate uniformly at a set rotation speed, the RGB camera, the infrared camera, and the thermal infrared imager are controlled to capture images, where the image captured by the RGB camera is an RGB image, the image captured by the infrared camera is an infrared image, and the image captured by the thermal infrared imager is a thermodynamic diagram; for example, the set rotation speed may be 1000 revolutions per second, that is, one revolution of the turntable takes 1 millisecond.
The position of the pointer 302 in the RGB image is determined by performing target recognition and corner detection on the RGB image; referring to fig. 3A, the pointer 302 is at a position rotated by 36 degrees from the due north direction. The position of the pointer 302 in the infrared image is determined by performing target recognition and corner detection on the infrared image; referring to fig. 3B, the pointer 302 is at a position rotated by 54 degrees from the due north direction. Target recognition is performed on the thermodynamic diagram to determine the position of the pointer 302 therein; referring to fig. 3C, the pointer 302 is at a position rotated by 72 degrees from the due north direction.
After determining the position of the pointer 302 in the RGB image, the infrared image, and the thermodynamic diagram, the polar angle difference of the pointer may be determined between two of the RGB image, the infrared image, and the thermodynamic diagram, and further, the delay time between two of the RGB image, the infrared image, and the thermodynamic diagram may be determined.
Illustratively, referring to fig. 3A to 3C, the polar angle difference of the pointer in the RGB image and the infrared image with respect to the motion center point is -18 degrees; with the set rotation speed of 1000 revolutions per second, it can be determined according to formula (1) that the delay time of the RGB image relative to the infrared image is -50 microseconds, that is, each frame image of the RGB camera is 50 microseconds slower than that of the infrared camera.
The polar angle difference of the pointer in the RGB image and the thermodynamic diagram with respect to the motion center point is -36 degrees; with the set rotation speed of 1000 revolutions per second, it can be determined according to formula (1) that the delay time of the RGB image relative to the thermodynamic diagram is -100 microseconds, that is, each frame image of the RGB camera is 100 microseconds slower than that of the thermal infrared imager.
The polar angle difference of the pointer in the infrared image and the thermodynamic diagram with respect to the motion center point is -18 degrees; with the set rotation speed of 1000 revolutions per second, it can be determined according to formula (1) that the delay time of the infrared image relative to the thermodynamic diagram is -50 microseconds, that is, each frame image of the infrared camera is 50 microseconds slower than that of the thermal infrared imager.
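For illustration only, the worked example above can be reproduced in a short script. The sketch assumes that formula (1) has the form Δt = Δθ / (360 · f) for a turntable rotating at f revolutions per second, which is consistent with the numbers in the example; the function name and code structure are ours, not the disclosure's.

```python
def delay_from_polar_angles(theta_a_deg, theta_b_deg, revs_per_second):
    """Delay of image A relative to image B in seconds, per the assumed
    form of formula (1): delta_t = delta_theta / (360 * f)."""
    return (theta_a_deg - theta_b_deg) / (360.0 * revs_per_second)

# Pointer angles read from figs. 3A-3C (degrees from due north).
angles = {"rgb": 36.0, "infrared": 54.0, "thermal": 72.0}
f = 1000.0  # set rotation speed: 1000 revolutions per second (1 ms per revolution)

dt_rgb_ir = delay_from_polar_angles(angles["rgb"], angles["infrared"], f)          # -50 microseconds
dt_rgb_thermal = delay_from_polar_angles(angles["rgb"], angles["thermal"], f)      # -100 microseconds
dt_ir_thermal = delay_from_polar_angles(angles["infrared"], angles["thermal"], f)  # -50 microseconds
```

A negative result follows the convention of the example above: the first device's frames lag those of the second device by the stated amount.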
In some embodiments, second speed information of the same moving object in uniform linear motion may be obtained, and the second speed information may be used as the motion state information.
Illustratively, the second rate information is preset rate information. It can be seen that in the embodiment of the present disclosure, by controlling the moving object to move in a uniform straight line at the second rate, the motion state information of the moving object can be conveniently determined before time synchronization is performed on the at least two image acquisition devices.
In some embodiments, the determining the delay time between the at least two images according to the position of the same moving object in the at least two images and the predetermined motion state information of the same moving object may include:
determining the relative distance of the same moving target in any two images of the at least two images according to the positions of the same moving target in the at least two images;
a delay time between any two of the at least two images is determined based on the relative distance and the second rate information.
In some embodiments, the delay time between any two images may be calculated according to the following formula:
Δt = l / v (2)
where Δt represents the delay time between any two images, l represents the relative distance of the same moving object in the two images, and v represents the second rate.
In some embodiments, the at least two image capturing devices include an RGB camera, an infrared camera, and a thermal infrared imager, and accordingly, the at least two images are an RGB image, an infrared image, and a thermodynamic diagram. The distance between the moving object and the starting point in the RGB image is l1, the distance between the moving object and the starting point in the infrared image is l2, and the distance between the moving object and the starting point in the thermodynamic diagram is l3.
It can be seen that the relative distance of the moving object between the RGB image and the infrared image is l1 - l2, and the delay time between the RGB image and the infrared image can be calculated according to formula (2); the relative distance of the moving object between the RGB image and the thermodynamic diagram is l1 - l3, and the delay time between the RGB image and the thermodynamic diagram can be calculated according to formula (2); the relative distance of the moving object between the infrared image and the thermodynamic diagram is l2 - l3, and the delay time between the infrared image and the thermodynamic diagram can be calculated according to formula (2).
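As a sketch only, the linear-motion variant can be computed the same way from formula (2). The distance and speed values below are hypothetical sample inputs, not figures from the disclosure.

```python
def delay_from_distances(l_a, l_b, v):
    """Formula (2): delta_t = l / v, with l the relative distance of the same
    moving object in the two images and v the second rate."""
    return (l_a - l_b) / v

# Hypothetical measurements: distance of the moving object from its starting
# point in each image, in metres, and the preset uniform linear speed.
l1, l2, l3 = 0.30, 0.25, 0.20  # RGB image, infrared image, thermodynamic diagram
v = 2.0                        # second rate, metres per second

dt_rgb_ir = delay_from_distances(l1, l2, v)       # (l1 - l2) / v = 0.025 s
dt_rgb_thermal = delay_from_distances(l1, l3, v)  # (l1 - l3) / v = 0.05 s
dt_ir_thermal = delay_from_distances(l2, l3, v)   # (l2 - l3) / v = 0.025 s
```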
It can be seen that the embodiments of the present disclosure can determine the delay time between at least two images according to the relative distance and the second rate information. Compared with schemes in the related art that determine the image delay between different image acquisition devices by counting video frames or by human observation, this approach is both accurate and efficient.
In some embodiments, the movement speed change information of the same moving object may be acquired, and the movement speed change information may be used as the movement state information.
For example, the movement pattern of the moving object at each time may be specified, so that the speed of the moving object at each time may be determined in advance, that is, the movement speed variation information of the moving object may be determined.
It can be seen that, in the embodiment of the present disclosure, by controlling the moving object to move with reference to the preset moving speed change information, it is convenient to determine the moving state information of the moving object before performing time synchronization on the at least two image capturing devices, and thus, it is beneficial to determine the delay time of the at least two images.
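One way to use a varying but pre-specified speed is to invert the known position-time profile: each observed position is mapped back to a capture time, and the delay is the difference of the recovered times. The sketch below assumes a sampled, monotonically increasing profile; the profile, positions, and names are all illustrative, not taken from the disclosure.

```python
from bisect import bisect_left

# Pre-specified motion pattern, sampled as (time, position) pairs; here a
# constant acceleration of 3 m/s^2 over one second, purely for illustration.
profile_t = [i * 0.001 for i in range(1001)]
profile_s = [0.5 * 3.0 * t * t for t in profile_t]

def time_at_position(s):
    """Invert the known position profile by linear interpolation."""
    i = bisect_left(profile_s, s)
    if i == 0:
        return profile_t[0]
    t0, t1 = profile_t[i - 1], profile_t[i]
    s0, s1 = profile_s[i - 1], profile_s[i]
    return t0 + (s - s0) * (t1 - t0) / (s1 - s0)

# Illustrative positions of the same moving target observed in two images.
t_a = time_at_position(0.240)   # capture time recovered for image A (about 0.4 s)
t_b = time_at_position(0.2535)  # capture time recovered for image B
delay = t_a - t_b               # delay of image A relative to image B, seconds
```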
In some embodiments, where the at least two image acquisition devices include a thermal imager, the moving target may include a hollowed-out portion. Before image synchronization, a background heat source may be turned on and the moving target placed in front of it; because the moving target includes a hollowed-out portion, its position can be observed intuitively in the images collected by the thermal imager.
Illustratively, when the moving target is located on the turntable, hollowed scale marks can be added on the turntable, so that the position of the moving target can be observed visually in an image collected by the thermal imager.
In scenarios such as access-control passages, security monitoring, and rapid temperature measurement, a multi-view camera system may be integrated into an electronic device to obtain RGB images, infrared images, and thermodynamic diagrams. Time delays exist among the RGB images, infrared images, and thermodynamic diagrams, and for rapidly moving objects these delays cause binocular ranging discrepancies and abnormal target mapping, which in turn lead to anomalies in temperature measurement, liveness detection, and distance/position estimation. The delays among the images acquired by the multi-view camera therefore need to be determined and handled. To address this problem, in the embodiments of the present disclosure, the delay time of at least two images is determined according to the position difference of the same moving target in the at least two images and the predetermined motion state information of the same moving target, thereby achieving time synchronization of the at least two image acquisition devices.
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
On the basis of the image synchronization method provided by the foregoing embodiment, the embodiment of the present disclosure provides an image synchronization apparatus.
Fig. 4 is a schematic diagram illustrating a configuration of an image synchronization apparatus according to an embodiment of the disclosure, and as shown in fig. 4, the apparatus may include an obtaining module 400, a first processing module 401, and a second processing module 402, wherein,
an obtaining module 400, configured to obtain at least two images acquired by at least two image acquisition devices, where an image acquisition device corresponding to each of the at least two images is different;
a first processing module 401, configured to perform target identification on at least two images, and determine positions of a same moving target in the at least two images; determining the delay time between at least two images according to the positions of the same moving target in at least two images and the predetermined motion state information of the same moving target;
and a second processing module 402, configured to perform time synchronization processing on the at least two image capturing devices according to the delay time.
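For illustration, the three-module split of fig. 4 maps naturally onto a small pipeline. The sketch below stubs out target identification, since the disclosure leaves the recognition method open, and uses the polar-angle variant of the delay computation; all class and function names are ours, not the disclosure's.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    device_id: str
    image: object  # raw pixel data; format depends on the acquisition device

def synchronize(frames, locate_target, revs_per_second):
    """Obtaining module -> first processing module -> pairwise delays.

    locate_target stands in for target identification: it returns the polar
    angle (degrees) of the same moving target in a frame.  The returned
    pairwise delays (seconds) feed the second processing module, which would
    apply them to time-synchronize the acquisition devices.
    """
    angles = {dev: locate_target(f) for dev, f in frames.items()}
    devices = sorted(angles)
    delays = {}
    for i, a in enumerate(devices):
        for b in devices[i + 1:]:
            delays[(a, b)] = (angles[a] - angles[b]) / (360.0 * revs_per_second)
    return delays

# Usage with stubbed detections matching the worked example in the text.
frames = {d: Frame(d, None) for d in ("rgb", "infrared", "thermal")}
stub_angles = {"rgb": 36.0, "infrared": 54.0, "thermal": 72.0}
delays = synchronize(frames, lambda f: stub_angles[f.device_id], 1000.0)
# e.g. delays[("rgb", "thermal")] is -100 microseconds
```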
In some embodiments, the first processing module 401 is further configured to obtain motion state information of the same moving object in uniform motion.
In some embodiments, the first processing module 401 is configured to obtain motion state information of the same moving object in uniform motion, and includes:
and acquiring first speed information of the same moving target in uniform circular motion, and taking the first speed information as motion state information.
In some embodiments, the first processing module 401 is configured to determine a delay time between at least two images according to a position of the same moving object in the at least two images and predetermined motion state information of the same moving object, and includes:
determining the polar angle of the same moving object in each image of the at least two images according to the position of the same moving object in the at least two images, wherein the polar angle of the same moving object in each image represents: an included angle between a connecting line of motion center points of uniform circular motion from the same moving object in each image and a reference direction;
determining a polar angle difference of two polar angles according to the two polar angles of any two images of the same moving target in at least two images;
a delay time between any two of the at least two images is determined based on the polar angle difference and the first rate information.
In some embodiments, the first processing module 401 is configured to obtain motion state information of the same moving object in uniform motion, and includes:
and acquiring second speed information of the same moving object in uniform linear motion, and taking the second speed information as motion state information.
In some embodiments, the first processing module 401 is configured to determine a delay time between at least two images according to a position of the same moving object in the at least two images and predetermined motion state information of the same moving object, and includes:
determining the relative distance of the same moving target in any two images of the at least two images according to the positions of the same moving target in the at least two images;
a delay time between any two of the at least two images is determined based on the relative distance and the second rate information.
In some embodiments, the first processing module 401 is further configured to obtain movement speed variation information of the same moving object, and use the movement speed variation information as the movement state information.
In some embodiments, the at least two image capture devices comprise thermal imagers and the moving object comprises a hollowed-out portion.
In practical applications, the obtaining module 400, the first processing module 401, and the second processing module 402 may all be implemented by a processor in an electronic device, where the processor may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor.
In addition, each functional module in this embodiment may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware or a form of a software functional module.
Based on such understanding, the technical solution of this embodiment, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the method of this embodiment. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Specifically, the computer program instructions corresponding to the image synchronization method in this embodiment may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive; when the computer program instructions corresponding to the image synchronization method in the storage medium are read and executed by an electronic device, any one of the image synchronization methods of the foregoing embodiments is implemented.
Based on the same technical concept of the foregoing embodiment, referring to fig. 5, it shows an electronic device 5 provided by the embodiment of the present disclosure, which may include: a memory 501 and a processor 502; wherein,
the memory 501 is used for storing computer programs and data;
the processor 502 is configured to execute the computer program stored in the memory to implement any one of the image synchronization methods of the foregoing embodiments.
In practical applications, the memory 501 may be a volatile memory (volatile memory), such as a RAM; or a non-volatile memory (non-volatile memory) such as a ROM, a flash memory (flash memory), a Hard Disk (Hard Disk Drive, HDD) or a Solid-State Drive (SSD); or a combination of the above types of memories and provides instructions and data to the processor 502.
The processor 502 may be at least one of ASIC, DSP, DSPD, PLD, FPGA, CPU, controller, microcontroller, and microprocessor. It is understood that the electronic devices for implementing the above-described processor functions may be other devices, and the embodiments of the present disclosure are not particularly limited.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
The foregoing description of the various embodiments is intended to highlight the differences between the embodiments; the same or similar parts may be referred to among them and, for brevity, are not repeated herein.
The methods disclosed in the method embodiments provided by the present disclosure may be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in the various product embodiments provided by the disclosure may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the various method or apparatus embodiments provided by the present disclosure may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present disclosure.
While the embodiments of the present disclosure have been described in connection with the drawings, the present disclosure is not limited to the specific embodiments described above, which are intended to be illustrative rather than limiting, and it will be apparent to those of ordinary skill in the art in light of the present disclosure that many more modifications can be made without departing from the spirit of the disclosure and the scope of the appended claims.

Claims (11)

1. An image synchronization method, characterized in that the method comprises:
acquiring at least two images acquired by at least two image acquisition devices, wherein the image acquisition devices corresponding to the images in the at least two images are different;
carrying out target identification on the at least two images, and determining the positions of the same moving target in the at least two images;
determining the delay time between the at least two images according to the positions of the same moving target in the at least two images and the predetermined motion state information of the same moving target;
and according to the delay time, performing time synchronization processing on the at least two image acquisition devices.
2. The method of claim 1, further comprising:
and acquiring the motion state information of the same motion target in uniform motion.
3. The method according to claim 2, wherein the obtaining the motion state information of the same moving object in uniform motion comprises:
and acquiring first speed information of the same moving target in uniform circular motion, and taking the first speed information as the motion state information.
4. The method according to claim 3, wherein the determining the delay time between the at least two images according to the position of the same moving object in the at least two images and the predetermined motion state information of the same moving object comprises:
determining a polar angle of the same moving object in each image of the at least two images according to the position of the same moving object in the at least two images, wherein the polar angle of the same moving object in each image represents: an included angle between a connecting line of motion center points of uniform circular motion from the same moving object in each image and a reference direction;
determining a polar angle difference of two polar angles according to the two polar angles of any two images of the same moving target in the at least two images;
determining a delay time between any two images of the at least two images according to the polar angle difference and the first rate information.
5. The method according to claim 2, wherein the obtaining the motion state information of the same moving object in uniform motion comprises:
and acquiring second speed information of the same moving object in uniform linear motion, and taking the second speed information as the motion state information.
6. The method according to claim 5, wherein the determining the delay time between the at least two images according to the position of the same moving object in the at least two images and the predetermined motion state information of the same moving object comprises:
determining the relative distance between any two images of the same moving target in the at least two images according to the positions of the same moving target in the at least two images;
and determining the delay time between any two images of the at least two images according to the relative distance and the second speed information.
7. The method of claim 1, further comprising:
and obtaining the movement speed change information of the same moving target, and taking the movement speed change information as the movement state information.
8. The method of any of claims 1 to 7, wherein the at least two image capture devices comprise thermal imagers and the moving object comprises a hollowed-out portion.
9. An image synchronization apparatus, characterized in that the apparatus comprises:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring at least two images acquired by at least two image acquisition devices, and the image acquisition devices corresponding to the images in the at least two images are different;
the first processing module is used for carrying out target identification on the at least two images and determining the positions of the same moving target in the at least two images; determining the delay time between the at least two images according to the positions of the same moving target in the at least two images and the predetermined motion state information of the same moving target;
and the second processing module is used for carrying out time synchronization processing on the at least two image acquisition devices according to the delay time.
10. An electronic device comprising a processor and a memory for storing a computer program operable on the processor; wherein,
the processor is configured to run the computer program to perform the method of any one of claims 1 to 8.
11. A computer storage medium on which a computer program is stored, characterized in that the computer program realizes the method of any one of claims 1 to 8 when executed by a processor.
CN202110382446.5A 2021-04-09 2021-04-09 Image synchronization method, device, electronic equipment and computer storage medium Active CN113177440B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110382446.5A CN113177440B (en) 2021-04-09 2021-04-09 Image synchronization method, device, electronic equipment and computer storage medium
PCT/CN2022/083308 WO2022213833A1 (en) 2021-04-09 2022-03-28 Method and apparatus for image synchronization, electronic device, and computer storage medium
TW111112467A TW202240462A (en) 2021-04-09 2022-03-31 Methods, apparatuses, electronic devices and computer storage media for image synchronization

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110382446.5A CN113177440B (en) 2021-04-09 2021-04-09 Image synchronization method, device, electronic equipment and computer storage medium

Publications (2)

Publication Number Publication Date
CN113177440A true CN113177440A (en) 2021-07-27
CN113177440B CN113177440B (en) 2024-10-29

Family

ID=76924699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110382446.5A Active CN113177440B (en) 2021-04-09 2021-04-09 Image synchronization method, device, electronic equipment and computer storage medium

Country Status (3)

Country Link
CN (1) CN113177440B (en)
TW (1) TW202240462A (en)
WO (1) WO2022213833A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113923317A (en) * 2021-08-30 2022-01-11 珠海视熙科技有限公司 Camera frame synchronization test method, device and storage medium
WO2022213833A1 (en) * 2021-04-09 2022-10-13 上海商汤智能科技有限公司 Method and apparatus for image synchronization, electronic device, and computer storage medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101271567A (en) * 2007-03-20 2008-09-24 凌阳科技股份有限公司 Image comparison method and system
CN106233722A (en) * 2014-03-20 2016-12-14 高途乐公司 The automatic alignment of the imageing sensor in multicamera system
CN106303495A (en) * 2015-06-30 2017-01-04 深圳创锐思科技有限公司 The synthetic method of panoramic stereo image, device and mobile terminal thereof
US20170078646A1 (en) * 2014-06-20 2017-03-16 Panasonic Intellectual Property Management Co., Ltd. Image processing method and image processing system
WO2018153211A1 (en) * 2017-02-22 2018-08-30 中兴通讯股份有限公司 Method and apparatus for obtaining traffic condition information, and computer storage medium
CN109194436A (en) * 2018-11-01 2019-01-11 百度在线网络技术(北京)有限公司 Sensor time stabs synchronous detecting method, device, equipment, medium and vehicle
US20190199890A1 (en) * 2017-12-22 2019-06-27 Canon Kabushiki Kaisha Image capturing apparatus capable of time code synchronization, control method of the same, storage medium, and image capturing system
WO2019237992A1 (en) * 2018-06-15 2019-12-19 Oppo广东移动通信有限公司 Photographing method and device, terminal and computer readable storage medium
CN110751685A (en) * 2019-10-21 2020-02-04 广州小鹏汽车科技有限公司 Depth information determination method, determination device, electronic device and vehicle
CN111815675A (en) * 2020-06-30 2020-10-23 北京市商汤科技开发有限公司 Target object tracking method and device, electronic equipment and storage medium
CN111985300A (en) * 2020-06-29 2020-11-24 魔门塔(苏州)科技有限公司 Automatic driving dynamic target positioning method and device, electronic equipment and storage medium
CN112113582A (en) * 2019-06-21 2020-12-22 上海商汤临港智能科技有限公司 Time synchronization processing method, electronic device, and storage medium
CN112543261A (en) * 2020-12-08 2021-03-23 浙江大华技术股份有限公司 Image quality improving method and device and computer readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130120528A1 (en) * 2011-01-09 2013-05-16 Thomson Licensing Video processing apparatus and method for detecting a temporal synchronization mismatch
US11132533B2 (en) * 2017-06-07 2021-09-28 David Scott Dreessen Systems and methods for creating target motion, capturing motion, analyzing motion, and improving motion
CN111208963B (en) * 2020-01-10 2022-10-18 井冈山电器有限公司 Video synchronous display method and system
CN113177440B (en) * 2021-04-09 2024-10-29 上海元罗卜智能科技有限公司 Image synchronization method, device, electronic equipment and computer storage medium

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101271567A (en) * 2007-03-20 2008-09-24 凌阳科技股份有限公司 Image comparison method and system
CN106233722A (en) * 2014-03-20 2016-12-14 高途乐公司 The automatic alignment of the imageing sensor in multicamera system
US20170078646A1 (en) * 2014-06-20 2017-03-16 Panasonic Intellectual Property Management Co., Ltd. Image processing method and image processing system
CN106303495A (en) * 2015-06-30 2017-01-04 深圳创锐思科技有限公司 The synthetic method of panoramic stereo image, device and mobile terminal thereof
WO2018153211A1 (en) * 2017-02-22 2018-08-30 中兴通讯股份有限公司 Method and apparatus for obtaining traffic condition information, and computer storage medium
US20190199890A1 (en) * 2017-12-22 2019-06-27 Canon Kabushiki Kaisha Image capturing apparatus capable of time code synchronization, control method of the same, storage medium, and image capturing system
WO2019237992A1 (en) * 2018-06-15 2019-12-19 Oppo广东移动通信有限公司 Photographing method and device, terminal and computer readable storage medium
CN109194436A (en) * 2018-11-01 2019-01-11 百度在线网络技术(北京)有限公司 Sensor time stabs synchronous detecting method, device, equipment, medium and vehicle
CN112113582A (en) * 2019-06-21 2020-12-22 上海商汤临港智能科技有限公司 Time synchronization processing method, electronic device, and storage medium
CN110751685A (en) * 2019-10-21 2020-02-04 广州小鹏汽车科技有限公司 Depth information determination method, determination device, electronic device and vehicle
CN111985300A (en) * 2020-06-29 2020-11-24 魔门塔(苏州)科技有限公司 Automatic driving dynamic target positioning method and device, electronic equipment and storage medium
CN111815675A (en) * 2020-06-30 2020-10-23 北京市商汤科技开发有限公司 Target object tracking method and device, electronic equipment and storage medium
CN112543261A (en) * 2020-12-08 2021-03-23 浙江大华技术股份有限公司 Image quality improving method and device and computer readable storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
YI-TING CHEN等: "An FPGA Implementation of a Robot Control System with an Integrated 3D Vision System", SMART SCIENCE, vol. 3, no. 2, 4 January 2016 (2016-01-04), pages 100 - 107 *
XU JIN ET AL.: "University Physics, Volume I", 31 March 2020, Chongqing University Press, pages 1-6 *
CHEN YONG: "Research on Simultaneous Localization and Mapping Algorithms for Mobile Robots Based on Binocular Vision", China Master's Theses Full-text Database, Information Science and Technology, vol. 2020, 15 July 2020 (2020-07-15), pages 138-1072 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022213833A1 (en) * 2021-04-09 2022-10-13 上海商汤智能科技有限公司 Method and apparatus for image synchronization, electronic device, and computer storage medium
CN113923317A (en) * 2021-08-30 2022-01-11 珠海视熙科技有限公司 Camera frame synchronization test method, device and storage medium
CN113923317B (en) * 2021-08-30 2022-11-15 珠海视熙科技有限公司 Camera frame synchronization test method, device and storage medium

Also Published As

Publication number Publication date
WO2022213833A1 (en) 2022-10-13
CN113177440B (en) 2024-10-29
TW202240462A (en) 2022-10-16

Similar Documents

Publication Publication Date Title
CN106657779B (en) Surrounding shooting method and device and unmanned aerial vehicle
EP3028252B1 (en) Rolling sequential bundle adjustment
EP3089449B1 (en) Method for obtaining light-field data using a non-light-field imaging device, corresponding device, computer program product and non-transitory computer-readable carrier medium
WO2022213833A1 (en) Method and apparatus for image synchronization, electronic device, and computer storage medium
CN103841374B (en) Display method and system for video monitoring image
CN112166459A (en) Three-dimensional environment modeling based on multi-camera convolver system
CN110268445A (en) It is calibrated automatically using the camera of gyroscope
KR101530255B1 (en) Cctv system having auto tracking function of moving target
CN111093050B (en) Target monitoring method and device
CN111768486B (en) Monocular camera three-dimensional reconstruction method and system based on rotating refraction sheet
CN109696158A (en) Distance measurement method, distance-measuring device and electronic equipment
CN111488835B (en) Identification method and device for staff
CN103797357A (en) Trigger for blade imaging based on a controller
CN110910459A (en) Camera device calibration method and device and calibration equipment
CN111047622B (en) Method and device for matching objects in video, storage medium and electronic device
CN111213365A (en) Shooting control method and controller
CN112514366A (en) Image processing method, image processing apparatus, and image processing system
WO2020255628A1 (en) Image processing device, and image processing program
CN113936042B (en) Target tracking method and device and computer readable storage medium
CN112272267A (en) Shooting control method, shooting control device and electronic equipment
CN108298101B (en) Cloud deck rotation control method and device and unmanned aerial vehicle
EP3367353B1 (en) Control method of a ptz camera, associated computer program product and control device
CN114125303B (en) Test image acquisition method, device, equipment and medium
CN111433819A (en) Target scene three-dimensional reconstruction method and system and unmanned aerial vehicle
CN109374919B (en) Method and device for determining moving speed based on single shooting device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40049355

Country of ref document: HK

TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20240416

Address after: 200030, Units 6-77, 6th Floor, No. 1900 Hongmei Road, Xuhui District, Shanghai

Applicant after: Shanghai Yuanluobu Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: 518000 Room 201, building A, 1 front Bay Road, Shenzhen Qianhai cooperation zone, Shenzhen, Guangdong

Applicant before: SHENZHEN SENSETIME TECHNOLOGY Co.,Ltd.

Country or region before: China

GR01 Patent grant