CN113973179A - Method, device, equipment and medium for controlling image output time sequence - Google Patents


Info

Publication number
CN113973179A
CN113973179A
Authority
CN
China
Prior art keywords
image sensing
image
sensing modules
processing module
time
Prior art date
Legal status
Pending
Application number
CN202111250085.5A
Other languages
Chinese (zh)
Inventor
卓康
何云
王俊杰
宋博
张建国
Current Assignee
Chengdu Image Design Technology Co Ltd
Original Assignee
Chengdu Image Design Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Image Design Technology Co Ltd filed Critical Chengdu Image Design Technology Co Ltd
Priority to CN202111250085.5A priority Critical patent/CN113973179A/en
Publication of CN113973179A publication Critical patent/CN113973179A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides a method, a device, equipment, and a medium for controlling an image output time sequence. The method is applicable to an electronic device having an image processing module and N image sensing modules, and comprises the following steps: the image processing module sends a synchronization signal to the N image sensing modules; and upon receiving the rising edge of the synchronization signal, each of the N image sensing modules outputs an image to the image processing module after a fixed time length. This timing control method constrains the time deviation among the images output when the plurality of image sensing modules operate simultaneously, thereby reducing the storage space of the image processing module consumed by buffered data.

Description

Method, device, equipment and medium for controlling image output time sequence
Technical Field
The present invention relates to the field of semiconductor integrated circuit design, and in particular, to a method, an apparatus, a device, and a medium for controlling an image output timing.
Background
Video fusion is a current development direction of virtual reality technology and is widely applied in driver-assistance systems and monitoring systems. In driver assistance and safety monitoring, video fusion can reconstruct a three-dimensional scene in real time, increase the interactivity between the scene model and the real scene, reduce uncertainty in the system, and greatly improve its accuracy.
The key requirement of video fusion is that the time offset between the images to be fused must be within a controllable range and correctable in real time, which poses a new challenge for the design of image sensing modules. In a real scene, the image sensing modules often use different exposure durations because each module is exposed to a different ambient brightness. As a result, even if all image sensing modules are started simultaneously, the deviation between their output images can reach the maximum exposure time, which may equal or even exceed the duration of one image frame.
In general, the images output by the image sensing modules are fused by the image processing module. The maximum deviation time among the images determines the size of the pre-buffer in the image processing module; an excessively large maximum deviation inflates the amount of pre-buffered data and thus heavily consumes the storage space of the image processing module.
For this reason, a method, apparatus, device, and medium for controlling an image output timing are needed to improve the above-described problems.
Disclosure of Invention
The object of the invention is to provide a method, a device, equipment, and a medium for controlling an image output time sequence. The method constrains the time deviation among the images output when a plurality of image sensing modules operate simultaneously, thereby reducing the storage space of the image processing module consumed by buffered data.
In a first aspect, the present invention provides a method for controlling an image output timing, applied to an electronic device having an image processing module and N image sensing modules, where N is a positive integer greater than 1. The method includes: when the N image sensing modules operate simultaneously, the image processing module sends enable signals to the N image sensing modules, where the exposure time parameters of the N image sensing modules are mutually independent; and upon receiving the enable signals, the N image sensing modules output images to the image processing module after a fixed time length.
The method for controlling the output timing of the image sensing modules has the following beneficial effect: the timing control constrains the time deviation among the images output by the simultaneously operating image sensing modules and increases the stability of the fused image.
In a possible embodiment, the image processing module receives the images respectively output by the N image sensing modules; the image processing module calculates the maximum time offset among the images output by the N image sensing modules; and when the maximum time offset is greater than or equal to a set threshold, the image processing module returns to re-send the enable signals to the N image sensing modules. The beneficial effect is that the time offset between the images output by the N image sensing modules is always kept within a certain range, improving the stability of video fusion.
In a possible embodiment, when the N image sensing modules operate simultaneously, the method further includes: and performing clock synchronization on the N image sensing modules and the image processing module. The beneficial effects are that: clock skew between the N image sensing modules and the image processing module is reduced by clock synchronization.
In one possible embodiment, the time parameters of the exposures of the N image sensing modules include an exposure start time and an exposure time length. The beneficial effects are that: the time deviation of the output images of the image sensing modules can be adjusted conveniently by controlling the time parameters of the exposure.
In one possible embodiment, the N image sensing modules include a first image sensing module, a second image sensing module, through an nth image sensing module;
When the N image sensing modules receive the enable signals, they output images to the image processing module after a fixed time length FH, where FH comprises an exposure time and a delay time. The image processing module first sends a configuration signal to each of the N image sensing modules; the configuration signal includes the exposure time parameters and controls the exposure time of that module. The first image sensing module receives the enable signal at time T1, which triggers it to start timing and to calculate a delay time t0, where t0 equals the fixed time length FH minus the exposure time t1; when the timer reaches t0, a register generates an exposure start signal and the first image sensing module starts exposure; when the exposure ends, the first image sensing module outputs an image to the image processing module. The second image sensing module receives the enable signal at time T2, which triggers it to start timing and to calculate a delay time t2, where t2 equals FH minus the exposure time t3; when the timer reaches t2, a register generates an exposure start signal and the second image sensing module starts exposure; when the exposure ends, the second image sensing module outputs an image to the image processing module. By analogy, the Nth image sensing module receives the enable signal at time TN, which triggers it to start timing and to calculate a delay time t(2N-2), where t(2N-2) equals FH minus the exposure time t(2N-1); when the timer reaches t(2N-2), a register generates an exposure start signal and the Nth image sensing module starts exposure; when the exposure ends, the Nth image sensing module outputs an image to the image processing module. The beneficial effect is that each image sensing module computes the delay it requires before exposure, so that the time deviation of the output images among the modules is constrained and the fused image is more stable.
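The per-module delay calculation described above can be sketched as follows. This is a minimal illustrative Python sketch, not part of the patent; the function name and the numeric values are assumptions chosen for demonstration.

```python
# Hypothetical sketch of the delay scheme: each module waits (FH - exposure)
# after the enable signal, so exposure ends -- and image output begins -- at
# the same fixed offset FH regardless of each module's own exposure time.

def delay_before_exposure(fixed_len_fh: float, exposure_time: float) -> float:
    """Return the delay a module should wait before starting exposure."""
    if exposure_time > fixed_len_fh:
        raise ValueError("exposure time must not exceed the fixed length FH")
    return fixed_len_fh - exposure_time

FH = 33.0  # fixed time length in ms (illustrative)
# Three modules with independent exposure times (ms):
for name, exposure in [("module1", 10.0), ("module2", 5.0), ("module3", 20.0)]:
    d = delay_before_exposure(FH, exposure)
    # delay + exposure = FH for every module, so all outputs align
    print(name, "delay:", d, "output at:", d + exposure)
```

Since delay plus exposure always sums to FH, every module's output moment sits at the same fixed offset from the instant it received the enable signal.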
In one possible embodiment, the maximum time offset among the images output by the N image sensing modules consists of the maximum offset among the times T1 to TN at which the N image sensing modules receive the enable signal, plus the maximum clock-difference offset among the N image sensing modules. The beneficial effect is that the time deviation of the images received by the image processing module can be determined by computing these two maxima.
In another possible embodiment, the image sensing module receives the enable signal through a pin, starts timing, and calculates a delay time equal to the fixed time length FH minus the exposure time; when the timer reaches the delay time, the image sensing module starts exposure; when the exposure ends, the image sensing module outputs an image to the image processing module. This embodiment omits the register inside the image sensing module, saving the resources a register would require, while still constraining the time deviation of the output images among the plurality of image sensing modules.
It should be noted that the number of image processing modules may be one or more, but only one image processing module performs any given function, which avoids conflict or loss of synchronization between different image processing modules.
In a second aspect, an embodiment of the present application further provides an apparatus for controlling an image output timing, where the apparatus includes a read-write control unit and a storage unit; the storage unit is used for storing a computer program; the read-write control unit is used for reading a computer program from the storage unit and controlling the image processing module to send enabling signals to the N image sensing modules when the N image sensing modules work simultaneously, wherein the exposure time parameters of the N image sensing modules are independent; and controlling the N image sensing modules to output images to the image processing module after a fixed time length when the N image sensing modules receive the enabling signals. The beneficial effects are that: the method is applied to different hardware devices.
In a possible embodiment, the read-write control unit is further configured to: control the image processing module to receive the images respectively output by the N image sensing modules; control the image processing module to calculate the maximum time offset among the images output by the N image sensing modules; and, when the maximum time offset is greater than or equal to a set threshold, control the image processing module to return to re-send the enable signals to the N image sensing modules. The beneficial effect is that the maximum time offset between the images output by the N image sensing modules can be limited within a threshold, improving the stability of video fusion.
In a possible embodiment, when the N image sensing modules operate simultaneously, the apparatus further includes a synchronization unit; and the synchronization unit is used for carrying out clock synchronization on the N image sensing modules and the image processing module. The beneficial effects are that: the N image sensing modules and the image processing module can be subjected to clock synchronization through the synchronization unit, and clock differences between the N image sensing modules and the image processing module can be reduced.
In one possible embodiment, the time parameters of the exposures of the N image sensing modules include an exposure start time and an exposure time length. The beneficial effects are that: the time shift amount of the output image between the image sensing modules can be reduced by adjusting the exposure start time and the exposure time length of each image sensing module.
In a third aspect, an embodiment of the present invention further provides an electronic device comprising a computer program, an image processing module, and N image sensing modules as described in any embodiment of the first aspect, where N is a positive integer greater than 1. The beneficial effect is that the computer program, when run on the electronic device, causes the electronic device to perform any one of the possible designed methods of any of the above aspects.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by an image processing module, causes an electronic device to execute any one of the possible design methods of any one of the aspects.
In a fifth aspect, the present invention further provides a computer program product which, when run on an electronic device, causes the electronic device to execute any one of the possible designs of any one of the above aspects.
For the remaining advantages of the second to fifth aspects, reference may be made to the description of the first aspect, and details will not be repeated.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device provided in the present invention;
FIG. 2 is a flowchart illustrating a method for controlling an image output timing sequence of an image sensor module according to the present invention;
FIG. 3 is a comparison diagram of the working timing sequences of three image sensing modules according to the present invention;
FIG. 4 is a schematic diagram illustrating an installation position of an image sensing module according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an apparatus for controlling an image output timing sequence according to the present invention;
fig. 6 is a schematic structural diagram of another electronic device provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention. Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this invention belongs. As used herein, the word "comprising" and similar words are intended to mean that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items.
In describing embodiments of the present invention, the terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, such as "one or more", unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of the present invention, "at least one" and "one or more" mean one, two, or more than two. The term "and/or" describes an association relationship between objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A alone, both A and B, or B alone, where A and B may each be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present invention. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless otherwise noted. "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations or descriptions. Any embodiment or design described as "exemplary" or "e.g.," an embodiment of the present invention is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
As shown in fig. 1, the present invention provides an electronic device that includes an image processing module and N image sensing modules, where N is a positive integer greater than 1. A buffer disposed in the image processing module receives the images output by the N image sensing modules. Before the electronic device starts to work, the image processing module configures the exposure time of each image sensing module. When the electronic device starts to work, the image processing module simultaneously sends enable signals to the N image sensing modules. When an image sensing module receives the enable signal, its external trigger mode is activated and delay timing starts; the required delay time is obtained by subtracting the exposure time from the fixed time length FH. When the timer reaches the delay time, a register in the image sensing module generates an exposure start signal whose rising edge triggers the module to start exposure; when the exposure time ends, the image sensing module outputs an image to the image processing module. Because each image sensing module computes the delay it requires before exposure, the time deviation of the output images among the modules is constrained and the fused image is more stable.
As shown in fig. 2, the present invention also provides a method for controlling an image output timing, which can be used in the electronic device having the image processing module and N image sensing modules shown in fig. 1, and includes:
s201, when the N image sensing modules work simultaneously, the image processing module sends enabling signals to the N image sensing modules, wherein the exposure time parameters of the N image sensing modules are independent.
The exposure time parameters of the N image sensing modules include the exposure start time and the exposure time length; both may differ between image sensing modules.
And S202, outputting images to the image processing module after a fixed time length when the N image sensing modules receive the enabling signals.
By controlling all image sensing modules to output images after the same fixed time length, the residual time deviation among them reduces to the maximum offset among the enable-signal receive times plus the maximum clock-difference offset among the N image sensing modules. This deviation can be far shorter than the time needed to buffer one image line, which reduces the buffer space the image processing module consumes for cached data and increases the stability of the fused image.
In a possible embodiment, the enable signal may be a pulse signal; for example, the N image sensing modules may be triggered by the rising edge or the falling edge of an external trigger signal, and each outputs an image to the image processing module a fixed time length after receiving it.
In a possible embodiment, in order to reduce the time deviation of the images output by the N image sensing modules, when the N image sensing modules operate simultaneously, the method may further perform clock synchronization on the N image sensing modules and the image processing module. This can reduce the maximum shift amount of the clock difference between the N image sensing modules.
In a possible embodiment, after a long period of operation the clocks of the N image sensing modules may still drift apart, making the maximum clock-difference offset among them too large. The image processing module therefore receives the images respectively output by the N image sensing modules and calculates the maximum time offset among them. When the maximum time offset is greater than or equal to a set threshold, the image processing module re-sends the enable signals to the N image sensing modules, so that on receiving the enable signal again the modules output images after the fixed time length. This overcomes the excessive output-image deviation caused by clock-frequency drift among the image sensing modules and guarantees that the time deviation of the final output images stays within one line-buffer time; in theory only one line buffer is then required for synchronization, which greatly reduces buffer-space consumption and helps shrink the area of the image processing module occupied by the buffer.
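The resynchronization policy just described can be sketched as follows. This is an illustrative Python sketch only, not the patent's implementation; the function names and the threshold value are assumptions.

```python
# Hypothetical sketch of the drift-monitoring policy: measure the maximum
# offset among the per-module frame arrival times and re-issue the enable
# signal once it reaches a set threshold (e.g. one line-buffer time).

def max_time_offset(arrival_times):
    """Maximum pairwise offset among the per-module frame arrival times."""
    return max(arrival_times) - min(arrival_times)

def needs_resync(arrival_times, threshold):
    """True when the enable signal should be re-sent to all modules."""
    return max_time_offset(arrival_times) >= threshold

LINE_TIME = 0.030  # illustrative threshold: one line-buffer time, ms

# Frames arriving with 0.031 ms spread exceed the threshold:
print(needs_resync([10.000, 10.012, 10.031], LINE_TIME))
```

The image processing module would run such a check on each fused frame set and fall back to re-sending the enable signals only when the check fires.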
For the convenience of understanding, taking three image sensing modules as an example, after the three image sensing modules respectively receive the enable signals from the image processing module, the operation timing sequence of the three image sensing modules is shown in fig. 3.
Because the distances between the three image sensing modules and the image processing module differ, the three modules receive the enable signal at different times. The image processing module first sends a configuration signal, containing the exposure time parameters, to each of the three image sensing modules to control its exposure time. As can be seen from fig. 3, the first image sensing module receives the rising edge of the enable signal at time T1, starts timing, and calculates a delay time t0 equal to the fixed time length FH minus the exposure time t1; when the timer reaches t0, the register of the first image sensing module generates an exposure start signal whose rising edge triggers the module to start exposure; when the exposure ends, the first image sensing module outputs an image to the image processing module. The second image sensing module receives the rising edge of the enable signal at time T2, starts timing, and calculates a delay time t2 equal to FH minus the exposure time t3; when the timer reaches t2, the register of the second image sensing module generates an exposure start signal whose rising edge triggers the module to start exposure; when the exposure ends, the second image sensing module outputs an image to the image processing module.
The third image sensing module receives the rising edge of the enable signal at time T3, starts timing, and calculates a delay time t4 equal to FH minus the exposure time t5; when the timer reaches t4, the register of the third image sensing module generates an exposure start signal whose rising edge triggers the module to start exposure; when the exposure ends, the third image sensing module outputs an image to the image processing module.
Thus all three image sensing modules output images to the image processing module a fixed time length FH after receiving the enable signal, the delay portion being FH minus the respective exposure time.
In addition, as can be seen from fig. 3, the time offset between times T1 and T2 is δ0, the offset between T2 and T3 is δ1, and the offset between T1 and T3 is δ2; in the figure, δ1 is the largest. In one case, when the first, second, and third image sensing modules and the image processing module have been clock-synchronized in advance, the maximum time offset of the three output images is δ1. In the other case, without prior clock synchronization, the maximum time offset of the three output images is δ1 plus the maximum clock-difference offset among the modules; since the clock-difference offset is generally much smaller than the time needed to buffer one image line, the maximum offset of the three output images can still be constrained to within one line-buffer time. The buffering required by the image processing module is therefore reduced, increasing the stability of the fused image.
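The pairwise offsets δ0, δ1, δ2 and their maximum can be computed as below. This is an illustrative Python sketch; the receive times are invented numbers, not values from fig. 3.

```python
# Compute all pairwise offsets among the enable-signal receive times
# (the δ values in the fig. 3 discussion) and take the largest one.
from itertools import combinations

def pairwise_offsets(times):
    """All pairwise |Ti - Tj| offsets among the receive times."""
    return [abs(a - b) for a, b in combinations(times, 2)]

receive_times = [0.000, 0.004, 0.010]  # T1, T2, T3 in ms (illustrative)
deltas = pairwise_offsets(receive_times)
# With prior clock synchronization, max(deltas) bounds the deviation
# of the three output images.
print(max(deltas))
```

With three modules there are three pairwise offsets; for sorted receive times the maximum is simply the span between the earliest and latest arrival.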
It should be noted that the trigger for starting exposure may be soft or hard: in soft triggering, as described above, the image sensing module waits out the delay and then generates an exposure start signal through a register, whose rising edge triggers exposure; in hard triggering, the image sensing module directly generates a rising edge after the delay to trigger exposure.
In another possible embodiment, the image sensing module receives the enable signal through a pin, starts timing, and obtains the required delay time by subtracting the exposure time from the fixed time length FH; when the timer reaches the delay time, the module generates a rising edge that triggers it to start exposure; when the exposure ends, the module outputs an image to the image processing module. This embodiment omits the register inside the image sensing module, saving the resources a register would require, while still constraining the time deviation of the output images among the plurality of image sensing modules.
It should be noted that the above method applies equally to an electronic device having two image sensing modules, or to one having three or more. The number of image processing modules may be one or more, but only one performs any given function, which avoids conflict or loss of synchronization between different image processing modules.
The invention can be applied to a vehicle-mounted 360-degree surround-view system, a monitoring system, or a VR system to realize low-latency, high-reliability video fusion. To explain the working principle more clearly, the vehicle-mounted 360-degree surround-view system is taken as an example below.
As shown in fig. 4, cameras are mounted at the front of the vehicle and on each side mirror: a first camera 401, a second camera 402, and a third camera 403. An image processing module 404 in the in-vehicle terminal is electrically connected to the first camera 401, the second camera 402, and the third camera 403. Because the wire lengths between the cameras and the image processing module differ, the enable signals sent by the image processing module arrive at the cameras at different times.
In this example, the first camera 401, the second camera 402, the third camera 403, and the image processing module 404 are first clock-synchronized. The image processing module 404 then sends the enable signal to the three cameras simultaneously, and each camera outputs an image to the image processing module 404 a fixed time length after receiving it. Because the cameras have been clock-synchronized, the clock offset among them is nearly zero and can be neglected; the time offset of the images they output is therefore the maximum offset among the times at which the enable signal reaches them, which is far less than the time needed to buffer one image line. This timing control reduces the amount of buffering the image processing module 404 requires and thus the storage space consumed by cached data.
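An end-to-end toy simulation of this surround-view example is sketched below. All numbers are illustrative assumptions, not values from the patent; it shows that the residual spread of the output times equals the spread of the enable-signal arrival times, independent of each camera's exposure.

```python
# Toy simulation: three cameras receive the enable signal at slightly
# different times, each delays by (FH - its exposure), then exposes.
# Output moment = arrival + (FH - exposure) + exposure = arrival + FH,
# so the output spread reduces to the spread of the arrivals.

FH = 33.0  # fixed time length, ms (illustrative)

cameras = {  # name: (enable arrival time in ms, exposure time in ms)
    "front": (0.000, 12.0),
    "left":  (0.002, 8.0),
    "right": (0.005, 16.0),
}

output_times = {
    name: arrival + (FH - exposure) + exposure
    for name, (arrival, exposure) in cameras.items()
}
spread = max(output_times.values()) - min(output_times.values())
print(spread)  # approximately 0.005, the spread of the enable arrivals
```

Even though the three exposure times differ by several milliseconds, the output spread collapses to the wire-length-induced arrival spread, which is what lets the image processing module get by with a small pre-buffer.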
It should be noted that, as shown in fig. 4, the number of cameras may be three or more, with the cameras arranged around the vehicle. This arrangement allows the road conditions all around the vehicle to be determined by fusing the videos captured by the plurality of cameras.
Based on the above method for controlling the image output timing of the image sensing module, an embodiment of the present application further discloses a device for controlling the image output timing, as shown in fig. 5, the device 500 is used to implement the method described in the above method embodiments, and includes: a storage unit 501, a read-write control unit 502, and a synchronization unit 503.
The storage unit 501 is used for storing a computer program;
the read-write control unit 502 is configured to read a computer program from the storage unit and, when the N image sensing modules operate simultaneously, control the image processing module to send enable signals to the N image sensing modules, where the exposure time parameters of the N image sensing modules are independent of each other; and to control the N image sensing modules to output images to the image processing module a fixed time length after receiving the enable signals.
In a possible embodiment, the read-write control unit 502 is further configured to:
controlling the image processing module to receive images respectively output by the N image sensing modules;
controlling the image processing module to calculate the maximum time offset among the images output by the N image sensing modules;
and when the maximum time offset is greater than or equal to a set threshold, controlling the image processing module to return and repeat the sending of the enable signals to the N image sensing modules.
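The check-and-retry behaviour of the read-write control unit can be sketched as the loop below. The threshold value, the canned frame timestamps, and the callback names are hypothetical stand-ins for the real hardware interfaces:

```python
def align_outputs(send_enable, receive_frames, resync_clocks,
                  threshold_us, max_retries=3):
    """Sketch of the control loop: send enable signals, measure the maximum
    time offset among the N returned images, and re-synchronize and retry
    while the offset is at or above the threshold."""
    max_offset = float("inf")
    for _ in range(max_retries):
        send_enable()                   # enable all N sensors at once
        timestamps = receive_frames()   # arrival time of each image, in us
        max_offset = max(timestamps) - min(timestamps)
        if max_offset < threshold_us:   # within tolerance: alignment succeeded
            return max_offset
        resync_clocks()                 # drift too large: re-sync and retry
    return max_offset

# Toy usage with canned timestamps standing in for real sensor reads.
offset = align_outputs(
    send_enable=lambda: None,
    receive_frames=lambda: [100.0, 100.2, 100.1],
    resync_clocks=lambda: None,
    threshold_us=1.0,
)
print(offset)  # approximately 0.2
```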
In a possible embodiment, when the N image sensing modules operate simultaneously, the apparatus further includes a synchronization unit 503;
the synchronization unit 503 is configured to perform clock synchronization on the N image sensing modules and the image processing module.
In one possible embodiment, the exposure time parameters of the N image sensing modules include an exposure start time and an exposure time length.
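Because only the start time and length of each exposure vary, a single fixed length FH can absorb the differences: each sensor waits FH minus its own exposure length before starting exposure, so every exposure ends, and every image is output, at the same fixed offset from the enable signal. A small sketch under assumed example values (FH and the per-sensor exposure lengths are not from the patent):

```python
# Sketch: independent exposure lengths folded into one fixed length FH.
FH_US = 1000.0  # assumed fixed time length from enable signal to image output
exposure_us = {"sensor1": 120.0, "sensor2": 450.0, "sensor3": 33.0}

# Each sensor delays its exposure start so that its exposure ends exactly at FH.
delay_us = {name: FH_US - t_exp for name, t_exp in exposure_us.items()}

# All sensors finish (and output) at the same instant relative to the enable.
finish_us = {name: delay_us[name] + exposure_us[name] for name in exposure_us}
print(delay_us["sensor2"])      # 550.0
print(set(finish_us.values()))  # {1000.0}
```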
Fig. 6 shows a schematic structural diagram of an electronic device 600. The electronic device 600 may be used to implement the methods described in the above method embodiments; reference may be made to the description therein. The electronic device 600 may be a chip, a network device (e.g., a base station), or another electronic device.
The electronic device 600 comprises an image processing module 601 and one or more image sensing modules 607. The image processing module 601 may be a general-purpose processor or a special-purpose processor, etc.
Optionally, the image processing module 601 may implement the method shown in the embodiment of the method.
Optionally, the image processing module 601 may also implement other functions besides the method shown in the method embodiment.
Optionally, in one design, the image processing module 601 may also include instructions 603, and the instructions 603 may be executed on the image processing module, so that the electronic device 600 performs the method described in the above method embodiment.
In yet another possible design, the electronic device 600 may include one or more storage modules 602 on which instructions 604 are stored; the instructions 604 may be executed on the image processing module so that the electronic device 600 performs the method described in the above method embodiments. Optionally, the storage module may also store data, and instructions and/or data may optionally be stored in the image processing module as well. For example, the one or more storage modules 602 may store the correspondence described in the above embodiments, or related parameters involved in the above embodiments. The image processing module and the storage module may be arranged separately or integrated together.
In yet another possible design, the electronic device 600 may also include a communication interface 605 and an antenna 606. The image processing module 601 may be referred to as a processing unit, and controls a communication device (a terminal or a base station). The communication interface 605 may be referred to as a transceiver, a transceiving circuit, a transceiver, or the like, and is used for implementing transceiving functions of a communication device through the antenna 606.
It should be understood that the image processing module in the embodiments of the present invention may be a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU); it may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general-purpose processor may be a microprocessor, or any conventional processor.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.
In one possible design, the N image sensing modules may be separately arranged or integrated on the same integrated circuit chip; the image processing module and the image sensing module can be arranged independently or integrated on the same integrated circuit chip.
It should be noted that, in the embodiment of the present invention, the image sensing module may be an image sensor, which uses the photoelectric conversion function of an optoelectronic device to convert the light image on its photosensitive surface into an electrical signal proportional to that light image. In an implementation process, the image sensing module may be a Charge Coupled Device (CCD), a Complementary Metal-Oxide Semiconductor (CMOS) sensor, or another photosensitive device, and may realize or perform the function of outputting an image to the image processing module in the embodiment of the present invention. It should be understood that the image sensing modules of the systems and methods described herein are intended to include, but not be limited to, these and any other suitable types of image sensors.
It should be noted that the image processing module in the embodiment of the present invention may be an image processing chip or an integrated circuit chip with image signal processing capability. In implementation, the steps of the above method embodiments may be completed by integrated logic circuits of hardware in the image processing module or by instructions in the form of software. The image processing module may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be executed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in the storage module, and the image processing module reads the information in the storage module and completes the steps of the above method in combination with its hardware.
It will be appreciated that the storage modules in embodiments of the invention may be volatile memory or nonvolatile memory, or may include both. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of example but not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DR RAM). It should be noted that the storage modules of the systems and methods described herein are intended to comprise, without being limited to, these and any other suitable types of memory.
An embodiment of the present invention further provides a computer-readable medium, on which a computer program is stored, where the computer program, when executed by a computer, implements the method of any of the above method embodiments.
The embodiment of the invention also provides a computer program product, and the computer program product realizes the method of any one of the above method embodiments when being executed by a computer.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, or Digital Subscriber Line (DSL)) or wirelessly (e.g., by infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a Digital Video Disk (DVD)), or a semiconductor medium (e.g., a Solid State Disk (SSD)).
It should be understood that the above control device may be a chip. The image processing module may be implemented by hardware or software: when implemented by hardware, it may be a logic circuit, an integrated circuit, or the like; when implemented by software, it may be a general-purpose processor realized by reading software code stored in a storage module, which may be integrated in the image processing module or located outside it as a stand-alone module.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both; to illustrate the interchangeability of hardware and software clearly, the components and steps of the examples have been described above in terms of their functions. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In short, the above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (11)

1. A method for controlling image output timing, applied to an electronic device having an image processing module and N image sensing modules, wherein N is a positive integer greater than 1, the method comprising:
when the N image sensing modules work simultaneously, the image processing module sends enabling signals to the N image sensing modules, wherein exposure time parameters of the N image sensing modules are mutually independent;
and the N image sensing modules each output an image to the image processing module a fixed time length after receiving the enable signal.
2. The method of claim 1, further comprising:
the image processing module receives images respectively output by the N image sensing modules;
the image processing module calculates the maximum time offset among the images output by the N image sensing modules;
and when the maximum time offset is greater than or equal to a set threshold, the image processing module returns to repeatedly execute the sending of the enabling signals to the N image sensing modules.
3. The method of claim 1 or 2, wherein when the N image sensing modules are operating simultaneously, the method further comprises:
and performing clock synchronization on the N image sensing modules and the image processing module.
4. The method according to claim 1 or 2, wherein the exposure time parameters of the N image sensing modules comprise an exposure start time and an exposure time length.
5. The method according to claim 1 or 2, wherein the N image sensing modules comprise a first image sensing module and a second image sensing module through an Nth image sensing module;
when the N image sensing modules receive the enabling signals, outputting images to the image processing module after a fixed time length FH, wherein the fixed time length FH comprises exposure time and delay time;
the image processing module sends configuration signals to the N image sensing modules respectively, wherein the configuration signals comprise exposure time parameters and are used for controlling the exposure time of the image sensing modules;
the first image sensing module receives the enable signal at time T1, which triggers the first image sensing module to start timing and calculate a delay time t0, wherein the delay time t0 is equal to the fixed time length FH minus an exposure time t1; when the timing reaches the delay time t0, the register generates an exposure start signal, and the first image sensing module starts exposure; when the exposure is finished, the first image sensing module outputs an image to the image processing module;
the second image sensing module receives the enable signal at time T2, which triggers the second image sensing module to start timing and calculate a delay time t2, wherein the delay time t2 is equal to the fixed time length FH minus an exposure time t3; when the timing reaches the delay time t2, the register generates an exposure start signal, and the second image sensing module starts exposure; when the exposure is finished, the second image sensing module outputs an image to the image processing module;
by analogy, the Nth image sensing module receives the enabling signal at the moment TN, triggers the Nth image sensing module, starts timing and calculates delay time t (2N-2), wherein the delay time t (2N-2) is equal to a fixed time length FH minus exposure time t (2N-1); when the timing reaches the delay time t (2N-2), the register generates an exposure starting signal, and the Nth image sensing module starts exposure; and when the exposure is finished, the Nth image sensing module outputs an image to the image processing module.
6. The method according to claim 5, wherein the maximum time offset between the images output by the N image sensing modules comprises the maximum offset among the times T1 through TN at which the N image sensing modules receive the enable signal, and the maximum clock difference among the N image sensing modules.
7. A control device of image output time sequence is characterized by comprising a read-write control unit and a storage unit;
the storage unit is used for storing a computer program;
the read-write control unit is configured to read a computer program from the storage unit and, when the N image sensing modules operate simultaneously, control the image processing module to send enable signals to the N image sensing modules, wherein the exposure time parameters of the N image sensing modules are independent of each other; and to control the N image sensing modules to output images to the image processing module a fixed time length after receiving the enable signals.
8. The apparatus of claim 7, wherein the read-write control unit is further configured to:
controlling the image processing module to receive images respectively output by the N image sensing modules;
controlling the image processing module to calculate the maximum time offset among the images output by the N image sensing modules;
and when the maximum time offset is greater than or equal to a set threshold, controlling the image processing module to return and repeat the sending of the enable signals to the N image sensing modules.
9. The apparatus according to claim 7 or 8, wherein when the N image sensing modules operate simultaneously, the apparatus further comprises a synchronization unit;
and the synchronization unit is used for carrying out clock synchronization on the N image sensing modules and the image processing module.
10. An electronic device, comprising an image processing module and N image sensing modules configured to perform the method of any one of claims 1 to 6, wherein N is a positive integer greater than 1.
11. A computer-readable storage medium, in which a computer program is stored which, when being executed by an image processing module, carries out the method of any one of claims 1 to 5.
CN202111250085.5A 2021-10-26 2021-10-26 Method, device, equipment and medium for controlling image output time sequence Pending CN113973179A (en)

CN113973179A true CN113973179A (en) 2022-01-25
