CN113542589B - Lens control method, electronic equipment and computer storage medium - Google Patents

Lens control method, electronic equipment and computer storage medium

Info

Publication number
CN113542589B
CN113542589B CN202110602306.4A
Authority
CN
China
Prior art keywords
control
lens
instruction
parameters
feedback
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110602306.4A
Other languages
Chinese (zh)
Other versions
CN113542589A (en)
Inventor
房鑫平
李龙杰
徐权正
张天齐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202110602306.4A priority Critical patent/CN113542589B/en
Publication of CN113542589A publication Critical patent/CN113542589A/en
Application granted granted Critical
Publication of CN113542589B publication Critical patent/CN113542589B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/675Focus control based on electronic image sensor signals comprising setting of focusing regions
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Lens Barrels (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a lens control method, an electronic device, and a computer storage medium. The lens control method comprises: monitoring for a zoom motion instruction and acquiring control parameters of the lens in response to the instruction; generating a multi-level instruction stream from the control parameters and caching the multi-level instruction stream; and controlling the group motion of the lens with the cached multi-level instruction stream to adjust the focal length and/or focus of the lens. In this way, the communication delay between the chips that control the lens is reduced, which alleviates the step loss that otherwise occurs when the lens groups run at high speed.

Description

Lens control method, electronic equipment and computer storage medium
Technical Field
The present application relates to the field of video surveillance technology, and in particular to a lens control method, an electronic device, and a computer storage medium.
Background
With the continued expansion of the security market and the ongoing integration of artificial intelligence (AI) technology, ever higher demands are placed on the performance of the lenses commonly used in front-end cameras, such as wider focal ranges, faster speeds, higher precision, and longer service lives.
In the current lens control field, a movement lens is typically driven by a main control chip together with a drive control chip for zoom and focus, which yields a reasonably good full-range rapid focusing effect.
However, with this general movement-lens control scheme, the communication delay between the main control chip and the drive control chip (including system-call overhead and the like) means that running the lens groups at high speed easily causes step loss, so the optical and electromechanical performance of the lens cannot be fully exploited.
Disclosure of Invention
The technical problem mainly solved by the present application is how to reduce the communication delay between the chips that control the lens, thereby alleviating the step loss that occurs when the lens groups run too fast.
To solve the above technical problem, one technical solution adopted by the present application is to provide a lens control method. The lens control method comprises: monitoring for a zoom motion instruction and acquiring control parameters of the lens in response to the instruction; generating a multi-level instruction stream from the control parameters and caching the multi-level instruction stream; and controlling the group motion of the lens with the cached multi-level instruction stream to adjust the focal length and/or focus of the lens.
To solve the above technical problem, another technical solution adopted by the present application is to provide an electronic device. The electronic device comprises a main control module, a micro control module, and a drive module, the micro control module being connected to the main control module and the drive module respectively. The main control module is configured to monitor for a zoom motion instruction, acquire control parameters of the lens in response to the instruction, and generate a multi-level instruction stream from the control parameters. The micro control module is configured to cache the multi-level instruction stream and to use the cached multi-level instruction stream to control the drive module to drive the group motion of the lens, so as to adjust the focal length and/or focus of the lens.
To solve the above technical problem, a further technical solution adopted by the present application is to provide a computer storage medium. The computer storage medium stores program instructions which, when executed, implement: monitoring for a zoom motion instruction and acquiring control parameters of the lens in response to the instruction; generating a multi-level instruction stream from the control parameters and caching the multi-level instruction stream; and controlling the group motion of the lens with the cached multi-level instruction stream to adjust the focal length and/or focus of the lens.
The beneficial effects of the present application are as follows. Unlike the prior art, the present application first generates a multi-level instruction stream from the control parameters of the lens and caches it, and then uses the cached multi-level instruction stream to control the group motion of the lens. In this way, while one or more control instructions in the multi-level instruction stream are being updated (after they have finished executing), the other control instructions in the stream can still be executed, so execution and updating of the multi-level instruction stream proceed in parallel. This markedly reduces the inter-chip communication delay of the multi-level instruction stream, and effectively alleviates the step loss that inter-chip communication delay would otherwise cause when the lens groups run fast.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person of ordinary skill in the art can obtain other drawings from them without inventive effort. In the drawings:
FIG. 1 is a flow chart of an embodiment of a lens control method of the present application;
FIG. 2 is a schematic flowchart of step S11 in the lens control method of the embodiment of FIG. 1;
FIG. 3 is a flowchart illustrating a step S13 in the lens control method of the embodiment of FIG. 1;
FIG. 4 is a schematic flowchart of step S31 in the method of FIG. 3;
FIG. 5 is a flow chart of another embodiment of a lens control method of the present application;
FIG. 6 is a schematic structural diagram of an embodiment of an electronic device of the present application;
FIG. 7 is a schematic diagram of an embodiment of a lens curve for a lens of the present application;
FIG. 8 is a schematic diagram of the workflow of the master control chip in the embodiment of FIG. 6;
FIG. 9 is a schematic workflow diagram of the micro-control chip in the embodiment of FIG. 6;
FIG. 10 is a schematic diagram of an embodiment of a computer storage medium of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without inventive effort fall within the scope of protection of the present application.
The terms "first", "second", and the like in this application are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. In the description of this application, "plurality" means at least two, for example two or three, unless expressly defined otherwise. Furthermore, the terms "comprise" and "have", and any variations thereof, are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a series of steps or elements is not limited to the listed steps or elements, but may include other steps or elements that are not listed or that are inherent to such a process, method, article, or apparatus.
First, a lens control method is provided. As shown in FIG. 1, FIG. 1 is a schematic flowchart of an embodiment of the lens control method of the present application. The lens control method of this embodiment comprises the following steps.
Step S11: monitoring for a zoom motion instruction, and acquiring control parameters of the lens in response to the zoom motion instruction.
The zoom motion instruction may be issued by a user or generated automatically by the lens according to information such as the usage scenario; the zoom motion instruction is received and the control parameters of the lens are acquired.
Optionally, this embodiment may implement step S11 by the method shown in FIG. 2, which comprises steps S21 and S22.
Step S21: monitoring for a zoom motion instruction, and acquiring the performance parameters and the current usage scenario information of the lens in response to the zoom motion instruction.
The performance parameters of the lens can be evaluated from the functions of the lens product and the supplier's factory parameters, and they reflect the performance (capability) of the lens. The performance parameters involved in this embodiment may include, for each group of the lens, the single-motion step length, the rotation speed, and the acceleration and deceleration intervals at different focal segments and object distances.
The current usage scenario information of the lens mainly reflects the performance (capability) of the lens in the current usage scenario; in this embodiment it may include the focal segment, the object distance, and similar information for the lens during use. The current usage scenario information of the lens changes as the current scene changes.
Step S22: determining the control parameters of the lens based on the performance parameters and the current usage scenario information.
The performance parameters of the lens and the current usage scenario information are fused to determine the control parameters of the lens for the current usage scenario.
The control parameters of the lens in this embodiment include acceleration control parameters, deceleration control parameters, peak control parameters, and the like; the underlying quantities mainly include the motion step length, the motor's pulses per second (PPS), the duty cycle, the subdivision number, and so on.
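To make the shape of such a parameter set concrete, the sketch below groups the quantities named above into a C structure. This is only a minimal illustration; the field names, types, and units are assumptions of this sketch and are not taken from the patent.

```c
#include <stdint.h>

/* Hypothetical grouping of the lens control parameters described above;
 * all field names, types, and units are illustrative assumptions. */
typedef struct {
    uint32_t step_count;      /* motion step length for this phase        */
    uint32_t pps;             /* motor speed in pulses per second (PPS)   */
    uint8_t  duty_cycle_pct;  /* PWM duty cycle, in percent               */
    uint16_t microsteps;      /* subdivision (micro-stepping) number      */
} MotionPhaseParams;

typedef struct {
    MotionPhaseParams accel;  /* acceleration control parameters          */
    MotionPhaseParams peak;   /* peak (constant-speed) control parameters */
    MotionPhaseParams decel;  /* deceleration control parameters          */
} LensControlParams;
```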
Step S12: generating a multi-level instruction stream from the control parameters, and caching the multi-level instruction stream.
The control parameters are converted into (or used directly as) control instructions, and the control instructions are stored in a cache area as a multi-level instruction stream. For example, the control instructions may be cached as a multi-level instruction stream according to the current focal segment, object distance, motor control state, and other conditions of the lens.
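One plausible way to cache control instructions as a multi-level instruction stream is a small ring buffer on the receiving side. The sketch below assumes that layout and an instruction format of its own invention; nothing here is mandated by the patent.

```c
#include <stdint.h>
#include <stdbool.h>

#define STREAM_DEPTH 16u  /* assumed depth of the cached instruction stream */

typedef struct {
    uint8_t  group;       /* which lens group the instruction drives (e.g. zoom or focus) */
    uint32_t steps;       /* number of motor steps to move                                */
    uint32_t pps;         /* pulse rate for this instruction                              */
} ControlInstruction;

typedef struct {
    ControlInstruction buf[STREAM_DEPTH];
    volatile uint16_t head;   /* written by the issuing (main control) side */
    volatile uint16_t tail;   /* consumed by the executing side             */
} InstructionStream;

/* Append one instruction to the cached stream; returns false when full. */
static bool stream_push(InstructionStream *s, const ControlInstruction *ins) {
    uint16_t next = (uint16_t)((s->head + 1u) % STREAM_DEPTH);
    if (next == s->tail) return false;
    s->buf[s->head] = *ins;
    s->head = next;
    return true;
}

/* Remove the oldest instruction from the stream; returns false when empty. */
static bool stream_pop(InstructionStream *s, ControlInstruction *out) {
    if (s->head == s->tail) return false;
    *out = s->buf[s->tail];
    s->tail = (uint16_t)((s->tail + 1u) % STREAM_DEPTH);
    return true;
}
```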
Step S13: controlling the group motion of the lens with the cached multi-level instruction stream to adjust the focal length and/or focus of the lens.
The cached multi-level instruction stream is used to control the group motion of the lens so as to adjust its focal length and/or focus, thereby realizing zooming, focusing, and so on.
Unlike the prior art, this embodiment first generates a multi-level instruction stream from the control parameters of the lens and caches it, and then uses the cached multi-level instruction stream to control the group motion of the lens. In this way, while one or more control instructions in the multi-level instruction stream are being updated (after they have finished executing), the other control instructions in the stream can still be executed, so execution and updating of the multi-level instruction stream proceed in parallel. This markedly reduces the inter-chip communication delay of the multi-level instruction stream, and effectively alleviates the step loss that inter-chip communication delay would otherwise cause when the lens groups run fast.
The multi-level instruction stream of this embodiment comprises a plurality of control instructions, and certain control logic exists among them to ensure that each group of the lens moves normally. This embodiment may implement step S13 by the method shown in FIG. 3, which comprises steps S31 to S35.
Step S31: using a control instruction to control the corresponding group of the lens to perform the corresponding movement, and generating feedback parameters.
A lens is usually provided with a plurality of (at least two) groups, and the movement of each group is governed by a plurality of control instructions (control parameters); each control instruction in the multi-level instruction stream corresponds to a particular movement of a particular group.
Of course, in other embodiments, one-to-many relationships between control instructions and groups and/or group movements can be realized with more complex control logic.
Feedback parameters are generated from the motion state of the group after each control instruction finishes executing, or while it is executing.
The various feedback parameters during the lens group motion are monitored and acquired in real time, including whether the lens object-distance curve needs to be switched, whether the parameter-adjustment threshold has been reached, whether execution of the control instruction has completed, and so on.
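As a small illustration of the conditions listed above, the feedback state could be encoded as a bitmask like the one below; the exact set of flags and their encoding are assumptions of this sketch.

```c
/* Illustrative feedback flags for the monitored conditions described above. */
typedef enum {
    FB_CURVE_SWITCH_NEEDED = 1u << 0,  /* lens object-distance curve must be switched */
    FB_ADJUST_THRESHOLD    = 1u << 1,  /* parameter-adjustment threshold reached      */
    FB_INSTRUCTION_DONE    = 1u << 2   /* current control instruction has completed   */
} FeedbackFlags;
```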
Step S32: judging, based on the feedback parameters, whether a feedback signal is detected; if so, step S33 is performed; if not, step S34 is performed.
The feedback signals may include, for example, a signal that a control instruction (a group movement) has completed; the feedback signal can be derived from the feedback parameters described above.
Step S33: updating the control parameters of the lens.
It is judged whether a motion feedback signal has been received, and if so, the control parameters of the lens are updated. As analysed above, as the current usage scenario of the lens changes, the current usage scenario information is updated, so the control parameters of the lens must be acquired again, i.e. updated. The way the control parameters are acquired is similar to step S11 and is not repeated here; alternatively, step S33 can be regarded as a sub-step of step S11.
For example, if the feedback signal of the movement controlled by a certain control instruction is detected, new control parameters and a new control instruction corresponding to that movement are generated from the performance parameters of the lens and the current usage scenario information, and the control instruction is updated into the multi-level instruction stream.
Step S34: monitoring whether feedback of the feedback signal has timed out; if so, step S35 is performed.
A time threshold is set for the feedback signal in order to judge whether its feedback has timed out; the time thresholds for different control instructions, different groups, or different movements may be the same or different.
Step S35: outputting a timeout prompt.
If feedback of the feedback signal has timed out, a timeout prompt is output, for example an error prompt that processing has timed out, and the current lens control ends.
If feedback of the feedback signal has not timed out, monitoring continues, i.e. step S34 continues to be performed.
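A minimal sketch of the wait-for-feedback logic of steps S32 to S35 is given below. The helper functions (millis, feedback_signal_received, update_control_params, report_timeout_error) are hypothetical stand-ins, not APIs named in the patent.

```c
#include <stdbool.h>
#include <stdint.h>

extern uint32_t millis(void);               /* assumed millisecond tick source                  */
extern bool feedback_signal_received(void); /* assumed check derived from the feedback params   */
extern void update_control_params(void);    /* step S33: re-derive parameters for the new scene */
extern void report_timeout_error(void);     /* step S35: output a timeout prompt                */

/* Wait for the motion feedback of one control instruction (steps S32-S35). */
static bool wait_for_feedback(uint32_t timeout_ms) {
    uint32_t start = millis();
    for (;;) {
        if (feedback_signal_received()) {    /* step S32 -> S33 */
            update_control_params();
            return true;
        }
        if (millis() - start > timeout_ms) { /* step S34 -> S35 */
            report_timeout_error();
            return false;
        }
    }
}
```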
Optionally, this embodiment may implement step S31 by the method shown in FIG. 4, which comprises steps S41 to S43.
Step S41: judging whether the multi-level instruction stream is empty; if not, step S42 is performed, and if so, step S43 is performed.
Step S42: reading control instructions from the multi-level instruction stream, using the control instructions to control the corresponding groups of the lens to perform the corresponding movements in a preset order, and generating feedback parameters.
Specifically, this embodiment can realize step S42 by steps S421 to S424 shown in FIG. 4.
Step S421: reading a control instruction from the multi-level instruction stream and storing it in a preset cache area, the cache area being different from the storage area of the multi-level instruction stream.
If the multi-level instruction stream is not empty, a control instruction is read from it and added to the cache area from which instructions are executed.
In this embodiment, the storage area into which the multi-level instruction stream is written (received) is different from the storage area from which it is read (executed), which avoids read/write conflicts on the data and improves the accuracy and efficiency of lens control.
Of course, in other embodiments, only one storage area may be provided for the multi-level instruction stream in order to save memory, i.e. the area into which its data is written is the same as the area from which it is read.
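To make the separation between the receive storage area and the execution cache concrete, here is one possible arrangement, reusing the InstructionStream helpers sketched after step S12. Both buffer names and the copy step are assumptions of this sketch.

```c
/* Assumed layout: instructions received from the main control chip land in
 * `incoming`, and are copied into `exec_cache` before execution, so the write
 * side and the read side never share a storage area. */
static InstructionStream incoming;    /* written by the main control chip side */
static InstructionStream exec_cache;  /* read by the motion execution logic    */

/* Move one pending instruction from the receive stream into the execution
 * cache (step S421); returns false when the receive stream is empty. */
static bool refill_exec_cache(void) {
    ControlInstruction ins;
    if (!stream_pop(&incoming, &ins)) return false;
    return stream_push(&exec_cache, &ins);
}
```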
Step S422: using the control instructions in the cache area to control the corresponding groups of the lens to perform the corresponding movements in a preset order, and generating feedback parameters.
Because this embodiment controls the lens group motion with the control instructions in the cache area, other control instructions in the multi-level instruction stream can be executed while one or more of its control instructions are being updated.
Step S423: judging whether the multi-level instruction stream is empty; if so, step S424 is performed, and if not, step S421 is performed.
After each control instruction finishes executing, it is again judged whether the multi-level instruction stream is empty (the same check as in step S41).
Step S424: judging whether the cache area is empty; if not, step S422 is performed; if so, step S43 is performed.
Step S43: ending the task.
If the storage area into which the multi-level instruction stream is written is empty and the cache area from which instructions are executed is also empty, the task ends.
Before step S43 is performed it may further be judged whether the task needs to end; if so, step S43 is performed, and if not, step S41 is performed.
In another embodiment, as shown in FIG. 5, the lens control method comprises the following steps.
Step S51: monitoring for a zoom motion instruction, and acquiring the performance parameters and the current usage scenario information of the lens in response to the zoom motion instruction.
Step S51 is similar to step S21 described above and is not repeated here.
Step S52: determining the control parameters of the lens based on the performance parameters and the current usage scenario information.
Step S52 is similar to step S22 described above and is not repeated here.
Step S53: generating a multi-level instruction stream from the control parameters, and caching the multi-level instruction stream.
Step S53 is similar to step S12 described above and is not repeated here.
Step S54: using a control instruction in the multi-level instruction stream to control the corresponding group of the lens to perform the corresponding movement, and generating feedback parameters.
Step S54 is similar to step S31 described above and is not repeated here.
Step S55: judging whether a stop-motion instruction for the group is detected; if not, steps S56 and S59 to S512 are performed, and if so, steps S57 and S58 are performed.
Step S56: adjusting the control parameters based on the feedback parameters and the current usage scenario information.
The adjustment method can be found in the above embodiments.
Step S57: acquiring the current operating parameters of the group from the control parameters.
Step S58: setting the deceleration-stop parameters of the group according to the current operating parameters, and updating the deceleration-stop parameters into the cached multi-level instruction stream, so that the updated multi-level instruction stream controls the deceleration and stopping of the group.
Through steps S55 to S58 it can be determined whether the motion of the lens group needs to stop: if a stop command has been received or the group has moved close to a boundary, the group motion must be stopped and the deceleration-stop parameters must be set; if the group motion does not need to stop, the control parameters are adjusted based on the feedback parameters and the current usage scenario information to improve the accuracy of the group motion control.
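A rough sketch of how the deceleration-stop parameters of steps S57 and S58 might be derived from the current running speed and appended to the cached stream follows, again reusing the earlier InstructionStream helpers. The halving ramp and segment length are illustrative assumptions only, not the patent's actual deceleration profile.

```c
/* Append a few progressively slower segments so the group decelerates before
 * stopping (steps S57-S58). The halving rule below is a placeholder assumption. */
static void push_decel_stop(InstructionStream *s, uint32_t current_pps,
                            uint32_t steps_per_segment, uint8_t group) {
    for (uint32_t pps = current_pps / 2u; pps > 0u; pps /= 2u) {
        ControlInstruction ins = { .group = group,
                                   .steps = steps_per_segment,
                                   .pps   = pps };
        if (!stream_push(s, &ins)) break;  /* stream full: stop appending */
    }
}
```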
Step S59: judging, based on the feedback parameters, whether a feedback signal is detected; if so, step S510 is performed; if not, step S511 is performed.
Step S510: updating the control parameters of the lens.
Step S511: monitoring whether feedback of the feedback signal has timed out; if so, step S512 is performed.
Step S512: outputting a timeout prompt.
Steps S59 to S512 are similar to steps S32 to S35 described above and are not repeated here.
The present application further proposes an electronic device. As shown in FIG. 6, the electronic device 60 of this embodiment comprises a main control module 61, a micro control module 62, and a drive module 63, the micro control module 62 being connected to the main control module 61 and the drive module 63 respectively. The main control module 61 is configured to monitor for a zoom motion instruction, acquire control parameters of a lens (not shown) in response to the instruction, and generate a multi-level instruction stream from the control parameters. The micro control module 62 is configured to cache the multi-level instruction stream and to use the cached multi-level instruction stream to control the drive module 63 to drive the group motion of the lens, so as to adjust the focal length and/or focus of the lens.
Unlike the prior art, in this embodiment the main control module 61 generates a multi-level instruction stream from the control parameters of the lens, and the micro control module 62 caches the multi-level instruction stream and uses the cached stream to control the drive module 63 to drive the group motion of the lens. In this way, while one or more control instructions in the multi-level instruction stream are being updated, the micro control module 62 can still execute the other control instructions in the stream, so execution and updating of the multi-level instruction stream proceed in parallel. This markedly reduces the communication delay of the multi-level instruction stream between the main control module 61 and the micro control module 62, and effectively alleviates the step loss that inter-chip communication delay would otherwise cause when the lens groups run fast.
The lens may be integrated in the electronic device 60.
The main control module 61 of this embodiment may be a main control chip, the micro control module 62 may be a microcontroller unit (MCU), and the drive module 63 may be a stepper motor driver chip. The main control chip is mainly responsible for unified scheduling and for configuring and issuing the lens control parameters, which are generated jointly from the lens performance parameters and the actual usage scenario information; these include acceleration control parameters, deceleration control parameters, and peak control parameters, and the underlying quantities mainly include the motion step length, the motor's pulses per second (PPS), the duty cycle, and the subdivision number. They are sent to the MCU in the form of a multi-level instruction stream. The MCU, as the secondary execution chip, transmits the corresponding pulse-width-modulation signals to the stepper motor driver chip according to the control instructions, so as to control the driver chip to drive the operation of the zoom and focus groups. At the same time, the MCU has the ability to execute the multi-level instruction stream: it can directly and autonomously drive the stepper motor driver chip according to the control instructions in the multi-level instruction stream, and it feeds the real-time motion execution status back to the main control chip. The stepper motor driver chip is responsible for receiving the pulse-width-modulation signals from the MCU and actually driving each group to run accurately, step by step.
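To illustrate the MCU's role of turning one control instruction into pulse-width-modulated step pulses for the stepper motor driver chip, here is a deliberately simple bit-banged sketch. The functions gpio_write and delay_us and the pin number are hypothetical board-support assumptions; real firmware would normally use a hardware timer or PWM peripheral rather than busy-waiting.

```c
#include <stdint.h>
#include <stdbool.h>

extern void gpio_write(int pin, bool level);  /* hypothetical board-support call */
extern void delay_us(uint32_t us);            /* hypothetical microsecond delay  */

#define STEP_PIN 4  /* assumed step-pulse pin wired to the driver chip */

/* Emit `steps` pulses at `pps` pulses per second with the given duty cycle. */
static void drive_group(uint32_t steps, uint32_t pps, uint8_t duty_pct) {
    if (pps == 0u || duty_pct == 0u || duty_pct >= 100u) return;
    uint32_t period_us = 1000000u / pps;
    uint32_t high_us   = period_us * duty_pct / 100u;
    for (uint32_t i = 0; i < steps; ++i) {
        gpio_write(STEP_PIN, true);
        delay_us(high_us);
        gpio_write(STEP_PIN, false);
        delay_us(period_us - high_us);
    }
}
```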
The lens of this embodiment may be an ultra-long-focal lens, which is a type of movement lens. An ultra-long-focal lens, also called a large telephoto lens, is a lens whose maximum focal length exceeds that of an ordinary telephoto lens, for example a maximum focal length of about 360 mm to 1000 mm or more. Such lenses are mainly used for ultra-long-distance surveillance, and some of them likewise drive each group inside the lens with stepper motors to achieve optical zoom and automatic focusing.
In other embodiments, the lens may also be another type of lens, such as a telephoto lens.
As shown in FIG. 7, the lens curve of the ultra-long-focal lens is relatively gentle near the wide end, where the focus value changes little; there the main bottleneck on movement speed is the zoom group, whose running speed needs to be raised substantially. Near the telephoto end the focus curve becomes particularly steep; there the main bottleneck is the focus group, whose running speed needs to be raised substantially.
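The asymmetry just described (zoom-limited near the wide end, focus-limited near the telephoto end) could be expressed as a simple priority choice like the one below; the 0.7 ratio threshold is purely an illustrative assumption and not a value from the patent.

```c
typedef enum { GROUP_ZOOM, GROUP_FOCUS } LensGroup;

/* Pick which group's speed should be raised first, based on the current focal
 * length relative to the lens's maximum focal length. */
static LensGroup speed_bottleneck(float focal_mm, float max_focal_mm) {
    return (focal_mm < 0.7f * max_focal_mm) ? GROUP_ZOOM : GROUP_FOCUS;
}
```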
The control logic of the main control chip and the MCU in this application takes the optical characteristics of the lens into account and, by combining the multi-level instruction stream with an execution cache, changes the timing of instruction transmission and instruction execution between the master end and the actual control end. Macroscopically this greatly reduces the inter-chip communication delay and effectively alleviates the step loss that inter-chip communication delay would otherwise cause when the groups run fast. On this basis, the optical and electromechanical capabilities of the lens can be fully exploited, the overall running speed of the lens is raised, and fast zooming and focusing of the ultra-long-focal lens is achieved.
The multi-level instruction stream comprises a plurality of control instructions. The main control module 61 is configured to acquire the performance parameters and the current usage scenario information of the lens and to determine the control parameters of the lens based on them. The main control module 61 transmits the control instructions to the micro control module 62, so that the micro control module 62 uses them to control the drive module 63 to drive the corresponding groups of the lens to perform corresponding movements and generates feedback parameters to the main control module 61. The main control module 61 judges, based on the feedback parameters, whether a feedback signal is detected; if so, it updates the control parameters of the lens; if not, it monitors whether feedback of the feedback signal has timed out; if it has timed out, it outputs a timeout prompt; if not, it continues to monitor whether feedback of the feedback signal has timed out.
In one application scenario, the workflow of the main control chip is as shown in FIG. 8. First, the main control chip receives a lens zoom motion instruction. Second, it evaluates a set of lens control parameters from the actual usage scenario of the product, the product functions, and the supplier's factory (performance) parameters. Third, it issues control instructions to the MCU for execution in the form of a multi-level instruction stream, according to the current focal segment, object distance, motor control state, and other conditions. Fourth, it acquires the various feedback parameters during operation, including whether the lens object-distance curve needs to be switched and whether the parameter-adjustment threshold has been reached. Fifth, it judges whether the motion needs to stop: when a stop command has been received or the group has moved close to a boundary, it sets the deceleration-stop parameters and the like and goes to the tenth step; otherwise it continues. Sixth, it derives the subsequent control parameters from the performance parameters and the current feedback parameters. Seventh, if the MCU has reported the running status of the previous instruction stream, the flow returns to the third step to refresh the instruction stream to the MCU; otherwise it goes to the eighth step. Eighth, it checks whether the MCU feedback has timed out: if not, it returns to the seventh step; if it has, it goes to the ninth step. Ninth, it handles the timeout error and then goes to the twelfth step. Tenth, it sets the deceleration-stop parameters from the performance parameters and the current actual running parameters, decomposes them into an instruction stream, and goes to the eleventh step. Eleventh, it issues the multi-level instruction stream to the MCU and then goes to the twelfth step. Twelfth, the flow ends.
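The twelve steps above can be condensed into the loop sketched below. Every function name is a hypothetical stand-in for one step of FIG. 8, not a real API from the patent or any library.

```c
#include <stdbool.h>

extern void receive_zoom_motion_command(void);                  /* step 1       */
extern void evaluate_control_params_from_scene_and_specs(void); /* step 2       */
extern void issue_instruction_stream_to_mcu(void);              /* steps 3, 11  */
extern void collect_runtime_feedback_params(void);              /* step 4       */
extern bool stop_requested_or_near_boundary(void);              /* step 5       */
extern void derive_followup_control_params(void);               /* step 6       */
extern bool mcu_reported_previous_stream_done(void);            /* step 7       */
extern bool mcu_feedback_timed_out(void);                       /* step 8       */
extern void handle_timeout_error(void);                         /* step 9       */
extern void build_decel_stop_stream(void);                      /* step 10      */

static void master_zoom_motion(void) {
    receive_zoom_motion_command();
    evaluate_control_params_from_scene_and_specs();
    issue_instruction_stream_to_mcu();
    for (;;) {
        collect_runtime_feedback_params();
        if (stop_requested_or_near_boundary()) {
            build_decel_stop_stream();
            issue_instruction_stream_to_mcu();
            break;                              /* step 12: end of flow */
        }
        derive_followup_control_params();
        if (mcu_reported_previous_stream_done()) {
            issue_instruction_stream_to_mcu();  /* refresh the cached stream */
        } else if (mcu_feedback_timed_out()) {
            handle_timeout_error();
            break;
        }
    }
}
```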
The micro control module 62 of this embodiment is configured to judge whether the multi-level instruction stream is empty. If not, the micro control module 62 reads a control instruction from the multi-level instruction stream and stores it in a preset cache area, the cache area being different from the storage area of the multi-level instruction stream. The micro control module 62 uses the control instructions in the cache area to control the corresponding groups of the lens to perform corresponding movements in a preset order and generates feedback parameters to the main control module 61. The micro control module 62 then judges again whether the multi-level instruction stream is empty; if so, it judges whether the cache area is empty; if the instruction stream is not empty, it performs the step of reading a control instruction from the multi-level instruction stream and storing it in the preset cache area; if the cache area is not empty, it performs the steps of using the control instructions in the cache area to control the corresponding groups of the lens to perform corresponding movements in the preset order and generating feedback parameters to the main control module 61.
In one application scenario, the MCU workflow is as shown in FIG. 9. First, the control parameters of the lens are imported and the motion environment is initialized. Second, the MCU checks whether the multi-level instruction stream is empty: if not, it goes to the third step; if so, it goes to the eighth step. Third, it reads data from the multi-level instruction stream and adds it to the execution cache. Fourth, it executes the instructions in the cache in order, notifying the main control chip after each single control instruction finishes. Fifth, it checks whether the multi-level instruction stream is empty: if not, it goes to the sixth step; if so, it goes to the seventh step. Sixth, it reads data from the multi-level instruction stream, adds it to the execution cache, and returns to the fourth step. Seventh, it checks whether the execution cache is empty: if not, it returns to the fourth step; if so, it goes to the eighth step. Eighth, if the task is finished it goes to the ninth step; otherwise it returns to the second step. Ninth, the flow ends.
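For comparison, a condensed sketch of the FIG. 9 MCU loop is given below, reusing the InstructionStream helpers sketched earlier. The three extern functions are assumed stand-ins for the corresponding steps, not functions defined by the patent.

```c
extern void import_params_and_init_motion(void);                /* step 1 */
extern void execute_and_notify(const ControlInstruction *ins);  /* step 4 */
extern bool task_finished(void);                                /* step 8 */

static void mcu_main_loop(InstructionStream *stream, InstructionStream *cache) {
    import_params_and_init_motion();
    for (;;) {
        ControlInstruction ins;
        /* steps 2-3 / 5-6: move pending instructions into the execution cache */
        while (stream_pop(stream, &ins))
            (void)stream_push(cache, &ins);
        /* steps 4 / 7: execute one cached instruction and report back */
        if (stream_pop(cache, &ins)) {
            execute_and_notify(&ins);
            continue;
        }
        /* steps 8-9: both buffers empty; end when the task is finished */
        if (task_finished())
            break;
    }
}
```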
The electronic device of this embodiment is also used to implement the lens control method described above.
The present application further proposes a computer storage medium. As shown in FIG. 10, FIG. 10 is a schematic structural diagram of an embodiment of the computer storage medium of the present application. The computer storage medium 90 stores program instructions 91 which, when executed by a processor (not shown), implement: monitoring for a zoom motion instruction, and acquiring control parameters of the lens in response to the instruction; generating a multi-level instruction stream from the control parameters and caching the multi-level instruction stream; and controlling the group motion of the lens with the cached multi-level instruction stream to adjust the focal length and/or focus of the lens.
When executed by a processor (not shown), the program instructions 91 also implement the lens control method of the embodiments described above.
The computer storage medium 90 of this embodiment may be, but is not limited to, a USB flash drive, an SD card, a PD optical drive, a removable hard disk, a high-capacity floppy drive, a flash memory, a multimedia memory card, a server, or the like.
Unlike the prior art, the present application first generates a multi-level instruction stream from the control parameters of the lens and caches it, and then uses the cached multi-level instruction stream to control the group motion of the lens. In this way, while one or more control instructions in the multi-level instruction stream are being updated (after they have finished executing), the other control instructions in the stream can still be executed, so execution and updating of the multi-level instruction stream proceed in parallel. This markedly reduces the inter-chip communication delay of the multi-level instruction stream, and effectively alleviates the step loss that inter-chip communication delay would otherwise cause when the lens groups run fast.
In addition, if the functions described above are implemented as software functions and sold or used as an independent product, they may be stored in a storage medium readable by a mobile terminal. That is, the present application also provides a storage device storing program data that can be executed to implement the methods of the above embodiments; the storage device may be, for example, a USB flash drive, an optical disc, a server, or the like. In other words, the present application may be embodied as a software product that includes instructions for causing a smart terminal to perform all or part of the steps of the methods described in the various embodiments.
In the description of the present application, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", or "some examples" means that a particular feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present application. In this specification, such schematic expressions do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples, and those skilled in the art may combine different embodiments or examples, and features of different embodiments or examples, described in this specification, provided they do not contradict each other.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, for example two or three, unless expressly defined otherwise.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application also includes further implementations in which functions may be performed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application belong.
The logic and/or steps represented in the flowcharts, or otherwise described herein, may for example be regarded as an ordered listing of executable instructions for implementing the logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device (which can be a personal computer, a server, a network device, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them). For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium may even be paper or another suitable medium on which the program is printed, since the program can be captured electronically, for example by optical scanning of the paper or other medium, and then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and stored in a computer memory.
The foregoing is only a description of embodiments of the present application and does not limit the scope of the patent; any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present application, or any direct or indirect use in other related technical fields, is likewise included within the scope of patent protection of the present application.

Claims (6)

1. A lens control method, characterized by comprising:
monitoring for a zoom motion instruction, and acquiring control parameters of a lens in response to the zoom motion instruction;
generating a multi-level instruction stream from the control parameters, and caching the multi-level instruction stream;
controlling group motion of the lens with the cached multi-level instruction stream, so as to adjust a focal length and/or a focus of the lens;
wherein the multi-level instruction stream comprises a plurality of control instructions, and the controlling of the group motion of the lens with the cached multi-level instruction stream comprises:
using the control instructions to control corresponding groups of the lens to perform corresponding movements, and generating feedback parameters;
judging, based on the feedback parameters, whether a feedback signal is detected;
if so, updating the control parameters;
wherein each control instruction controls the corresponding group to perform the corresponding movement, and each control instruction generates the corresponding feedback parameters according to the motion state of the group after that control instruction finishes executing or while it is executing;
wherein the controlling of the group motion of the lens with the cached multi-level instruction stream further comprises:
if not, monitoring whether feedback of the feedback signal has timed out;
if it has timed out, outputting a timeout prompt;
if it has not timed out, continuing to monitor whether feedback of the feedback signal has timed out;
wherein the using of the control instructions to control the corresponding groups of the lens to perform the corresponding movements and the generating of the feedback parameters comprise:
judging whether the multi-level instruction stream is empty;
if not, reading a control instruction from the multi-level instruction stream, using the control instruction to control the corresponding group of the lens to perform the corresponding movement in a preset order, and generating feedback parameters;
wherein the reading of the control instruction from the multi-level instruction stream, the using of the control instruction to control the corresponding group of the lens to perform the corresponding movement in the preset order, and the generating of the feedback parameters comprise:
reading the control instruction from the multi-level instruction stream and storing it in a preset cache area, wherein the cache area is different from the storage area of the multi-level instruction stream;
using the control instructions in the cache area to control the corresponding groups of the lens to perform the corresponding movements in the preset order, and generating feedback parameters;
judging whether the multi-level instruction stream is empty;
if so, judging whether the cache area is empty;
if not, performing the step of reading a control instruction from the multi-level instruction stream and storing it in the preset cache area;
and if the cache area is not empty, performing the step of using the control instructions in the cache area to control the corresponding groups of the lens to perform the corresponding movements in the preset order and generating feedback parameters.
2. The lens control method according to claim 1, wherein the acquiring of the control parameters of the lens comprises:
acquiring performance parameters and current usage scenario information of the lens;
and determining the control parameters of the lens based on the performance parameters and the current usage scenario information.
3. The lens control method according to claim 2, further comprising, after the using of the control instruction to control the corresponding group of the lens to perform the corresponding movement and the generating of the feedback parameters:
judging whether a stop-motion instruction for the group is detected;
if no stop-motion instruction for the group is detected, adjusting the control parameters based on the feedback parameters and the current usage scenario information;
if a stop-motion instruction for the group is detected, acquiring current operating parameters of the group from the control parameters;
and setting deceleration-stop parameters of the group according to the current operating parameters, and updating the deceleration-stop parameters into the cached multi-level instruction stream, so as to control the deceleration and stopping of the group with the updated multi-level instruction stream.
4. An electronic device, characterized by comprising: a main control module, a micro control module, and a drive module, the micro control module being connected to the main control module and the drive module respectively;
wherein the main control module is configured to monitor for a zoom motion instruction, acquire control parameters of a lens in response to the zoom motion instruction, and generate a multi-level instruction stream from the control parameters; and the micro control module is configured to cache the multi-level instruction stream and to use the cached multi-level instruction stream to control the drive module to drive group motion of the lens, so as to adjust a focal length and/or a focus of the lens;
wherein the multi-level instruction stream comprises a plurality of control instructions, and the main control module is configured to transmit the control instructions to the micro control module, so that the micro control module uses the control instructions to control the drive module to drive the corresponding groups of the lens to perform corresponding movements and generates feedback parameters to the main control module; the main control module judges, based on the feedback parameters, whether a feedback signal is detected; if so, the main control module updates the control parameters; if not, the main control module monitors whether feedback of the feedback signal has timed out; if it has timed out, the main control module outputs a timeout prompt; if it has not timed out, the main control module continues to monitor whether feedback of the feedback signal has timed out;
wherein the micro control module is configured to judge whether the multi-level instruction stream is empty; if not, the micro control module reads a control instruction from the multi-level instruction stream and stores it in a preset cache area, the cache area being different from the storage area of the multi-level instruction stream; the micro control module is configured to use the control instructions in the cache area to control the corresponding groups of the lens to perform corresponding movements in a preset order and to generate feedback parameters to the main control module; the micro control module is configured to judge whether the multi-level instruction stream is empty; if so, the micro control module judges whether the cache area is empty; if not, the micro control module performs the step of reading a control instruction from the multi-level instruction stream and storing it in the preset cache area; and if the cache area is not empty, the micro control module performs the steps of using the control instructions in the cache area to control the corresponding groups of the lens to perform corresponding movements in the preset order and generating feedback parameters to the main control module.
5. The electronic device according to claim 4, wherein the main control module is configured to acquire performance parameters and current usage scenario information of the lens, and to determine the control parameters of the lens based on the performance parameters and the current usage scenario information.
6. A computer storage medium, characterized by having stored thereon program instructions which, when executed, implement:
monitoring for a zoom motion instruction, and acquiring control parameters of a lens in response to the zoom motion instruction;
generating a multi-level instruction stream from the control parameters, and caching the multi-level instruction stream;
controlling group motion of the lens with the cached multi-level instruction stream, so as to adjust a focal length and/or a focus of the lens;
wherein the multi-level instruction stream comprises a plurality of control instructions, and the controlling of the group motion of the lens with the cached multi-level instruction stream comprises:
using the control instructions to control corresponding groups of the lens to perform corresponding movements, and generating feedback parameters;
judging, based on the feedback parameters, whether a feedback signal is detected;
if so, updating the control parameters;
wherein each control instruction controls the corresponding group to perform the corresponding movement, and each control instruction generates the corresponding feedback parameters according to the motion state of the group after that control instruction finishes executing or while it is executing;
wherein the controlling of the group motion of the lens with the cached multi-level instruction stream further comprises:
if not, monitoring whether feedback of the feedback signal has timed out;
if it has timed out, outputting a timeout prompt;
if it has not timed out, continuing to monitor whether feedback of the feedback signal has timed out;
wherein the using of the control instructions to control the corresponding groups of the lens to perform the corresponding movements and the generating of the feedback parameters comprise:
judging whether the multi-level instruction stream is empty;
if not, reading a control instruction from the multi-level instruction stream, using the control instruction to control the corresponding group of the lens to perform the corresponding movement in a preset order, and generating feedback parameters;
wherein the reading of the control instruction from the multi-level instruction stream, the using of the control instruction to control the corresponding group of the lens to perform the corresponding movement in the preset order, and the generating of the feedback parameters comprise:
reading the control instruction from the multi-level instruction stream and storing it in a preset cache area, wherein the cache area is different from the storage area of the multi-level instruction stream;
using the control instructions in the cache area to control the corresponding groups of the lens to perform the corresponding movements in the preset order, and generating feedback parameters;
judging whether the multi-level instruction stream is empty;
if so, judging whether the cache area is empty;
if not, performing the step of reading a control instruction from the multi-level instruction stream and storing it in the preset cache area;
and if the cache area is not empty, performing the step of using the control instructions in the cache area to control the corresponding groups of the lens to perform the corresponding movements in the preset order and generating feedback parameters.
CN202110602306.4A 2021-05-31 2021-05-31 Lens control method, electronic equipment and computer storage medium Active CN113542589B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110602306.4A CN113542589B (en) 2021-05-31 2021-05-31 Lens control method, electronic equipment and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110602306.4A CN113542589B (en) 2021-05-31 2021-05-31 Lens control method, electronic equipment and computer storage medium

Publications (2)

Publication Number Publication Date
CN113542589A CN113542589A (en) 2021-10-22
CN113542589B true CN113542589B (en) 2023-05-19

Family

ID=78094926

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110602306.4A Active CN113542589B (en) 2021-05-31 2021-05-31 Lens control method, electronic equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN113542589B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015041029A (en) * 2013-08-23 2015-03-02 キヤノン株式会社 Lens system
JP6516472B2 (en) * 2015-01-08 2019-05-22 オリンパス株式会社 Imaging device and control method of imaging device
JP6525792B2 (en) * 2015-07-29 2019-06-05 キヤノン株式会社 Lens control device and control method of lens control device
CN110225235B (en) * 2018-03-01 2021-03-09 浙江宇视科技有限公司 Zoom following method and electric lens
CN110855885B (en) * 2019-11-14 2021-02-02 浙江大华技术股份有限公司 Lens rotation control method and device

Also Published As

Publication number Publication date
CN113542589A (en) 2021-10-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant