Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a schematic flow diagram of a shooting method for a quickly moving object. Specifically, the method includes the following steps:
Step S101: the terminal acquires monitoring information of a target object at at least two different positions, wherein the movement direction of the target object is from the field of view of a first camera toward the field of view of a second camera, and at least one of the positions is located within the fields of view of both the first camera and the second camera.
Step S102: the terminal determines the moving speed of the target object according to the monitoring information of the at least two different positions.
Step S103: when the moving speed exceeds a first threshold, the terminal determines time information of the target object moving to a second field of view center of the second camera.
Step S104: the terminal sends a shooting instruction to the second camera, wherein the shooting instruction instructs the second camera to shoot the target object according to the time information.
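For illustration only, the overall flow of steps S101 to S104 may be sketched in Python as follows; the camera objects and their monitor(), offset_from_center() and shoot_at() interfaces are hypothetical placeholders and not part of the embodiment:

```python
def capture_fast_moving_object(first_cam, second_cam, speed_threshold_rad_s):
    """Minimal sketch of steps S101-S104, assuming hypothetical camera objects."""
    # S101: acquire monitoring information at two different positions; the
    # second position lies within the field of view of both cameras.
    (theta1, t1), (theta2, t2) = first_cam.monitor(positions=2)

    # S102: angular velocity from the offset angle and the first time length T1.
    T1 = t2 - t1
    angular_velocity = (theta1 + theta2) / T1          # formula [1]

    # S103: if the speed exceeds the first threshold, predict when the object
    # reaches the field of view center line of the second camera.
    if angular_velocity > speed_threshold_rad_s:
        theta3 = second_cam.offset_from_center(at_time=t2)
        T2 = theta3 / angular_velocity                 # formula [2]
        shoot_time = t2 + T2                           # third time t3

        # S104: send the shooting instruction carrying the time information.
        second_cam.shoot_at(shoot_time)
```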
As shown in fig. 2, the terminal in the above method is a terminal having two cameras, which are embedded side by side in the housing of the terminal. When a user uses the terminal to shoot a moving object that enters the field of view of the left camera first, the left camera monitors the running state of the moving object, the time information of the moving object entering the focusing center of the right camera is determined according to the running state, and the right camera shoots the moving object according to the time information to obtain a clear image, as shown in fig. 3. Similarly, when the moving object enters the field of view of the right camera first, the left camera shoots the moving object; the principle is the same and is not described again here. Therefore, in the above steps, when the first camera is the camera on the left side of the terminal, the second camera is the camera on the right side of the terminal, and when the first camera is the camera on the right side of the terminal, the second camera is the camera on the left side of the terminal.
Further, the monitoring information of the at least two different positions includes a first included angle between the target object at a first position and a first field of view center line of the first camera, a second included angle between the target object at a second position and the first field of view center line of the first camera, a first time at which the target object is located at the first position, and a second time at which the target object is located at the second position, wherein the second position is within the fields of view of both the first camera and the second camera;
the terminal determines an offset angle between the first position and the second position according to the first included angle and the second included angle;
and the terminal determines the angular speed of the movement of the target object according to the offset angle and a first time length, wherein the first time length is the time difference between the first time and the second time.
Specifically, as shown in fig. 4, the first camera acquires image information of the moving object M at a first position and at a second position, and the second camera also acquires image information of the moving object M at the second position. From the image information at the two positions, the processor obtains the first included angle θ1, by which the moving object M at the first position is offset from the field of view center line of the first camera, and the second included angle θ2, by which the moving object M at the second position is offset from the same center line. The sum of θ1 and θ2 is therefore the angle through which the moving object M moves from the first position to the second position, i.e. the offset angle between the first position and the second position. Assuming that the image information at the first position is acquired at a first time t1, the image information at the second position is acquired at a second time t2, and the time length separating t2 from t1, i.e. the first time length, is T1, then the angular velocity W of the moving object moving from the first position to the second position can be calculated by the following formula:
W = (θ1 + θ2) / T1    [1]
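As a numerical illustration of formula [1], with the angle and time values below chosen arbitrarily rather than taken from the embodiment:

```python
import math

# Arbitrary illustrative values; angles in radians, times in seconds.
theta1 = math.radians(12.0)   # first included angle at the first position
theta2 = math.radians(8.0)    # second included angle at the second position
t1, t2 = 0.00, 0.05           # first time and second time
T1 = t2 - t1                  # first time length

W = (theta1 + theta2) / T1    # angular velocity, formula [1]
print(f"W = {W:.3f} rad/s")   # about 6.981 rad/s for these values
```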
further, when the processor determines that the angular velocity W exceeds the first threshold, a first time duration for the target object to move from the second position to a second field of view center of the second camera is calculated, which is specifically as follows: because the moving object M is offset by the angle θ 3 of the central line of the field of view of the second camera when in the second position, that is, the third included angle is θ 3, further, the processor can predict the second time period T2 taken by the moving object M to move from the second position to the third position according to the angular velocity W and the third included angle θ 3, and can calculate the second time period T2 by the following formula:
assuming that the processor sends a command to the second camera with a time delay, wherein the time delay refers to a time difference between the sending time of the terminal sending the shooting command and the receiving time of the second camera receiving the shooting command, the processor instructs the second camera to start shooting when a time length of T3 passes after the time of T2, wherein T3 is equal to a time difference between T2 and the time delay, and the moving object M is located on a central line of a field of view of the second camera, so that the second camera can shoot a clearer image.
Considering that the time delay is generally short, the processor may also directly calculate the shooting time of the second camera, i.e. the third time t3, where t3 equals the second time t2 plus the second time length T2. The processor instructs the second camera to shoot the target object at t3, so that a clearer image can likewise be captured.
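Both timing variants described above can be expressed as a short sketch; the numeric values, including the time delay, are assumed for illustration and would in practice be measured on the terminal:

```python
t2 = 0.050      # second time, s (illustrative)
T2 = 0.025      # second time length from formula [2], s (illustrative)
delay = 0.004   # time delay between sending and receiving the instruction, s (assumed)

# Variant 1: compensate for the delay by sending the instruction T3 after t2,
# so that it arrives just as the object reaches the center line.
T3 = T2 - delay
send_time = t2 + T3

# Variant 2: when the delay is negligible, instruct the camera to shoot at the
# third time t3 directly.
t3 = t2 + T2
```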
On the other hand, in one possible design in which only one of the two cameras monitors the moving object M and the other camera shoots it, the second position may be fixed on the boundary of the field of view of the second camera as shown in fig. 5. In that case θ3 is half of the field-of-view angle of the second camera, so θ3 is fixed and known in advance. In the scenario shown in fig. 5, the first camera monitors the moving object M as it moves from the first position to the second position and records the image information at both positions; the processor then calculates the angular velocity W according to formula [1]. Since θ3 is known, T2 can be calculated directly from θ3 and the angular velocity W, and the processor triggers the second camera to shoot the moving object M after a time length of T2.
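For the fixed-boundary design of fig. 5, θ3 reduces to half of the second camera's field-of-view angle; a minimal sketch, with the field-of-view angle assumed rather than taken from the embodiment:

```python
import math

second_cam_fov = math.radians(60.0)   # assumed field-of-view angle of the second camera
theta3 = second_cam_fov / 2.0         # half the field-of-view angle, fixed and known

W = 6.981                             # angular velocity from formula [1] (illustrative)
T2 = theta3 / W                       # trigger the second camera after T2 has elapsed
```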
Therefore, by providing an automatic capture function for fast-moving objects on a terminal with dual cameras, the method and the device enable a fast-moving object to be shot automatically and accurately. This avoids the slow response and hand shake caused by manually pressing the shutter, and thereby solves the problem of blurred imaging when shooting a fast-moving object.
Based on the same technical concept, the embodiment of the invention also provides a terminal, and the terminal can execute the method embodiment. As shown in fig. 6, the terminal provided in the embodiment of the present invention includes an obtaining unit 401, a determining unit 402, and a sending unit 403, where:
an obtaining unit 401, configured to obtain monitoring information of a target object at least two different positions, where a moving direction of the target object is from a field of view range of a first camera to a field of view range of a second camera, and at least one position is located in the field of view ranges of the first camera and the second camera at the same time;
a determining unit 402, configured to determine a moving speed of the target object according to the monitoring information of the at least two different positions; when the moving speed exceeds a first threshold value, determining time information of the target object moving to a second view field center of a second camera;
a sending unit 403, configured to send a shooting instruction to the second camera, where the shooting instruction instructs the second camera to shoot the target object according to the time information.
Wherein: the monitoring information of the at least two different positions includes a first included angle between the target object at a first position and the first field of view center line of the first camera, a second included angle between the target object at a second position and the first field of view center line of the first camera, a first time at which the target object is located at the first position, and a second time at which the target object is located at the second position, wherein the second position is within the fields of view of both the first camera and the second camera.
Further, the determining unit 402 is specifically configured to: determining an offset angle between the first position and the second position according to the first included angle and the second included angle; and determining the angular speed of the movement of the target object according to the offset angle and a first time length, wherein the first time length is the time length between the first time and the second time.
The determining unit 402 is specifically configured to:
determine a third included angle between the target object at the second position and a second field of view center line of the second camera according to the monitoring information of the second position;
determine a second time length used when the target object moves from the second position to the second field of view center of the second camera according to the ratio of the third included angle to the moving speed;
and determine a third time at which the target object moves to the second field of view center of the second camera according to the second time at which the target object is at the second position and the second time length.
The determining unit 402 is further configured to: determining a time delay used for sending the shooting instruction, wherein the time delay is a time difference between the sending time of the shooting instruction sent by the processor and the receiving time of the shooting instruction received by the second camera;
the sending unit 403 is specifically configured to send a shooting instruction to the second camera, where the shooting instruction instructs the second camera to shoot the target object after a third time length has elapsed from the second time, and the third time length is the difference between the second time length and the time delay.
Further, the terminal also includes: a receiving unit 404, configured to receive a shooting scene instruction input by a user, where the shooting scene instruction indicates that the current shooting scene is a fast shooting scene;
the sending unit 403 is further configured to notify the first camera and/or the second camera to start monitoring the target object according to the shooting scene instruction.
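A hypothetical sketch of how the receiving unit 404 and sending unit 403 might cooperate when the user selects a fast shooting scene; the class and method names are assumed for illustration and do not appear in the embodiment:

```python
class Terminal:
    """Illustrative only; unit roles mirror fig. 6 but the interfaces are assumed."""

    def __init__(self, first_cam, second_cam):
        self.first_cam = first_cam
        self.second_cam = second_cam

    def on_shooting_scene_instruction(self, scene):
        # Receiving unit 404: the user indicates the current shooting scene.
        if scene == "fast":
            # Sending unit 403: notify the cameras to start monitoring the target object.
            self.first_cam.start_monitoring()
            self.second_cam.start_monitoring()
```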
Fig. 7 is a schematic structural diagram of another terminal provided in the present application, where the terminal includes: a communication interface 501, a processor 502, a memory 503, and a bus system 504;
the memory 503 is used for storing programs. In particular, the program may include program code including computer operating instructions. The memory 503 may be a RAM or a NVM, such as at least one disk memory. Only one memory is shown in the figure, but of course, the memory may be provided in plural numbers as necessary. The memory 503 may also be memory in the processor 502.
The memory 503 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof:
and (3) operating instructions: including various operational instructions for performing various operations.
Operating the system: including various system programs for implementing various basic services and for handling hardware-based tasks.
The processor 502 controls the operation of the dual cameras, and the processor 502 may also be referred to as a CPU. In a particular application, the various components of the terminal are coupled together by a bus system 504, where the bus system 504 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For clarity of illustration, however, the various buses are collectively designated as the bus system 504, which is only schematically drawn in fig. 7.
The method disclosed in the embodiments of the present application may be applied to the processor 502 or implemented by the processor 502. The processor 502 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 502 or by instructions in the form of software. The processor 502 may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM, an EPROM, or a register. The storage medium is located in the memory 503, and the processor 502 reads the information in the memory 503 and performs the above method steps in combination with its hardware.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the embodiments of the present invention without departing from the spirit and scope of the application. Thus, if such modifications and variations of the embodiments of the present invention fall within the scope of the claims of the present application and their equivalents, the present application is also intended to encompass such modifications and variations.