CN107395972B - Shooting method and terminal for fast moving object - Google Patents


Info

Publication number
CN107395972B
Authority
CN
China
Prior art keywords: camera, target object, terminal, time, shooting
Legal status: Active
Application number
CN201710642968.8A
Other languages
Chinese (zh)
Other versions
CN107395972A (en)
Inventor
姚奇
Current Assignee
Huaqin Technology Co Ltd
Original Assignee
Huaqin Telecom Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Huaqin Telecom Technology Co Ltd filed Critical Huaqin Telecom Technology Co Ltd
Priority to CN201710642968.8A priority Critical patent/CN107395972B/en
Publication of CN107395972A publication Critical patent/CN107395972A/en
Application granted granted Critical
Publication of CN107395972B publication Critical patent/CN107395972B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a shooting method and a terminal for a fast-moving object. The method is suitable for terminal equipment with dual cameras and comprises the following steps: the terminal obtains monitoring information of a target object at at least two different positions, wherein the target object moves from the field-of-view range of a first camera towards the field-of-view range of a second camera, and at least one of the positions lies within the field-of-view ranges of both the first camera and the second camera; the terminal determines the moving speed of the target object from the monitoring information of the at least two different positions; when the moving speed exceeds a first threshold, the terminal determines time information of the target object moving to the second field-of-view center of the second camera; and the terminal sends a shooting instruction to the second camera, the shooting instruction instructing the second camera to shoot the target object according to the time information. The method solves the problem that existing shooting of fast-moving objects yields unclear images.

Description

Shooting method and terminal for fast moving object
Technical Field
The invention relates to the field of image processing, in particular to a shooting method and a terminal for a fast moving object.
Background
At present, as the mobile phone shooting function has matured, users increasingly want to shoot special scenes, yet such shooting still faces many problems. Photographing a fast-moving object with a mobile phone is a typical case: the object may move too fast for the user to press the shutter in time; the image may blur while the object is moving; and the object may have drifted away from the center of the field of view before the image is captured.
Therefore, a solution is needed for capturing sharp images of fast-moving objects.
Disclosure of Invention
The embodiment of the invention provides a shooting method and a terminal for a fast-moving object, to solve the problem that existing shooting of fast-moving objects produces unclear images.
An embodiment provides a shooting method for a fast-moving object, suitable for a terminal with dual cameras, the method comprising the following steps: the terminal acquires monitoring information of a target object at at least two different positions, wherein the movement direction of the target object is from the field of view range of a first camera to the field of view range of a second camera, and at least one position is simultaneously located in the field of view ranges of the first camera and the second camera;
the terminal determines the moving speed of the target object according to the monitoring information of the at least two different positions;
when the moving speed exceeds a first threshold value, the terminal determines time information of the target object moving to a second view field center of the second camera;
and the terminal sends a shooting instruction to the second camera, and the shooting instruction instructs the second camera to shoot the target object according to the time information.
Based on the same inventive concept, an embodiment of the present invention further provides a terminal, where the terminal includes:
an acquiring unit, configured to acquire monitoring information of a target object at at least two different positions, wherein the movement direction of the target object is from the field of view range of a first camera to the field of view range of a second camera, and at least one position is simultaneously located in the field of view ranges of the first camera and the second camera;
the determining unit is used for determining the moving speed of the target object according to the monitoring information of the at least two different positions; when the moving speed exceeds a first threshold value, determining time information of the target object moving to a second view field center of a second camera;
and the sending unit is used for sending a shooting instruction to the second camera, and the shooting instruction instructs the second camera to shoot the target object according to the time information.
In the embodiment of the invention, the first camera of the mobile terminal's dual cameras monitors the moving object and sends the monitoring result to the processor of the terminal. When the processor judges that the moving speed exceeds the set threshold, it determines the time at which the moving object will enter the focusing center of the second camera and notifies the second camera to shoot at that time, so that the second camera captures a sharp image. This realizes automatic capture of fast-moving objects.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and a person skilled in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a shooting method for a fast-moving object according to an embodiment of the present invention;
fig. 2 is a diagram illustrating a terminal with two cameras according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a first shooting scene according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a second shooting scene according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a third shooting scene according to an embodiment of the present invention;
Fig. 6 is a first schematic structural diagram of a terminal according to an embodiment of the present invention;
Fig. 7 is a second schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a schematic flow chart of a shooting method for a fast-moving object; specifically, the method includes:
Step S101, the terminal acquires monitoring information of a target object at at least two different positions, wherein the movement direction of the target object is from the field of view range of a first camera to the field of view range of a second camera, and at least one position is simultaneously located in the field of view ranges of the first camera and the second camera.
And step S102, the terminal determines the moving speed of the target object according to the monitoring information of the at least two different positions.
Step S103, when the moving speed exceeds a first threshold value, the terminal determines time information of the target object moving to a second view field center of a second camera.
And step S104, the terminal sends a shooting instruction to the second camera, and the shooting instruction instructs the second camera to shoot the target object according to the time information.
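Steps S101 to S104 can be sketched as a short pipeline. The following is a minimal illustration, not the patent's implementation: the class and function names, the degree units, and the 10 deg/s threshold are all assumptions chosen for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    angle_deg: float  # offset from the first camera's field-of-view center line, in degrees
    t: float          # capture timestamp, in seconds

def angular_velocity(first: Observation, second: Observation) -> float:
    """Steps S101-S102: angular speed (deg/s) from two monitored positions."""
    offset_angle = first.angle_deg + second.angle_deg  # angle swept between the two positions
    first_duration = second.t - first.t                # first duration T1
    return offset_angle / first_duration

def schedule_shot(second: Observation, theta3_deg: float, w: float,
                  threshold: float = 10.0) -> Optional[float]:
    """Steps S103-S104: shooting moment t3, or None if the object is too slow."""
    if w <= threshold:
        return None                        # no auto-capture below the first threshold
    second_duration = theta3_deg / w       # T2: time to reach the second field-of-view center
    return second.t + second_duration      # t3 = t2 + T2

# Hypothetical observations: 10 deg and 20 deg offsets seen 0.5 s apart
w = angular_velocity(Observation(10.0, 0.0), Observation(20.0, 0.5))
print(w)                                              # 60.0 deg/s
print(schedule_shot(Observation(20.0, 0.5), 30.0, w)) # shoot at t = 1.0 s
```

The scheduled moment would then be passed to the second camera in the shooting instruction of step S104.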
As shown in fig. 2, the terminal in the above method has two cameras, embedded side by side in the housing of the terminal. When a user uses the terminal to shoot a moving object that enters the field-of-view range of the left camera first, the left camera monitors the running state of the moving object, the time at which the moving object will enter the focusing center of the right camera is determined from that running state, and the right camera shoots the moving object according to that time to obtain a sharp image, as shown in fig. 3. Similarly, when the moving object enters the field-of-view range of the right camera first, the left camera shoots the moving object; the principle is the same and is not repeated here. Accordingly, in the above steps, when the first camera is the left camera of the terminal the second camera is the right camera, and when the first camera is the right camera the second camera is the left camera.
Further, the monitoring information of the at least two different positions includes a first included angle between the target object at a first position and a first view center line of the first camera, a second included angle between the target object at a second position and the first view center line of the first camera, a first time when the target object is located at the first position, and a second time when the target object is located at the second position, wherein the second position is within the view range of both the first camera and the second camera;
the terminal determines an offset angle between the first position and the second position according to the first included angle and the second included angle;
and the terminal determines the angular speed of the movement of the target object according to the offset angle and a first time length, wherein the first time length is the time difference between the first time and the second time.
Specifically, as shown in fig. 4, the first camera acquires image information of the moving object M at a first position and at a second position, where at the second position the object also lies within the field of view of the second camera. From this image information the processor obtains the first included angle θ1, by which the moving object M is offset from the field-of-view center line of the first camera at the first position, and the second included angle θ2, by which it is offset from that center line at the second position. The sum θ1 + θ2 is the angle the moving object M sweeps in moving from the first position to the second position, i.e. the offset angle between the two positions. Let t1 be the moment the image at the first position is captured and t2 the moment the image at the second position is captured, and let the first duration be T1 = t2 − t1. The angular velocity W of the moving object from the first position to the second position can then be calculated by the following formula:
W = (θ1 + θ2) / T1        [1]
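A quick numeric check of the angular-velocity formula above (the values below are hypothetical, not taken from the patent):

```python
# Angular velocity W = (theta1 + theta2) / T1
theta1 = 12.0   # degrees: offset from the first camera's center line at the first position
theta2 = 18.0   # degrees: offset from the same center line at the second position
T1 = 0.25       # seconds between the two observations

W = (theta1 + theta2) / T1
print(W)        # 120.0 deg/s
```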
Further, when the processor determines that the angular velocity W exceeds the first threshold, it calculates the duration the target object takes to move from the second position to the second field-of-view center of the second camera, as follows. At the second position the moving object M is offset from the field-of-view center line of the second camera by the angle θ3, i.e. the third included angle is θ3. The processor can therefore predict the second duration T2 the moving object M takes to move from the second position to the second field-of-view center from the angular velocity W and the third included angle θ3, calculated by the following formula:
T2 = θ3 / W        [2]
The processor sends the instruction to the second camera with a certain time delay, where the time delay is the difference between the moment the terminal sends the shooting instruction and the moment the second camera receives it. The processor therefore instructs the second camera to start shooting once a third duration T3 has elapsed after the second moment t2, where T3 equals the difference between T2 and the time delay. At that moment the moving object M lies on the field-of-view center line of the second camera, so the second camera can capture a sharper image.
Alternatively, since the time delay is generally short, the processor may directly calculate the shooting moment of the second camera, i.e. the third moment t3, where t3 equals the second moment t2 plus the second duration T2; the processor then instructs the second camera to shoot the target object at t3, which likewise yields a sharp image.
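The two trigger strategies just described can be sketched as follows. The function names are illustrative, not from the patent, and the example durations are chosen to be exactly representable in binary floating point:

```python
def trigger_delay_compensated(second_duration: float, link_delay: float) -> float:
    """Delay-compensated wait: shoot T3 = T2 - delay after the second moment."""
    return max(second_duration - link_delay, 0.0)

def shooting_moment(t2: float, second_duration: float) -> float:
    """Direct schedule: third moment t3 = t2 + T2 (delay treated as negligible)."""
    return t2 + second_duration

# e.g. predicted travel time T2 = 0.25 s, instruction delay 0.125 s, second moment t2 = 1.0 s
print(trigger_delay_compensated(0.25, 0.125))  # 0.125 -> wait 0.125 s after t2, then shoot
print(shooting_moment(1.0, 0.25))              # 1.25  -> or simply shoot at t = 1.25 s
```

The `max(..., 0.0)` clamp covers the edge case where the link delay would exceed the predicted travel time, in which case the camera should fire immediately.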
On the other hand, in one possible design, if only one of the two cameras monitors the moving object M while the other takes the picture, the second position may be fixed on the boundary of the field-of-view range of the second camera shown in fig. 5; θ3 is then half the field-of-view angle and is therefore fixed and known in advance. In the scenario shown in fig. 5, the first camera monitors the moving object M as it moves from the first position to the second position and records the image information at both positions; the processor then calculates the angular velocity W by formula [1], and since θ3 is known, T2 can be computed directly from θ3 and W. Finally, the processor triggers the second camera to shoot the moving object M after the duration T2 has elapsed.
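A minimal sketch of this fixed-boundary variant, in which θ3 is pinned to half the second camera's field of view (the function name and the numbers are assumptions for illustration):

```python
def time_to_center_from_boundary(fov_deg: float, theta1: float, theta2: float,
                                 T1: float) -> float:
    """Second position fixed on the second camera's FOV boundary, so theta3 = fov/2."""
    w = (theta1 + theta2) / T1   # angular velocity W from the first camera's monitoring
    theta3 = fov_deg / 2.0       # half the field-of-view angle, known in advance
    return theta3 / w            # second duration T2 = theta3 / W

# e.g. a 60-degree field of view and 30 degrees swept in 0.25 s -> W = 120 deg/s
print(time_to_center_from_boundary(60.0, 12.0, 18.0, 0.25))  # 0.25 s until the FOV center
```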
The embodiments thus provide an automatic capture function for fast-moving objects on a dual-camera terminal, so that a fast-moving object can be photographed automatically and accurately, avoiding the slow response and camera shake caused by manually pressing the shutter and solving the problem of blurred images when shooting fast-moving objects.
Based on the same technical concept, the embodiment of the invention also provides a terminal, and the terminal can execute the method embodiment. As shown in fig. 6, the terminal provided in the embodiment of the present invention includes an obtaining unit 401, a determining unit 402, and a sending unit 403, where:
an obtaining unit 401, configured to obtain monitoring information of a target object at least two different positions, where a moving direction of the target object is from a field of view range of a first camera to a field of view range of a second camera, and at least one position is located in the field of view ranges of the first camera and the second camera at the same time;
a determining unit 402, configured to determine a moving speed of the target object according to the monitoring information of the at least two different positions; when the moving speed exceeds a first threshold value, determining time information of the target object moving to a second view field center of a second camera;
a sending unit 403, configured to send a shooting instruction to the second camera, where the shooting instruction instructs the second camera to shoot the target object according to the time information.
Wherein: the monitoring information of at least two different positions comprises a first included angle between the target object at a first position and a first view field central line of the first camera, a second included angle between the target object at a second position and the first view field central line of the first camera, and the target object is positioned at a first moment of the first position and at a second moment of the second position, wherein the second position is positioned in the view field range of the first camera and the second camera at the same time.
Further, the determining unit 402 is specifically configured to: determining an offset angle between the first position and the second position according to the first included angle and the second included angle; and determining the angular speed of the movement of the target object according to the offset angle and a first time length, wherein the first time length is the time length between the first time and the second time.
The determining unit 402 is specifically configured to:
determining a third included angle between the target object at the second position and a second view field central line of the second camera according to the monitoring information of the second position;
determining a second time length used when the target object moves from the second position to a second view field center of the second camera according to the ratio of the third included angle to the moving speed;
and determining a third time when the target object moves to the center of a second field of view of the second camera according to a second time when the target object is at the second position and the second duration.
The determining unit 402 is further configured to: determining a time delay used for sending the shooting instruction, wherein the time delay is a time difference between the sending time of the shooting instruction sent by the processor and the receiving time of the shooting instruction received by the second camera;
The sending unit 403 is specifically configured to send a shooting instruction to the second camera, where the shooting instruction instructs the second camera to shoot the target object after a third duration following the second moment, the third duration being the difference between the second duration and the time delay.
Further, still include: a receiving unit 404, configured to receive a shooting scene instruction input by a user, where the shooting scene instruction indicates that a current shooting scene is a fast shooting scene;
the sending unit 403 is further configured to notify the first camera and/or the second camera to start monitoring the target object according to the shooting scene instruction.
Fig. 7 is a schematic structural diagram of another terminal provided in the present application, where the terminal includes: a communication interface 501, a processor 502, a memory 503, and a bus system 504;
The memory 503 is used for storing a program; in particular, the program may include program code comprising computer operating instructions. The memory 503 may be a RAM or a non-volatile memory (NVM), such as at least one disk memory. Only one memory is shown in the figure, but several memories may be provided as needed. The memory 503 may also be memory within the processor 502.
The memory 503 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof:
and (3) operating instructions: including various operational instructions for performing various operations.
Operating the system: including various system programs for implementing various basic services and for handling hardware-based tasks.
The processor 502 controls the operation of the dual cameras and may also be referred to as a CPU. In a particular application, the various components of the terminal are coupled together by a bus system 504, which may include a power bus, a control bus and a status-signal bus in addition to a data bus. For clarity of illustration, the various buses are labelled in the figure as the bus system 504, which is drawn only schematically in fig. 7.
The method disclosed in the embodiments of the present application may be applied to the processor 502 or implemented by the processor 502. The processor 502 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 502. The processor 502 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be directly executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in the memory 503, and the processor 502 reads the information in the memory 503 and performs the above method steps in combination with its hardware.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the embodiments of the present invention without departing from the spirit and scope of the application. Thus, if such modifications and variations of the embodiments of the present invention fall within the scope of the claims of the present application and their equivalents, the present application is also intended to encompass such modifications and variations.

Claims (10)

1. A photographing method for a fast moving object, which is applicable to a terminal having dual cameras, the method comprising:
the method comprises the steps that a terminal obtains monitoring information of a target object at least two different positions, wherein the moving direction of the target object is from the field of view range of a first camera to the field of view range of a second camera, and at least one position is located in the field of view ranges of the first camera and the second camera;
the monitoring information of the at least two different positions comprises a first included angle between a first position of the target object and a first view field central line of the first camera, a second included angle between a second position of the target object and the first view field central line of the first camera, a first moment when the target object is located at the first position, and a second moment when the target object is located at the second position, wherein the second position is located in the view field range of the first camera and the second camera at the same time;
the terminal determines an offset angle between the first position and the second position according to the first included angle and the second included angle;
the terminal determines the angular speed of the target object according to the offset angle and a first time length, wherein the first time length is the time length between the first time and the second time;
when the angular velocity exceeds a first threshold value, the terminal determines time information of the target object moving to a second view field center of the second camera;
and the terminal sends a shooting instruction to the second camera, and the shooting instruction instructs the second camera to shoot the target object according to the time information.
2. The method of claim 1, wherein the terminal determining time information when the target object moves to a second center of field of view of a second camera comprises:
the terminal determines a third included angle between the second position of the target object and a second view field central line of the second camera according to the monitoring information of the second position;
the terminal determines a second time length used when the target object moves from the second position to a second view field center of the second camera according to the ratio of the third included angle to the angular speed;
and the terminal determines a third moment when the target object moves to the center of a second view field of the second camera according to the second moment and the second duration when the target object is at the second position.
3. The method of claim 2, wherein before the terminal sends the shooting instruction to the second camera, the method further comprises:
the terminal predicts the time delay for sending the shooting instruction, wherein the time delay is the time difference between the sending time of the shooting instruction sent by the terminal and the receiving time of the shooting instruction received by the second camera;
the terminal sends the shooting instruction to the second camera, and the shooting instruction comprises the following steps:
and the terminal sends a shooting instruction to the second camera, wherein the shooting instruction instructs the second camera to shoot the target object after a third time length after the second moment, and the third time length is a difference value between the second time length and the time delay.
4. The method of any one of claims 1 to 3, wherein before the terminal obtains the monitoring information of the target object at two different positions from the first camera, the method further comprises:
the terminal receives a shooting scene instruction input by a user, wherein the shooting scene instruction indicates that the current shooting scene is a quick shooting scene;
and the terminal informs the first camera and/or the second camera to start monitoring the target object according to the shooting scene instruction.
5. A terminal, characterized in that the terminal comprises:
an acquisition unit, configured to acquire monitoring information of a target object at at least two different positions, wherein the target object moves from the field of view of a first camera toward the field of view of a second camera, and at least one of the positions lies within the fields of view of both cameras;
wherein the monitoring information of the at least two different positions comprises a first included angle between the target object at a first position and a field-of-view centerline of the first camera, a second included angle between the target object at a second position and the field-of-view centerline of the first camera, a first moment at which the target object is at the first position, and a second moment at which the target object is at the second position, the second position lying within the fields of view of both the first camera and the second camera;
a determination unit, configured to: determine an offset angle between the first position and the second position according to the first included angle and the second included angle; and determine the angular velocity of the target object according to the offset angle and a first duration, the first duration being the interval between the first moment and the second moment;
the determination unit being further configured to determine, when the angular velocity exceeds a first threshold, time information of the target object reaching the field-of-view center of the second camera;
and a sending unit, configured to send a shooting instruction to the second camera, the shooting instruction instructing the second camera to shoot the target object according to the time information.
6. The terminal of claim 5, wherein the determination unit is specifically configured to:
determine, according to the monitoring information of the second position, a third included angle between the target object at the second position and a field-of-view centerline of the second camera;
determine, according to the ratio of the third included angle to the angular velocity, a second duration taken for the target object to move from the second position to the field-of-view center of the second camera;
and determine, according to the second moment at which the target object is at the second position and the second duration, a third moment at which the target object reaches the field-of-view center of the second camera.
7. The terminal of claim 6, wherein the determination unit is further configured to: determine the transmission delay of the shooting instruction, the delay being the time difference between the moment at which the processor sends the shooting instruction and the moment at which the second camera receives it;
and the sending unit is specifically configured to send the shooting instruction to the second camera, the shooting instruction instructing the second camera to shoot the target object a third duration after the second moment, the third duration being the difference between the second duration and the delay.
8. The terminal according to any one of claims 5 to 7, further comprising:
a receiving unit, configured to receive a shooting-scene instruction input by a user, the shooting-scene instruction indicating that the current shooting scene is a rapid-capture scene;
the sending unit being further configured to notify the first camera and/or the second camera, according to the shooting-scene instruction, to start monitoring the target object.
9. A terminal, comprising: a communication interface, two cameras, a memory, and a processor, wherein the processor receives monitoring information sent by the two cameras through the communication interface and sends instructions to the two cameras through the communication interface;
and the memory is configured to store program code comprising computer operation instructions, the processor executing the computer operation instructions to perform the method of any one of claims 1 to 4.
10. A non-transitory computer storage medium storing computer-executable instructions for causing a computer to perform the method of any one of claims 1 to 4.
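The timing logic of claims 1 through 3 reduces to a few arithmetic steps: angular velocity is the offset angle divided by the first duration; the second duration is the third angle divided by that velocity; and the camera is told to fire a third duration (second duration minus link delay) after the second moment. The sketch below is an illustrative reconstruction, not code from the patent; the function name `predict_capture_time`, the argument names, and the sample numbers are all our own assumptions.

```python
def predict_capture_time(angle1, angle2, t1, t2, angle3, delay, threshold):
    """Sketch of the claimed timing method (hypothetical names).

    angle1, angle2: included angles (degrees) between the target at positions
                    1 and 2 and the first camera's field-of-view centerline
    t1, t2:         first and second moments (seconds) at those positions
    angle3:         third included angle (degrees) between the target at
                    position 2 and the second camera's centerline
    delay:          estimated instruction transmission delay (seconds)
    threshold:      first threshold on angular velocity (deg/s)
    """
    offset = abs(angle1 - angle2)        # offset angle between the positions
    duration1 = t2 - t1                  # first duration
    omega = offset / duration1           # angular velocity of the target
    if omega <= threshold:
        return None                      # slow target: ordinary shooting path
    duration2 = angle3 / omega           # second duration to the centerline
    t3 = t2 + duration2                  # third moment: target on centerline
    wait = duration2 - delay             # third duration (claim 3 adjustment)
    return t3, wait

# Hypothetical numbers: the target sweeps 10 degrees in 0.1 s (100 deg/s),
# sits 5 degrees from the second centerline, with a 10 ms link delay and a
# 50 deg/s threshold; t3 comes out near 0.15 s, the wait near 0.04 s.
t3, wait = predict_capture_time(20.0, 10.0, 0.0, 0.1, 5.0, 0.01, 50.0)
print(t3, wait)
```

The delay subtraction in the last step is what keeps the shutter aligned with the predicted third moment despite the instruction's travel time, which is the whole point of claim 3.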
CN201710642968.8A 2017-07-31 2017-07-31 Shooting method and terminal for fast moving object Active CN107395972B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710642968.8A CN107395972B (en) 2017-07-31 2017-07-31 Shooting method and terminal for fast moving object

Publications (2)

Publication Number Publication Date
CN107395972A CN107395972A (en) 2017-11-24
CN107395972B true CN107395972B (en) 2020-03-06

Family

ID=60343476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710642968.8A Active CN107395972B (en) 2017-07-31 2017-07-31 Shooting method and terminal for fast moving object

Country Status (1)

Country Link
CN (1) CN107395972B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110245546A (en) * 2018-12-06 2019-09-17 浙江大华技术股份有限公司 A kind of Target Tracking System, method and storage medium
US20220319013A1 (en) * 2019-06-25 2022-10-06 Sony Group Corporation Image processing device, image processing method, and program
CN111526314B (en) * 2020-04-24 2022-04-05 荣耀终端有限公司 Video shooting method and electronic equipment
CN112333382B (en) * 2020-10-14 2022-06-10 维沃移动通信(杭州)有限公司 Shooting method and device and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1996022537A1 (en) * 1995-01-18 1996-07-25 Hardin Larry C Optical range and speed detection system
CN103270540A (en) * 2010-12-30 2013-08-28 派尔高公司 Tracking moving objects using a camera network
CN104155470A (en) * 2014-07-15 2014-11-19 华南理工大学 Detecting method and system based on binocular camera for real-time vehicle speed

Also Published As

Publication number Publication date
CN107395972A (en) 2017-11-24

Similar Documents

Publication Publication Date Title
CN107395972B (en) Shooting method and terminal for fast moving object
CN109922251B (en) Method, device and system for quick snapshot
US8634016B2 (en) Imaging device and main photographic subject recognition method
EP2273450A2 (en) Target tracking apparatus, image tracking apparatus, methods of controlling operation of same, and digital camera
WO2020029596A1 (en) Lens control method and device and terminal
CN106412437B (en) Focusing method, device and the terminal of terminal
US11722771B2 (en) Information processing apparatus, imaging apparatus, and information processing method each of which issues a notification of blur of an object, and control method for the imaging apparatus
JP2013146017A (en) Imaging device and program
CN105635568A (en) Image processing method in mobile terminal and mobile terminal
US8836820B2 (en) Image capturing apparatus having a control unit controlling switching unit such that first image is displayed in case a predetermined motion is detected and a composite image is displayed in case motion is not detected, control method thereof, and storage medium
CN112449117A (en) Focusing step length determining method and device, storage medium and electronic device
CN112352417B (en) Focusing method of shooting device, system and storage medium
CN105635554A (en) Automatic focusing control method and device
CN114500826B (en) Intelligent shooting method and device and electronic equipment
US20150226934A1 (en) Focus adjustment apparatus having frame-out preventive control, and control method therefor
CN107018315B (en) Image pickup apparatus, motion vector detection device, and control method thereof
US9413940B2 (en) Digital electronic apparatus and method of controlling continuous photographing thereof
CN112770056B (en) Shooting method, shooting device and electronic equipment
JP2016197150A5 (en)
CN110460768B (en) Electronic device, photographing method, control device, and recording medium
JP6702737B2 (en) Image blur correction apparatus and method for controlling image blur correction apparatus
CN111787236B (en) Remote control device and method for panoramic picture shooting and panoramic picture shooting system
CN111654620B (en) Shooting method and device
JP2015097322A (en) Imaging apparatus, control method of the same, program, and storage medium
CN111586283B (en) Zooming method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder


Address after: 201210 Building 1, 399 Keyuan Road, Zhangjiang High Tech Park, Pudong New Area, Shanghai

Patentee after: Huaqin Technology Co.,Ltd.

Address before: 201210 Building 1, 399 Keyuan Road, Zhangjiang High Tech Park, Pudong New Area, Shanghai

Patentee before: HUAQIN TELECOM TECHNOLOGY Co.,Ltd.