WO2021047070A1 - Terminal shooting method, device, mobile terminal, and readable storage medium
- Publication number: WO2021047070A1 (international application PCT/CN2019/122776; priority application CN2019122776W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- shooting
- predicted
- terminal
- focus area
- shooting target
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/632—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
- H04N23/67—Focus control based on electronic image sensor signals
Definitions
- This application relates to the field of terminal intelligent shooting technology, and in particular to a terminal shooting method, device, mobile terminal, and readable storage medium.
- the main purpose of this application is to provide a terminal shooting method, device, mobile terminal, and computer storage medium, aiming to solve the technical problem in the prior art that focus is inaccurate when a mobile terminal camera shoots a non-stationary shooting target, which causes the shot to be unclear.
- an embodiment of the present application provides a terminal photographing method, and the terminal photographing method includes the following steps:
- if the shooting target enters the predicted focus area, shooting of the shooting target is completed.
- the displacement information includes the historical motion trajectory and the object attribute within a preset unit duration before the current moment;
- the step of determining the predicted focus area according to the displacement information includes:
- the predicted focus area is determined.
- the step of obtaining the predicted movement trajectory of the shooting target in the next preset unit duration according to the historical movement trajectory includes:
- the motion rule description table includes a motion reference point and the predicted motion direction mapped to the motion reference point;
- according to the motion rule description table, the predicted motion speed, and the historical motion trajectory, the predicted motion trajectory of the shooting target in the next preset unit duration is determined.
- the step of determining the predicted motion trajectory according to the motion rule description table, the predicted motion speed, and the historical motion trajectory includes:
- the predicted motion direction mapped to the current motion reference point is searched for in the motion rule description table;
- the predicted movement trajectory is determined.
- the step of determining the predicted focus area according to the position point on the predicted motion track includes:
- the predicted focus area is determined.
- the step of determining the predicted focus area by taking the center of the reference position as the geometric center includes:
- the display size of the overall area is determined according to the screen-occupancy ratio of each shooting target relative to the terminal screen, so as to generate the predicted focus area.
- the step of determining whether the shooting target enters the predicted focus area includes:
- it is determined whether the geometric center of the overall area formed by the shooting targets coincides with the geometric center of the predicted focus area; if they coincide, it is determined that each shooting target has entered the predicted focus area.
- before the step of acquiring and tracking the displacement information of the shooting target in the preview frame of the terminal camera, the method includes:
- the step of completing the shooting of the shooting target includes:
- the user can manually confirm the shooting or start the shooting module to automatically shoot.
- the present application also provides a terminal photographing device, the terminal photographing device includes:
- the information acquisition module is used to acquire and track the displacement information of the shooting target in the preview frame of the terminal camera
- An area determination module configured to determine a predicted focus area according to the displacement information, and perform focusing in the predicted focus area
- a judging module configured to judge whether the shooting target enters the predicted focus area
- the shooting module is configured to complete shooting of the shooting target if the shooting target enters the predicted focus area.
- the area determination module of the terminal photographing device includes a prediction unit and a determination unit:
- the prediction unit is configured to obtain a predicted movement trajectory of the shooting target in the next preset unit duration according to the historical movement trajectory;
- the determining unit is configured to determine a predicted focus area according to a position point on the predicted motion track.
- the present application also provides a mobile terminal.
- the mobile terminal includes a memory, a processor, and a terminal photographing program stored on the memory and running on the processor, and the terminal photographing program is used by the processor. When executed, the steps of the terminal shooting method described above are implemented.
- the present application also provides a computer storage medium having a terminal shooting program stored on the computer storage medium, and when the terminal shooting program is executed by a processor, the steps of the terminal shooting method described above are implemented.
- the displacement information of the shooting target in the preview frame of the terminal camera is acquired and tracked; then the predicted focus area is determined according to the displacement information, and focusing is performed in the predicted focus area; finally, it is determined whether the shooting target enters the predicted focus area; if the shooting target enters the predicted focus area, shooting of the shooting target is completed.
- this application focuses in the predicted focus area in advance, and when the shooting target enters the predicted focus area, shooting can be performed without the user focusing manually, which avoids the inaccurate focus and unclear photos caused by changes in the shooting target's position and improves the shooting effect;
- at the same time, the shooting process is simplified, the user's shooting operation steps are reduced, and the user experience is improved.
- FIG. 1 is a schematic diagram of the hardware structure of an optional mobile terminal according to an embodiment of the application
- FIG. 2 is a schematic flowchart of an embodiment of a terminal shooting method according to this application.
- FIG. 3 is a detailed flowchart of step S20 of an embodiment of a terminal shooting method according to this application;
- FIG. 4 is a detailed flowchart of step S21 of an embodiment of a terminal shooting method according to this application;
- FIG. 5 is a detailed flowchart of step S212 of an embodiment of a terminal shooting method according to this application;
- FIG. 6 is a detailed flowchart of step S22 of an embodiment of a terminal shooting method according to this application.
- FIG. 7 is a detailed flowchart of step S222 of an embodiment of a terminal shooting method according to this application.
- FIG. 8 is a schematic diagram of the functional modules of the terminal photographing device of this application;
- FIG. 9 is a schematic diagram of detailed functional modules of the region determining module of the terminal photographing device of the application.
- FIG. 10 is a schematic diagram of an application scenario of an embodiment of the terminal shooting method of this application.
- FIG. 11 is a schematic diagram of another application scenario of the embodiment of the terminal shooting method of this application.
- FIG. 12 is a schematic diagram of another application scenario of the embodiment of the terminal shooting method of this application.
- FIG. 13 is a schematic diagram of another application scenario of the embodiment of the terminal shooting method of this application.
- the suffixes "module", "component", and "unit" used to denote elements are merely for convenience of description of the present application and have no specific meaning in themselves; therefore, "module", "component", and "unit" can be used interchangeably.
- the mobile terminal can be implemented in various forms.
- the mobile terminals described in this application may include mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (Personal Digital Assistant, PDA), and other mobile terminals.
- FIG. 1 is a schematic diagram of the hardware structure of a mobile terminal that implements the various embodiments of the present application.
- the mobile terminal 100 may include: a radio frequency (RF) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
- the mobile terminal may include more or fewer components than those shown in the figure, may combine certain components, or may have a different component arrangement.
- the radio frequency unit 101 may be used for receiving and sending signals during information transmission or a call; specifically, after receiving downlink information from a base station, it passes the information to the processor 110 for processing, and it also sends uplink data to the base station.
- the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
- the radio frequency unit 101 can also communicate with the network and other devices through wireless communication.
- WiFi is a short-distance wireless transmission technology.
- the mobile terminal can help users send and receive emails, browse web pages, and access streaming media through the WiFi module 102. It provides users with wireless broadband Internet access.
- although FIG. 1 shows the WiFi module 102, it is understandable that it is not an essential component of the mobile terminal and can be omitted as needed without changing the essence of the invention.
- the audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound when the mobile terminal 100 is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like.
- the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (for example, call signal reception sound, message reception sound, etc.).
- the audio output unit 103 may include a speaker, a buzzer, and so on.
- the A/V input unit 104 is used to receive audio or video signals.
- the A/V input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
- the processed image frame can be displayed on the display unit 106.
- the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the WiFi module 102.
- the microphone 1042 can receive sound (audio data) in operation modes such as a telephone call mode, a recording mode, and a voice recognition mode, and can process such sound into audio data.
- the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 in the case of a telephone call mode for output.
- the microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated in the process of receiving and transmitting audio signals.
- the mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
- the light sensor includes an ambient light sensor and a proximity sensor.
- the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
- the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear.
- the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three-axis), and can detect the magnitude and direction of gravity when it is stationary.
- the display unit 106 is used to display information input by the user or information provided to the user.
- the display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
- the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the mobile terminal.
- the user input unit 107 may include a touch panel 1071 and other input devices 1072.
- the touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connected device according to a preset program.
- the touch panel 1071 may include two parts: a touch detection device and a touch controller.
- the touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110.
- the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
- the user input unit 107 may also include other input devices 1072.
- other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons and switch buttons), a trackball, a mouse, a joystick, and the like, which are not specifically limited here.
- the touch panel 1071 can cover the display panel 1061.
- when the touch panel 1071 detects a touch operation on or near it, it sends the operation to the processor 110 to determine the type of touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of touch event.
- the touch panel 1071 and the display panel 1061 are shown as two independent components implementing the input and output functions of the mobile terminal, but in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal; this is not specifically limited here.
- the interface unit 108 serves as an interface through which at least one external device can be connected to the mobile terminal 100.
- the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
- the interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the mobile terminal 100, or can be used to transfer data between the mobile terminal 100 and external devices.
- the memory 109 may be used to store software programs and various data.
- the memory 109 may be a computer storage medium, and the memory 109 stores the terminal shooting program of the present application.
- the memory 109 may mainly include a program storage area and a data storage area.
- the program storage area may store an operating system and application programs required by at least one function (such as a sound playback function, an image playback function, etc.); the data storage area may store data created during use of the mobile phone (such as audio data, a phone book, etc.).
- the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
- the processor 110 is the control center of the mobile terminal. It uses various interfaces and lines to connect the various parts of the entire mobile terminal, runs or executes software programs and/or modules stored in the memory 109, and calls data stored in the memory 109 to perform various functions of the mobile terminal and process data, thereby monitoring the mobile terminal as a whole. For example, the processor 110 executes the terminal photographing program in the memory 109 to implement the steps of the embodiments of the terminal photographing method of the present application.
- the processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, etc.;
- the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 110.
- the mobile terminal 100 may also include a power supply 111 (such as a battery) for supplying power to the various components;
- the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
- the mobile terminal 100 may also include a Bluetooth module, etc., which will not be repeated here.
- the mobile terminal 100 can be connected with other terminal devices through Bluetooth to realize communication and information interaction.
- the terminal photographing method includes:
- Step S10 acquiring and tracking the displacement information of the shooting target in the preview frame of the terminal camera
- the execution subject of this application may be a terminal shooting application;
- the terminal camera preview frame refers to the display interface presented after the user opens the mobile terminal's camera application; when the user opens the camera, the user adjusts the position of the mobile terminal so that the shooting target is displayed in the preview frame according to the user's intended composition. At this time the shooting target moves within the preview frame, and its speed and direction change, that is, displacement occurs; the information describing this displacement is the displacement information.
- Step S20 Determine the predicted focus area according to the displacement information, and perform focusing in the predicted focus area;
- the predicted focus area refers to an area on the trajectory that the moving target is likely to form, based on the target's displacement; focusing in the predicted focus area means that the terminal shooting application focuses on the predicted focus area in advance, before the target enters it.
- Step S30 judging whether the shooting target enters the predicted focus area
- since the shooting target is displaced during the shooting process and is not absolutely still, the positional relationship between the shooting target and the predicted focus area changes in real time while the shooting target is moving.
- step S40 if the shooting target enters the predicted focus area, the shooting of the shooting target is completed.
- shooting can be performed without the user needing to focus manually.
- the displacement information of the shooting target in the preview frame of the terminal camera is acquired and tracked; then the predicted focus area is determined according to the displacement information, and focusing is performed in the predicted focus area; finally, it is determined whether the shooting target enters the predicted focus area; if the shooting target enters the predicted focus area, shooting of the shooting target is completed.
- this application focuses in the predicted focus area in advance, and when the shooting target enters the predicted focus area, the user is not required to focus manually, which avoids the inaccurate focus and unclear photos caused by changes in the shooting target's position and improves the shooting effect; at the same time, it simplifies the shooting process, reduces the user's shooting operation steps, and improves the user experience.
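The flow of steps S10-S40 can be sketched in a few lines; the following Python sketch is purely illustrative and not code from the patent — the linear extrapolation of the last displacement, the square focus area, and all names (`predict_focus_center`, `target_in_area`) are simplifying assumptions:

```python
from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float


def predict_focus_center(history):
    """Step S20 (simplified): linearly extrapolate the last observed
    displacement one step ahead to place the predicted focus area."""
    last, prev = history[-1], history[-2]
    return Point(2 * last.x - prev.x, 2 * last.y - prev.y)


def target_in_area(target, center, half_size):
    """Step S30: the target counts as having entered the predicted focus
    area when it lies inside a square of side 2*half_size around center."""
    return (abs(target.x - center.x) <= half_size
            and abs(target.y - center.y) <= half_size)
```

With this sketch, a target tracked at (0,0), (1,0), (2,0) yields a predicted center at (3,0), and step S40 would fire once the tracked position falls inside the square around that center.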
- Step S20 includes:
- Step S21 Obtain a predicted movement trajectory of the shooting target in the next preset unit duration according to the historical movement trajectory;
- the historical motion trajectory is part of the displacement information and refers to the trajectory formed by the displacement of the shooting target in the terminal camera preview frame; the predicted motion trajectory of the next preset unit duration refers to the trajectory formed by the displacement the shooting target is predicted to make within one unit duration after the current moment.
- Step S22 Determine the predicted focus area according to the position point on the predicted motion track.
- the position point on the predicted motion trajectory refers to the point that forms the predicted motion trajectory of the shooting target.
- the information acquisition module first obtains the historical motion trajectory of the shooting target, then derives the predicted motion trajectory from the historical motion trajectory, and finally determines the predicted focus area from the position points on the predicted motion trajectory, so that the terminal shooting application can focus in the predicted focus area in advance; as soon as the shooting target enters the predicted focus area, shooting can be performed, avoiding inaccurate focus and unclear photos.
- Step S21 includes:
- Step S211 Determine the object attribute of the shooting target, and determine the motion rule description table and predicted motion speed of the shooting target according to the object attribute, wherein the motion rule description table includes the motion reference point and the predicted motion direction mapped by the motion reference point;
- the object attribute of the shooting target is another item of displacement information and indicates that the shooting target may be one or more of humans, animals, plants, and objects; the motion reference point and the predicted motion direction mapped to it are determined according to the object attribute. For example, for a swinging pendulum clock, the predicted motion direction at a given motion reference point is the tangent direction of the pendulum's motion trajectory;
- the motion rule description table is a mapping table between motion reference points and predicted motion directions for different object attributes; from it, a motion reference point and the predicted motion direction mapped to that reference point can be queried;
- the predicted motion speed is a prediction of the shooting target's speed according to the general motion law of its object attribute, since object attributes of different types have different motion laws and speeds.
- Step S212: according to the motion rule description table, the predicted motion speed, and the historical motion trajectory, determine the predicted motion trajectory of the shooting target in the next preset unit duration.
- the motion rule description table can query the predicted motion direction that has a mapping relationship with the motion reference point of the shooting target; when the predicted motion direction and the predicted motion speed are determined, combined with the historical motion trajectory, the predicted motion trajectory of the shooting target can be determined.
- the object attribute of the shooting target is determined first, then the motion reference point, predicted motion direction, and predicted motion speed of the shooting target are determined according to the object attribute, and finally, combined with the historical motion trajectory, the predicted motion trajectory of the shooting target in the next preset unit duration is determined.
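A motion rule description table of this kind can be modeled as a nested mapping from object attribute to reference points and their mapped directions. The Python sketch below is purely illustrative; the table contents (attributes, speeds, unit-vector directions) are invented example values, not data from the patent:

```python
# Hypothetical motion rule description table: maps an object attribute to a
# predicted motion speed and, per motion reference point, a predicted motion
# direction given as a 2-D vector. All values here are illustrative.
MOTION_RULES = {
    "pendulum": {
        "speed": 0.8,  # assumed units, e.g. normalized preview units per second
        "directions": {
            "lowest_point": (1.0, 0.0),   # tangent to the arc at the bottom
            "left_highest": (1.0, -1.0),
        },
    },
    "walking_person": {
        "speed": 1.4,
        "directions": {"default": (1.0, 0.0)},
    },
}


def lookup_rule(attribute, reference_point):
    """Step S211 (sketch): given the object attribute, return the predicted
    motion direction mapped to the current reference point and the predicted
    motion speed; fall back to the first listed direction if unmapped."""
    rule = MOTION_RULES[attribute]
    direction = rule["directions"].get(
        reference_point, next(iter(rule["directions"].values())))
    return direction, rule["speed"]
```

The fallback behavior for an unknown reference point is our own choice; the patent does not specify what happens when a reference point is missing from the table.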
- step S212 includes:
- Step A10 Determine the current motion reference point of the shooting target according to the historical motion trajectory
- the current motion reference point refers to the position point at the current time on the historical motion trajectory.
- Step A20: search the motion rule description table for the predicted motion direction mapped to the current motion reference point;
- each motion reference point in the motion rule description table has its own corresponding predicted motion direction; only once the current reference point is determined can the motion direction of the shooting target be estimated.
- Step A30 Determine the predicted movement trajectory according to the predicted movement speed and the predicted movement direction.
- the motion trajectory of the shooting target after the current motion reference point can be estimated to obtain the predicted motion trajectory.
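Steps A10-A30 amount to advancing from the current reference point along the predicted direction at the predicted speed. The Python sketch below is a hedged illustration: the discrete time step `dt` and the straight-line extrapolation are simplifying assumptions, not the patent's method:

```python
import math


def predict_trajectory(ref_point, direction, speed, steps, dt):
    """Steps A10-A30 (sketch): starting from the current motion reference
    point, advance along the normalized predicted direction at the predicted
    speed, sampling `steps` position points spaced `dt` apart in time."""
    dx, dy = direction
    norm = math.hypot(dx, dy) or 1.0   # guard against a zero direction vector
    ux, uy = dx / norm, dy / norm
    x, y = ref_point
    trajectory = []
    for _ in range(steps):
        x += ux * speed * dt
        y += uy * speed * dt
        trajectory.append((x, y))
    return trajectory
```

A curved trajectory (such as the pendulum arc in the example below) would instead update the direction at each step from the motion rule description table.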
- for example, suppose the shooting target is a swinging pendulum clock, referred to as a pendulum. The historical motion trajectory of the pendulum (the solid arc) runs from the highest point the pendulum reaches on the left to the lowest point, where the pendulum is perpendicular to the ground;
- when the lowest point of the motion trajectory is selected as the motion reference point, the predicted motion direction corresponding to the lowest point is the tangent direction of the pendulum's motion trajectory; the predicted motion speed gradually decreases from the maximum speed at the lowest point; and the predicted motion trajectory (the dashed arc) runs from the lowest point to the highest point on the right.
- step S22 includes:
- Step S221: take a preset number of position points on the predicted motion trajectory, and determine the reference position center of the figure formed by these position points on the predicted motion trajectory;
- Step S222: taking the reference position center as the geometric center, determine the predicted focus area.
- the user can adjust the predicted focus area, for example by zooming in, zooming out, or rotating it, to obtain the final predicted focus area;
- in this way, the predicted focus area is determined such that it is more likely to lie at the location to which the shooting target is about to move.
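Step S221's reference position center can plausibly be read as the centroid of the sampled trajectory points; a minimal Python sketch under that assumption (the centroid interpretation is ours, not stated explicitly by the patent):

```python
def reference_center(points):
    """Step S221 (sketch): take a preset number of position points on the
    predicted motion trajectory and return the center of the figure they
    form, computed here as the centroid (mean of the coordinates)."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    return (cx, cy)
```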
- step S222 includes:
- Step A40: take the reference position center as the geometric center of the overall area formed by the shooting targets;
- that is, first determine the reference center position of the selection frame that covers all the shooting targets.
- Step A50: determine the display size of the overall area according to the screen-occupancy ratio of each shooting target relative to the terminal screen, so as to generate the predicted focus area.
- the overall area at this time serves as the predicted focus area. The size of the predicted focus area is therefore moderate: it contains all the shooting targets without being excessively large, and its position lies on the predicted motion trajectory, which is more conducive to subsequent accurate focusing on the shooting target.
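Step A50 leaves the exact sizing rule open; one plausible reading is to size a square area from the largest screen-occupancy ratio among the targets, padded by a margin so the area covers every target without being excessively large. The formula and the `margin` parameter below are assumptions for illustration, not the patent's rule:

```python
def focus_area_size(target_ratios, screen_w, screen_h, margin=1.2):
    """Step A50 (sketch): each ratio is the fraction of the screen area a
    target occupies; size the focus area as a square whose area is the
    largest target's on-screen area scaled by a padding margin."""
    max_ratio = max(target_ratios)
    side = (max_ratio * screen_w * screen_h * margin) ** 0.5
    return side
```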
- step S30 includes:
- it is determined whether the geometric center of the overall area formed by the shooting targets coincides with the geometric center of the predicted focus area; if they coincide, it is determined that each shooting target has entered the predicted focus area.
- when there is a single shooting target, the geometric center is the target's center of gravity; when there is more than one shooting target (refer to FIG. 13), the geometric center is the center of gravity of the overall area formed by the shooting targets. With multiple shooting targets, the speed and direction of each target may differ during movement, so the center of gravity changes in real time. When the geometric center of the shooting target coincides with the geometric center of the predicted focus area, the two centers are at the same position in the preview frame of the terminal camera, from which it can be concluded that the shooting target has entered the predicted focus area.
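In practice an exact pixel coincidence of the two geometric centers is rarely observed for a moving target, so an implementation would likely test coincidence within a small tolerance; the tolerance-based check below is our assumption, not stated in the patent:

```python
def centers_coincide(target_center, area_center, tolerance=2.0):
    """Step S30 (sketch): treat the geometric centers as coinciding when
    they lie within `tolerance` pixels of each other, comparing squared
    distances to avoid an unnecessary square root."""
    dx = target_center[0] - area_center[0]
    dy = target_center[1] - area_center[1]
    return dx * dx + dy * dy <= tolerance * tolerance
```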
- the steps before step S10 include:
- Step A60 detecting whether the shooting target in the preview frame of the terminal camera is a non-stationary living body or a non-stationary object
- the shooting target refers to the non-stationary living body (a person or a living animal or plant) or object that the user wants to shoot, such as a walking person, a lively pet, a plant swaying in the wind, a moving train, or a swinging pendulum clock.
- Step A70 if yes, execute the step of acquiring and tracking the displacement information of the shooting target in the preview frame of the terminal camera.
- If the shooting target is a non-stationary living body or a non-stationary object, the displacement information of the shooting target is acquired.
- Executing the step of acquiring displacement information only when the shooting target is a non-stationary living body or a non-stationary object improves the accuracy of information acquisition.
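Step A60 can be approximated with simple frame differencing on the preview stream. A minimal Python sketch, assuming 8-bit grayscale preview frames and a known bounding box for the candidate target; both threshold values are illustrative assumptions, not parameters from the disclosure:

```python
import numpy as np

def is_non_stationary(prev_frame: np.ndarray, cur_frame: np.ndarray,
                      box: tuple, diff_threshold: int = 25,
                      moving_ratio: float = 0.05) -> bool:
    """Frame-differencing check: the target region is considered non-stationary
    when enough pixels inside its bounding box changed between two frames."""
    x, y, w, h = box
    prev_roi = prev_frame[y:y + h, x:x + w].astype(np.int16)
    cur_roi = cur_frame[y:y + h, x:x + w].astype(np.int16)
    changed = np.abs(cur_roi - prev_roi) > diff_threshold
    # Fraction of changed pixels above `moving_ratio` => treat target as moving.
    return changed.mean() > moving_ratio
```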
- step S40 includes:
- the user can manually confirm the shooting or start the shooting module to automatically shoot.
- When the shooting target enters the predicted focus area, focusing on the shooting target has been completed. At this time, the user can manually tap the confirmation box to complete the shooting of the shooting target; alternatively, the application starts the shooting module to complete the shooting automatically. The effect of automatic shooting is generally better than that of manual shooting.
- The manual mode lets the user independently choose the moment to take the photo, while automatic shooting generally produces a better result than manual shooting.
- an embodiment of the present application also proposes a terminal photographing device, and the terminal photographing device includes:
- the information acquisition module is used to acquire and track the displacement information of the shooting target in the preview frame of the terminal camera;
- the area determination module is used to determine the predicted focus area according to the displacement information, and perform focusing in the predicted focus area;
- the judging module is used to judge whether the shooting target enters the predicted focus area;
- the shooting module is used to complete the shooting of the shooting target if the shooting target enters the predicted focus area.
- an embodiment of the present application further proposes a terminal photographing device, and the terminal photographing device further includes:
- the detection module is used to detect whether the shooting target in the preview frame of the terminal camera is a non-stationary living body or a non-stationary object, and if so, to trigger the information acquisition module to perform the step of acquiring and tracking the displacement information of the shooting target in the preview frame of the terminal camera.
- the area determination module includes a prediction unit and a determination unit:
- the prediction unit is configured to obtain a predicted movement trajectory of the shooting target in the next preset unit duration according to the historical movement trajectory;
- the determining unit is configured to determine a predicted focus area according to a position point on the predicted motion track.
- the prediction unit is also used for:
- determine the predicted motion trajectory of the shooting target in the next preset unit duration according to the motion law description table, the predicted motion speed, and the historical motion trajectory.
- the prediction unit is also used for:
- determine the current motion reference point of the shooting target according to the historical movement trajectory, search the motion law description table for the predicted motion direction mapped by that reference point, and determine the predicted movement trajectory according to the predicted motion speed and the predicted motion direction.
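The prediction unit's behavior, as recited in claim 4, amounts to a table lookup followed by extrapolation. A minimal Python sketch, assuming the motion law description table maps a motion reference point label to a heading angle in radians; the table format and the constant-velocity extrapolation are assumptions for illustration:

```python
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def predict_track(current_point: Point, reference_point: str,
                  direction_table: Dict[str, float], speed: float,
                  steps: int, dt: float) -> List[Point]:
    """Look up the predicted motion direction mapped by the current motion
    reference point, then advance `steps` future position points at the
    predicted speed, `dt` seconds apart."""
    angle = direction_table[reference_point]
    dx = speed * dt * math.cos(angle)
    dy = speed * dt * math.sin(angle)
    x, y = current_point
    return [(x + dx * i, y + dy * i) for i in range(1, steps + 1)]
```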
- the determining unit is also used for:
- take a preset number of position points on the predicted motion trajectory, and determine the reference position center of the figure formed by those position points;
- determine the predicted focus area with the reference position center as its geometric center.
- the determining unit is also used for:
- take the reference position center as the geometric center of the overall area formed by the shooting targets, and determine the display size of the overall area according to the screen occupancy ratio of each shooting target relative to the terminal screen, so as to generate the predicted focus area.
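The determining unit's two steps can be sketched together: take the centroid of a preset number of trajectory points as the reference position center, then size the focus rectangle from the targets' on-screen extent. In this Python sketch, the bounding-union sizing and the margin factor are assumptions standing in for the screen-occupancy-ratio computation:

```python
from typing import List, Tuple

Point = Tuple[float, float]
Box = Tuple[float, float, float, float]  # (x, y, width, height)

def reference_position_center(track_points: List[Point], preset_count: int) -> Point:
    """Centroid of a preset number of position points taken on the
    predicted motion trajectory."""
    pts = track_points[:preset_count]
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

def predicted_focus_area(center: Point, target_boxes: List[Box],
                         screen_w: int, screen_h: int,
                         margin: float = 1.2) -> Box:
    """Focus rectangle with `center` as its geometric center, sized from the
    bounding union of all target boxes enlarged by `margin` (the margin factor
    is an assumption), and clamped to the screen size."""
    left = min(x for x, _, _, _ in target_boxes)
    top = min(y for _, y, _, _ in target_boxes)
    right = max(x + w for x, _, w, _ in target_boxes)
    bottom = max(y + h for _, y, _, h in target_boxes)
    fw = min((right - left) * margin, screen_w)
    fh = min((bottom - top) * margin, screen_h)
    cx, cy = center
    return (cx - fw / 2, cy - fh / 2, fw, fh)
```

Sizing from the targets keeps the area "moderate": large enough to contain every target, without spanning the whole screen.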
- the judgment module is also used to:
- It is determined whether the geometric center of the overall area formed by the shooting targets coincides with the geometric center of the predicted focus area; if they coincide, it is determined that the shooting targets have entered the predicted focus area.
- For the steps implemented by the functional modules of the terminal photographing device, reference may be made to the embodiments of the terminal photographing method of the present application; details are not repeated here.
- this application also provides a mobile terminal.
- the mobile terminal includes a memory 109, a processor 110, and a terminal shooting program stored on the memory 109 and running on the processor 110.
- When the terminal shooting program is executed by the processor 110, the steps of the foregoing embodiments of the terminal shooting method are implemented.
- The present application also provides a computer-readable storage medium storing one or more programs, and the one or more programs may be executed by one or more processors to implement the steps of the embodiments of the terminal shooting method described above.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
Claims (13)
- A terminal shooting method, wherein the terminal shooting method comprises: acquiring and tracking displacement information of a shooting target in a preview frame of a terminal camera; determining a predicted focus area according to the displacement information, and performing focusing in the predicted focus area; judging whether the shooting target enters the predicted focus area; and if the shooting target enters the predicted focus area, completing the shooting of the shooting target.
- The terminal shooting method according to claim 1, wherein the displacement information comprises a historical movement trajectory within a preset unit duration before the current moment, and the step of determining a predicted focus area according to the displacement information comprises: obtaining, according to the historical movement trajectory, a predicted movement trajectory of the shooting target in the next preset unit duration; and determining the predicted focus area according to position points on the predicted movement trajectory.
- The terminal shooting method according to claim 2, wherein the displacement information further comprises an object attribute of the shooting target, and the step of obtaining, according to the historical movement trajectory, the predicted movement trajectory of the shooting target in the next preset unit duration comprises: determining the object attribute of the shooting target, and determining, according to the object attribute, a motion law description table and a predicted motion speed of the shooting target, wherein the motion law description table comprises motion reference points and the predicted motion directions mapped by the motion reference points; and determining, according to the motion law description table, the predicted motion speed, and the historical movement trajectory, the predicted movement trajectory of the shooting target in the next preset unit duration.
- The terminal shooting method according to claim 3, wherein the step of determining the predicted movement trajectory according to the motion law description table, the predicted motion speed, and the historical movement trajectory comprises: determining a current motion reference point of the shooting target according to the historical movement trajectory; searching the motion law description table for the predicted motion direction mapped by the current motion reference point; and determining the predicted movement trajectory according to the predicted motion speed and the predicted motion direction.
- The terminal shooting method according to claim 2, wherein the step of determining the predicted focus area according to the position points on the predicted movement trajectory comprises: taking a preset number of position points on the predicted movement trajectory, and determining a reference position center of the figure formed by the position points on the predicted movement trajectory; and determining the predicted focus area with the reference position center as a geometric center.
- The terminal shooting method according to claim 5, wherein the step of determining the predicted focus area with the reference position center as a geometric center comprises: taking the reference position center as the geometric center of the overall area formed by the shooting targets; and determining the display size of the overall area according to the screen occupancy ratio of each shooting target relative to the terminal screen, so as to generate the predicted focus area.
- The terminal shooting method according to claim 1, wherein the step of judging whether the shooting target enters the predicted focus area comprises: judging whether the geometric center of the overall area formed by the shooting targets coincides with the geometric center of the predicted focus area, and if they coincide, determining that the shooting targets have entered the predicted focus area.
- The terminal shooting method according to claim 1, wherein before the step of acquiring and tracking the displacement information of the shooting target in the preview frame of the terminal camera, the method comprises: detecting whether the shooting target in the preview frame of the terminal camera is a non-stationary living body or a non-stationary object; and if so, executing the step of acquiring and tracking the displacement information of the shooting target in the preview frame of the terminal camera.
- The terminal shooting method according to claim 1, wherein the step of completing the shooting of the shooting target if the shooting target enters the predicted focus area comprises: if the shooting target enters the predicted focus area, allowing the user to manually confirm the shooting or starting the shooting module to shoot automatically.
- A terminal shooting device, wherein the terminal shooting device comprises: an information acquisition module, configured to acquire and track displacement information of a shooting target in a preview frame of a terminal camera; an area determination module, configured to determine a predicted focus area according to the displacement information, and perform focusing in the predicted focus area; a judging module, configured to judge whether the shooting target enters the predicted focus area; and a shooting module, configured to complete the shooting of the shooting target if the shooting target enters the predicted focus area.
- The terminal shooting device according to claim 10, wherein the area determination module of the terminal shooting device comprises a prediction unit and a determination unit: the prediction unit is configured to obtain, according to the historical movement trajectory, a predicted movement trajectory of the shooting target in the next preset unit duration; and the determination unit is configured to determine the predicted focus area according to position points on the predicted movement trajectory.
- A mobile terminal, wherein the mobile terminal comprises: a memory, a processor, and a terminal shooting program stored on the memory and executable on the processor, and when the terminal shooting program is executed by the processor, the following steps are implemented: acquiring and tracking displacement information of a shooting target in a preview frame of a terminal camera; determining a predicted focus area according to the displacement information, and performing focusing in the predicted focus area; judging whether the shooting target enters the predicted focus area; and if the shooting target enters the predicted focus area, completing the shooting of the shooting target.
- A storage medium, wherein a terminal shooting program is stored on the storage medium, and when the terminal shooting program is executed by a processor, the following steps are implemented: acquiring and tracking displacement information of a shooting target in a preview frame of a terminal camera; determining a predicted focus area according to the displacement information, and performing focusing in the predicted focus area; judging whether the shooting target enters the predicted focus area; and if the shooting target enters the predicted focus area, completing the shooting of the shooting target.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910873337.6 | 2019-09-12 | ||
CN201910873337.6A CN110505408B (zh) | 2019-09-12 | 2019-09-12 | 终端拍摄方法、装置、移动终端及可读存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021047070A1 true WO2021047070A1 (zh) | 2021-03-18 |
Family
ID=68591921
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/122776 WO2021047070A1 (zh) | 2019-09-12 | 2019-12-03 | 终端拍摄方法、装置、移动终端及可读存储介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110505408B (zh) |
WO (1) | WO2021047070A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115546111A (zh) * | 2022-09-13 | 2022-12-30 | 武汉海微科技有限公司 | 曲面屏检测方法、装置、设备及存储介质 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110505408B (zh) * | 2019-09-12 | 2021-07-27 | 深圳传音控股股份有限公司 | 终端拍摄方法、装置、移动终端及可读存储介质 |
CN110933303B (zh) * | 2019-11-27 | 2021-05-18 | 维沃移动通信(杭州)有限公司 | 拍照方法及电子设备 |
CN112312005A (zh) * | 2020-02-12 | 2021-02-02 | 北京字节跳动网络技术有限公司 | 图像获取方法及装置 |
WO2021258321A1 (zh) * | 2020-06-24 | 2021-12-30 | 华为技术有限公司 | 一种图像获取方法以及装置 |
CN114979455A (zh) * | 2021-02-25 | 2022-08-30 | 北京小米移动软件有限公司 | 拍摄方法、装置以及存储介质 |
CN113724338B (zh) * | 2021-08-31 | 2024-05-03 | 上海西井科技股份有限公司 | 基于球台拍摄移动对象的方法、系统、设备及存储介质 |
CN113780214B (zh) * | 2021-09-16 | 2024-04-19 | 上海西井科技股份有限公司 | 基于人群进行图像识别的方法、系统、设备及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105554367A (zh) * | 2015-09-30 | 2016-05-04 | 宇龙计算机通信科技(深圳)有限公司 | 一种运动拍摄方法及移动终端 |
JP2017103601A (ja) * | 2015-12-01 | 2017-06-08 | 株式会社ニコン | 焦点検出装置およびカメラ |
CN106961552A (zh) * | 2017-03-27 | 2017-07-18 | 联想(北京)有限公司 | 一种对焦控制方法及电子设备 |
US20180007254A1 (en) * | 2016-06-30 | 2018-01-04 | Canon Kabushiki Kaisha | Focus adjusting apparatus, focus adjusting method, and image capturing apparatus |
CN110505408A (zh) * | 2019-09-12 | 2019-11-26 | 深圳传音控股股份有限公司 | 终端拍摄方法、装置、移动终端及可读存储介质 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101387812B (zh) * | 2007-09-13 | 2011-03-23 | 鸿富锦精密工业(深圳)有限公司 | 相机自动对焦系统及方法 |
CN101247286B (zh) * | 2008-03-21 | 2011-01-05 | 中兴通讯股份有限公司 | 一种对视频分发系统进行服务质量检测方法和系统 |
CN102056010A (zh) * | 2009-11-02 | 2011-05-11 | 鸿富锦精密工业(深圳)有限公司 | 笔记本电脑照相机功能自动测试系统及方法 |
CN103369227A (zh) * | 2012-03-26 | 2013-10-23 | 联想(北京)有限公司 | 一种运动对象的拍照方法及电子设备 |
CN103929596B (zh) * | 2014-04-30 | 2016-09-14 | 努比亚技术有限公司 | 引导拍摄构图的方法及装置 |
CN104125433A (zh) * | 2014-07-30 | 2014-10-29 | 西安冉科信息技术有限公司 | 基于多球机联动结构的视频运动目标监控方法 |
CN105827928A (zh) * | 2015-01-05 | 2016-08-03 | 中兴通讯股份有限公司 | 一种选择对焦区域的方法及装置 |
CN106060373B (zh) * | 2015-04-03 | 2019-12-20 | 佳能株式会社 | 焦点检测装置及其控制方法 |
US10009536B2 (en) * | 2016-06-12 | 2018-06-26 | Apple Inc. | Applying a simulated optical effect based on data received from multiple camera sensors |
CN106357973A (zh) * | 2016-08-26 | 2017-01-25 | 深圳市金立通信设备有限公司 | 一种聚焦的方法及终端 |
JPWO2018062368A1 (ja) * | 2016-09-30 | 2019-08-15 | 株式会社ニコン | 撮像装置および撮像システム |
CN106454135B (zh) * | 2016-11-29 | 2019-11-01 | 维沃移动通信有限公司 | 一种拍照提醒方法及移动终端 |
CN107124556B (zh) * | 2017-05-31 | 2021-03-02 | Oppo广东移动通信有限公司 | 对焦方法、装置、计算机可读存储介质和移动终端 |
- 2019
- 2019-09-12 CN CN201910873337.6A patent/CN110505408B/zh active Active
- 2019-12-03 WO PCT/CN2019/122776 patent/WO2021047070A1/zh active Application Filing
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115546111A (zh) * | 2022-09-13 | 2022-12-30 | 武汉海微科技有限公司 | 曲面屏检测方法、装置、设备及存储介质 |
CN115546111B (zh) * | 2022-09-13 | 2023-12-05 | 武汉海微科技有限公司 | 曲面屏检测方法、装置、设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN110505408B (zh) | 2021-07-27 |
CN110505408A (zh) | 2019-11-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021047070A1 (zh) | 终端拍摄方法、装置、移动终端及可读存储介质 | |
WO2016208797A1 (ko) | 헤드셋 및 그 제어 방법 | |
WO2015160193A1 (en) | Wearable device, master device operating with the wearable device, and control method for wearable device | |
WO2012091185A1 (en) | Display device and method of providing feedback for gestures thereof | |
WO2014038916A1 (en) | System and method of controlling external apparatus connected with device | |
WO2018070624A2 (en) | Mobile terminal and control method thereof | |
WO2013042804A1 (en) | Mobile terminal, method for controlling of the mobile terminal and system | |
WO2018124334A1 (ko) | 전자장치 | |
WO2015178561A1 (ko) | 이동 단말기 및 그의 동적 프레임 조절 방법 | |
WO2015180013A1 (zh) | 一种终端的触摸操作方法及装置 | |
WO2020022780A1 (en) | Method and apparatus for establishing device connection | |
WO2020013676A1 (en) | Electronic device and operating method of controlling brightness of light source | |
EP3808097A1 (en) | Method and apparatus for establishing device connection | |
WO2018090822A1 (zh) | 基于智能手表的移动终端相机控制方法及控制系统 | |
WO2016182090A1 (ko) | 안경형 단말기 및 이의 제어방법 | |
WO2018049715A1 (zh) | 一种信息处理方法及其相关设备 | |
WO2015190668A1 (ko) | 이동 단말기 | |
WO2018135675A1 (ko) | 전자장치 | |
WO2020153766A1 (en) | Method for displaying visual information associated with voice input and electronic device supporting the same | |
WO2020171342A1 (ko) | 외부 객체의 정보에 기반하여 시각화된 인공 지능 서비스를 제공하는 전자 장치 및 전자 장치의 동작 방법 | |
WO2021080290A1 (en) | Electronic apparatus and control method thereof | |
WO2016024707A1 (ko) | 이동 단말기 및 그 제어 방법 | |
WO2018131747A1 (ko) | 이동 단말기 및 그 제어 방법 | |
WO2017119536A1 (ko) | 모바일 디바이스 및 모바일 디바이스의 제어방법 | |
WO2020013363A1 (ko) | 이동단말기 및 그 제어 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19945085 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 19945085 Country of ref document: EP Kind code of ref document: A1 |
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26/08/2022) |