WO2023284218A1 - Camera control method, mobile terminal and storage medium - Google Patents


Info

Publication number
WO2023284218A1
WO2023284218A1 (PCT/CN2021/132163, CN2021132163W)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
sharpness
image
state
camera
Prior art date
Application number
PCT/CN2021/132163
Other languages
English (en)
French (fr)
Inventor
黄朝远
Original Assignee
深圳传音控股股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳传音控股股份有限公司
Publication of WO2023284218A1


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/633: Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera

Definitions

  • the present application relates to the field of camera technology, and in particular to a camera control method, a mobile terminal and a storage medium.
  • When a user uses a mobile terminal equipped with a camera to take pictures, the user often can only judge the clarity of the captured image through experience. For example, the user observes the sharpness of the image on the screen of the mobile phone while adjusting the object distance between the phone and the subject, and captures an image when the sharpness appears high.
  • In implementing the present application, the applicant found at least the following problem: because the user judges the clarity of the image through experience, the clarity of the captured image can be low, resulting in a poor shooting effect.
  • the present application provides a camera control method, a mobile terminal and a storage medium, which are used to solve the technical problem of poor shooting effect.
  • the present application provides a camera control method applied to a mobile terminal, including:
  • the camera lens of the camera device is a fixed-focus lens
  • the camera is controlled to take images of the target object.
  • step S13 includes:
  • the sharpness state is an increased state, a reduced state, or a state in the highest range;
  • the camera device is controlled to capture an image of the target object.
  • controlling the imaging device to capture images of the target object according to the clarity state of the imaging device includes:
  • according to the sharpness state of the imaging device, it is judged whether there is a target image in the at least two images, where the sharpness of the target image is a preset sharpness;
  • judging whether there is a target image in the at least two images according to the sharpness state of the imaging device includes:
  • if the sharpness state of the imaging device is a reduced state, and the historical sharpness state of the imaging device includes the highest range state, it is determined that the target image exists in the at least two images.
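For illustration only, the judgment described in the bullet above can be sketched in Python. The function and state names (`has_target_image`, `"reduced"`, `"highest_range"`) are hypothetical labels chosen for this sketch, not identifiers from the application:

```python
def has_target_image(current_state: str, history: list[str]) -> bool:
    """Return True when a peak-sharpness (target) image should exist among
    the buffered frames: sharpness is now decreasing after having passed
    through the highest range, so the sharpest frame has already been seen.
    """
    return current_state == "reduced" and "highest_range" in history
```

For example, if the sharpness history was increasing, then in the highest range, and the state is now reduced, the peak has been passed and a target image exists in the buffer.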
  • the method also includes:
  • Indicating information corresponding to the sharpness state of the imaging device is displayed on the shooting interface of the imaging device.
  • the indication information corresponding to the sharpness state of the imaging device includes at least one of the following: if the sharpness state of the imaging device is an increased state, the indication information is determined to be an upward arrow;
  • if the sharpness state of the imaging device is a reduced state, the indication information is determined to be a downward arrow;
  • if the sharpness state of the imaging device is in the highest range state, the indication information is determined to be a shooting frame.
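The state-to-indicator correspondence above amounts to a simple lookup. The sketch below uses illustrative state names and indicator labels that are not defined by the application:

```python
# Hypothetical mapping from sharpness state to the indication drawn on the
# shooting interface: up arrow, down arrow, or a shooting frame.
INDICATION = {
    "increased": "up_arrow",            # sharpness is rising
    "reduced": "down_arrow",            # sharpness is falling
    "highest_range": "shooting_frame",  # sharpness is in the highest range
}

def indication_for(state: str) -> str:
    """Return the indication to display for a given sharpness state."""
    return INDICATION[state]
```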
  • the method also includes:
  • the captured image is processed according to the definition of the target image and the definition of the captured image.
  • processing the captured image according to the clarity of the target image and the clarity of the captured image includes:
  • the captured image is stored in the imaging device.
  • the present application also provides a camera control device, including a first acquisition module, a second acquisition module and a control module, wherein:
  • the first acquiring module is configured to acquire at least two images of the target object continuously captured by the imaging device, and optionally, the imaging lens of the imaging device is a fixed-focus lens;
  • the second acquiring module is configured to acquire the clarity of the at least two images
  • the control module is configured to control the camera device to take images of the target object according to the clarity of the at least two images.
  • control module is specifically used for:
  • the sharpness state is an increased state, a reduced state, or a state in the highest range;
  • the camera device is controlled to capture an image of the target object.
  • control module is specifically used for:
  • according to the sharpness state of the imaging device, it is judged whether there is a target image in the at least two images, where the sharpness of the target image is a preset sharpness;
  • control module is specifically used for:
  • if the sharpness state of the imaging device is a reduced state, and the historical sharpness state of the imaging device includes the highest range state, it is determined that the target image exists in the at least two images.
  • the camera control device further includes a display module, and the display module is used for:
  • Indicating information corresponding to the sharpness state of the imaging device is displayed on the shooting interface of the imaging device.
  • the indication information corresponding to the sharpness state of the imaging device includes at least one of the following: if the sharpness state of the imaging device is an increased state, the indication information is determined to be an upward arrow;
  • if the sharpness state of the imaging device is a reduced state, the indication information is determined to be a downward arrow;
  • if the sharpness state of the imaging device is in the highest range state, the indication information is determined to be a shooting frame.
  • the camera control device further includes a third acquisition module, and the third acquisition module is used for:
  • the captured image is processed according to the definition of the target image and the definition of the captured image.
  • the third acquiring module is specifically configured to:
  • the captured image is stored in the imaging device.
  • the present application also provides a mobile terminal, including: a memory and a processor, wherein a camera control program is stored in the memory, and when the camera control program is executed by the processor, the steps of the above method are implemented.
  • the present application also provides a computer storage medium, the computer storage medium stores a computer program, and when the computer program is executed by a processor, the steps of the above method are realized.
  • the present application provides a camera control method, a mobile terminal and a storage medium, which acquire at least two images of the target object continuously captured by the camera device, obtain the sharpness of the at least two images, and control the camera device to capture images of the target object according to the sharpness of the at least two images.
  • In this way, the imaging device can accurately determine the sharpness of the multiple continuously acquired images and control itself to capture images of the target object according to that sharpness, so that the captured images have higher definition, thereby improving the shooting effect.
  • FIG. 1 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present application
  • FIG. 2 is a system architecture diagram of a communication network provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of a camera control method provided in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the relationship between image clarity and object distance provided by the embodiment of the present application.
  • FIG. 6 is a schematic diagram, provided by an embodiment of the present application, of a sharpness state being in the highest range state.
  • FIG. 7 is a schematic flow chart of displaying the sharpness state of an imaging device provided by an embodiment of the present application.
  • FIG. 8A is a schematic diagram of a process for displaying indication information provided by an embodiment of the present application.
  • FIG. 8B is a schematic diagram of another process for displaying indication information provided by the embodiment of the present application.
  • FIG. 8C is a schematic diagram of another process of displaying indication information provided by the embodiment of the present application.
  • FIG. 9 is a schematic diagram of a process for processing captured images provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of acquiring a captured image provided by an embodiment of the present application.
  • FIG. 11 is a process schematic diagram of a camera control method provided in an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of a camera control device provided in an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of another camera control device provided by an embodiment of the present application.
  • Although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of this document, first information may also be called second information, and similarly, second information may also be called first information.
  • the singular forms "a”, “an” and “the” are intended to include the plural forms as well, unless the context indicates otherwise.
  • "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C". Exceptions to this definition will arise only when combinations of elements, functions, steps or operations are inherently mutually exclusive in some way.
  • The word "if" as used herein may be interpreted as "at the time of", "when", "in response to determining" or "in response to detecting".
  • Depending on the context, the phrases "if determined" or "if detected (the stated condition or event)" can be interpreted as "when determined", "in response to the determination", "when (the stated condition or event) is detected" or "in response to detection of (the stated condition or event)".
  • Where step codes such as S11 and S12 are used, the purpose is to express the corresponding content more clearly and concisely; they do not constitute a substantive limitation on the order. In specific implementations, S12 may be executed before S11, etc., but all of these cases should fall within the scope of protection of this application.
  • Mobile terminals may be implemented in various forms.
  • The mobile terminals described in this application may include mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (PDA), portable media players (PMP), navigation devices, wearable devices, smart bracelets, pedometers and other mobile terminals, as well as fixed terminals such as digital TVs and desktop computers.
  • a mobile terminal will be taken as an example, and those skilled in the art will understand that, in addition to elements specially used for mobile purposes, the configurations according to the embodiments of the present application can also be applied to fixed-type terminals.
  • FIG. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present application.
  • The mobile terminal 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111.
  • The radio frequency unit 101 can be used to send and receive information, or to receive and send signals during a call. Specifically, after downlink information from the base station is received, it is processed by the processor 110; in addition, uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through wireless communication.
  • the above wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System of Mobile communication, Global System for Mobile Communications), GPRS (General Packet Radio Service, General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000 , Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access, Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access, Time Division Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution, frequency division duplex long-term evolution) and TDD-LTE (Time Division Duplexing-Long Term Evolution, time-division duplex long-term evolution), etc.
  • WiFi is a short-distance wireless transmission technology.
  • the mobile terminal can help users send and receive emails, browse web pages, and access streaming media through the WiFi module 102, which provides users with wireless broadband Internet access.
  • Fig. 1 shows the WiFi module 102, it can be understood that it is not an essential component of the mobile terminal, and can be completely omitted as required without changing the essence of the application.
  • The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound when the mobile terminal 100 is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like.
  • the audio output unit 103 can also provide audio output related to a specific function performed by the mobile terminal 100 (eg, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 may include a speaker, a buzzer, and the like.
  • the A/V input unit 104 is used to receive audio or video signals.
  • The A/V input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042, and the graphics processing unit 1041 is used to process the image data of still pictures or video.
  • the processed image frames may be displayed on the display unit 106 .
  • the image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage media) or sent via the radio frequency unit 101 or the WiFi module 102 .
  • The microphone 1042 can receive sound (audio data) in operating modes such as a phone call mode, a recording mode and a voice recognition mode, and can process such sound into audio data.
  • The processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 for output in the case of a phone call mode.
  • the microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the process of receiving and transmitting audio signals.
  • the mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • The ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 moves to the ear.
  • As one kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes), and can detect the magnitude and direction of gravity when stationary; it can be used for applications that identify the posture of the mobile phone (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition related functions (such as a pedometer and tapping). Other sensors that can also be configured on the mobile phone, such as fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers and infrared sensors, will not be described in detail here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
  • the user input unit 107 can be used to receive input numbers or character information, and generate key signal input related to user settings and function control of the mobile terminal.
  • the user input unit 107 may include a touch panel 1071 and other input devices 1072 .
  • The touch panel 1071, also referred to as a touch screen, can collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive the corresponding connection device according to a preset program.
  • the touch panel 1071 may include two parts, a touch detection device and a touch controller.
  • The touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110.
  • the touch panel 1071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072 .
  • Other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons and a switch button), a trackball, a mouse, a joystick, etc., which are not specifically limited here.
  • the touch panel 1071 may cover the display panel 1061.
  • When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
  • Although in FIG. 1 the touch panel 1071 and the display panel 1061 are shown as two independent components realizing the input and output functions of the mobile terminal, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions of the mobile terminal; this is not specifically limited here.
  • the interface unit 108 serves as an interface through which at least one external device can be connected with the mobile terminal 100 .
  • an external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) ports, video I/O ports, headphone ports, and more.
  • The interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and to transfer data between the mobile terminal 100 and external devices.
  • the memory 109 can be used to store software programs as well as various data.
  • the memory 109 can mainly include a storage program area and a storage data area.
  • The storage program area can store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like;
  • the storage data area can store data (such as audio data, a phone book, etc.) created according to the use of the mobile phone.
  • The memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The processor 110 is the control center of the mobile terminal; it connects the various parts of the entire mobile terminal using various interfaces and lines, and executes the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, so as to monitor the mobile terminal as a whole.
  • the processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor.
  • the application processor mainly processes operating systems, user interfaces, and application programs, etc.
  • The modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may not be integrated into the processor 110.
  • the mobile terminal 100 can also include a power supply 111 (such as a battery) for supplying power to various components.
  • The power supply 111 can be logically connected to the processor 110 through a power management system, so as to manage functions such as charging, discharging and power consumption through the power management system.
  • the mobile terminal 100 may also include a Bluetooth module, etc., which will not be repeated here.
  • the following describes the communication network system on which the mobile terminal of the present application is based.
  • FIG. 2 is an architecture diagram of a communication network system provided by an embodiment of the present application. This communication network system is an LTE system of universal mobile communication technology, and the LTE system includes, communicatively connected in sequence, a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203 and an operator's IP service 204.
  • the UE 201 may be the above-mentioned terminal 100, which will not be repeated here.
  • E-UTRAN 202 includes eNodeB 2021, other eNodeBs 2022, and so on.
  • the eNodeB 2021 can be connected to other eNodeB 2022 through a backhaul (for example, X2 interface), the eNodeB 2021 is connected to the EPC 203 , and the eNodeB 2021 can provide access from the UE 201 to the EPC 203 .
  • EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, etc.
  • The MME 2031 is a control node that processes signaling between the UE 201 and the EPC 203, and provides bearer and connection management.
  • The HSS 2032 is used to provide registers for managing functions such as the home location register (not shown in the figure), and to save user-specific information about service features, data rates, and the like.
  • The PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources; it selects and provides available policy and charging control decisions for the policy and charging enforcement function unit (not shown).
  • the IP service 204 may include Internet, Intranet, IMS (IP Multimedia Subsystem, IP Multimedia Subsystem) or other IP services.
  • Although the LTE system is used as an example above, those skilled in the art should know that this application is applicable not only to the LTE system, but also to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA and future new network systems, which are not limited here.
  • In some terminal devices, the camera lens is a fixed-focus lens to reduce the cost of the terminal device, and a terminal device equipped with a fixed-focus lens does not have an auto-focus function.
  • the user judges the sharpness of the captured image through experience. For example, the user can adjust the object distance between the mobile phone and the subject to change the sharpness of the image on the screen. When the user observes that the sharpness is appropriate, the user clicks the camera button to obtain the image.
  • However, because the user judges the sharpness of the image through experience, the sharpness of the captured image may be low (during macro shooting, the image the user perceives as sharp may not be the sharpest one), which in turn leads to a poor shooting effect.
  • an embodiment of the present application provides a camera control method, which acquires at least two images continuously collected by the camera device for the target object.
  • The camera lens of the camera device is a fixed-focus lens. The method obtains the sharpness of the at least two images, and determines the current sharpness state of the imaging device according to the sharpness of the at least two images.
  • the sharpness state is an increased state, a reduced state, or a state in the highest range.
  • the camera device is controlled to capture images of the target object.
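The determination of the current sharpness state from consecutive sharpness values can be sketched as below. The numeric bounds of the "highest range" are illustrative assumptions; the application does not specify them, and the function name is hypothetical:

```python
def sharpness_state(previous: float, current: float,
                    highest_low: float = 0.9, highest_high: float = 1.0) -> str:
    """Classify the current sharpness state from two consecutive readings.

    The [highest_low, highest_high] band stands in for the "highest range";
    its bounds are placeholders, not values from the application.
    """
    if highest_low <= current <= highest_high:
        return "highest_range"
    return "increased" if current > previous else "reduced"
```

With normalized sharpness values, two readings of 0.4 then 0.6 would classify as an increased state, while a reading inside the band classifies as the highest range state regardless of trend.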
  • FIG. 3 is a schematic diagram of an application scenario provided by an embodiment of the present application. See Figure 3, including camera equipment and object of interest.
  • the shooting function of the camera equipment has been turned on.
  • the image of the target object displayed on the screen of the imaging device is relatively blurred.
  • the imaging device moves towards the target object, the object distance between the imaging device and the target object changes, and the definition of the image of the target object displayed on the screen of the imaging device changes.
  • the camera device acquires multiple images of the target object, and determines the image with the highest definition among the multiple images of the target object as the captured image. In this way, when the lens of the imaging device is a fixed-focus lens, the imaging device can determine the captured image according to the definition of the image, thereby improving the shooting effect.
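Selecting the highest-definition image among the buffered frames, as described above, reduces to a maximum over (image, sharpness) pairs. This is a minimal sketch with a hypothetical function name; the frames are assumed to have been scored already:

```python
def pick_sharpest(frames):
    """From a list of (image, sharpness) pairs buffered while the user moves
    the device toward the target object, return the image with the highest
    sharpness as the captured image."""
    image, _ = max(frames, key=lambda pair: pair[1])
    return image
```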
  • FIG. 4 is a schematic flowchart of a camera control method provided by an embodiment of the present application. See Figure 4, the method can include:
  • the executor of the embodiment of the present application may be a mobile terminal, or may be a camera control device set in the mobile terminal.
  • the camera control device may be implemented by software, or by a combination of software and hardware.
  • the camera device is any device with a camera function.
  • the imaging device may be a camera, a mobile phone, a tablet computer, and the like.
  • the camera lens of the camera equipment is a fixed focus lens.
  • the fixed-focus lens is a lens with only one fixed focal length. In an actual application process, if the lens of the camera device is a fixed-focus lens, the camera device cannot automatically focus.
  • the target object is an object to be photographed by the imaging device. For example, if the imaging device photographs a water cup, the target object is a water cup; if the imaging device photographs a table, the target object is a table.
  • When the camera device turns on the shooting function, the camera device can acquire images including the target object in real time. For example, when the mobile phone turns on the shooting function, the mobile phone can obtain each frame of image captured by the lens. Optionally, when the camera device turns on the shooting function, the camera device may continuously acquire multiple images including the target object.
  • Sharpness is used to indicate the size of the focus value of the image. For example, the larger the focus value of the image, the higher the definition of the image, and the smaller the focus value of the image, the lower the definition of the image.
  • the sharpness of at least two images can be obtained through a preset model.
  • the preset model is learned from multiple groups of samples, and each group of samples includes a sample image and a sample definition.
  • the sets of samples may be pre-labeled. For example, for sample image 1, the sample sharpness 1 corresponding to sample image 1 is acquired to obtain one group of samples, which includes sample image 1 and sample sharpness 1. In this way, multiple groups of samples can be obtained, for example as shown in Table 1:

| Group | Sample image | Sample sharpness |
| --- | --- | --- |
| First group of samples | Sample image 1 | Sample sharpness 1 |
| Second group of samples | Sample image 2 | Sample sharpness 2 |
| Third group of samples | Sample image 3 | Sample sharpness 3 |
| ... | ... | ... |
  • Table 1 is only an illustration of multiple groups of samples, and is not a limitation on multiple groups of samples.
  • if the image input into the preset model is the same as sample image 1, the sharpness corresponding to the image is sample sharpness 1; if the input image is the same as sample image 2, the corresponding sharpness is sample sharpness 2; if the input image is the same as sample image 3, the corresponding sharpness is sample sharpness 3.
  • the sharpness corresponding to the image is displayed in real time on the shooting page of the camera device.
  • the sharpness corresponding to the image is displayed in a preset area of the shooting page, so that the user can accurately determine the corresponding sharpness of the image when the image is currently captured.
  • the clarity of the image captured by the imaging device is related to the object distance between the imaging device and the target object.
  • FIG. 5 is a schematic diagram of a relationship between image clarity and object distance provided by an embodiment of the present application. See Figure 5 for a coordinate system including the relationship between sharpness and object distance.
  • the horizontal axis of the coordinate system is the object distance between the imaging device and the target object
  • the vertical axis of the coordinate system is the corresponding definition of the image acquired by the imaging device.
  • the definition of the image captured by the camera device first increases and then decreases, and the highest point in the coordinate system is the highest definition of the image captured by the camera device.
  • the camera device can acquire each frame of image captured by the lens, and determine the corresponding definition of each frame of image.
  • the imaging device may be controlled to capture images of the target object according to the following feasible implementation manner: determining the sharpness status of the imaging device according to the sharpness of at least two images.
  • the sharpness state is one of an increased state, a reduced state, and a state in the highest range.
  • the increasing state is used to indicate that the definition of the image currently captured by the imaging device is increasing. For example, when the user moves the imaging device, if the sharpness of the image displayed on the screen of the imaging device is increasing, the current sharpness state of the imaging device is an increasing state.
  • the reduced state is used to indicate that the sharpness of the image currently captured by the imaging device is decreasing. For example, when the user moves the imaging device, the object distance between the imaging device and the photographed target object changes; if the sharpness of the image displayed on the screen of the imaging device is decreasing, the current sharpness state of the imaging device is a reduced state.
  • the state of being in the highest range is used to indicate that the sharpness of the image currently captured by the imaging device is within the preset interval of the highest sharpness.
  • the preset interval is a preset value. For example, if the highest sharpness of the image is 100 and the preset interval is 10, then when the sharpness of the image captured by the imaging device is between 90 and 100, the imaging device is in the highest range state. For example, when the user moves the imaging device, the object distance between it and the photographed target object changes; if the sharpness of the image displayed on the screen is within the preset interval of the highest sharpness, the current sharpness state of the imaging device is in the highest range state.
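The "preset interval" check above can be sketched directly; the numbers come from the example in the text (highest sharpness 100, interval 10):

```python
def in_highest_range(sharpness, highest, interval=10):
    """True when sharpness lies within `interval` below the highest sharpness."""
    return highest - interval <= sharpness <= highest

print(in_highest_range(95, 100))  # True  (within 90-100)
print(in_highest_range(80, 100))  # False (below the interval)
```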
  • the imaging device may be in multiple sharpness states at the same time.
  • for example, the imaging device may be in the increased state and in the highest range state at the same time, or in the reduced state and in the highest range state at the same time.
  • Case 1: the imaging device moves in a single direction.
  • the sharpness state of the imaging device is an increased state. For example, when the imaging device moves towards the target object, the sharpness of the four images continuously acquired by the imaging device increases sequentially, which means that the current sharpness state of the imaging device is an increasing state.
  • the sharpness state of the imaging device is a reduced state. For example, when the imaging device moves towards the target object, the sharpness of the four images continuously acquired by the imaging device decreases successively, which means that the current sharpness state of the imaging device is in a reduced state.
  • if the sharpness of the image captured by the imaging device first rises and then falls, then while the imaging device moves toward the target object, the sharpness state corresponding to sharpness values within the preset interval of the highest sharpness is the state in the highest range. For example, when the imaging device moves away from or toward the target object, if the sharpness of the first image is smaller than that of the second image, the sharpness state of the imaging device when acquiring the first image and the second image is the increased state.
  • if, when the imaging device acquires the third image and the fourth image, its sharpness state is the reduced state, it can then be determined that the third image has the highest sharpness, and the sharpness state corresponding to sharpness values within the preset interval of the third image's sharpness is in the highest range state.
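Case 1 above can be sketched as follows, under the assumption that the device moves in a single direction: the state is "rising" while consecutive sharpness values increase, "falling" once they decrease, and the frame just before the first decrease is treated as the peak (the state names are illustrative):

```python
def sharpness_states(values):
    """values: sharpness of consecutively captured frames.
    Returns the rising/falling transitions and the index of the peak frame."""
    states, peak = [], None
    for i in range(1, len(values)):
        if values[i] > values[i - 1]:
            states.append("rising")
        else:
            states.append("falling")
            # first rising->falling transition marks the sharpest frame
            if peak is None and states[:-1] and states[-2] == "rising":
                peak = i - 1
    return states, peak

states, peak = sharpness_states([40, 60, 85, 70, 55])
print(states)  # ['rising', 'rising', 'falling', 'falling']
print(peak)    # 2  -> the third frame has the highest sharpness
```

As the text notes for Case 2, this first rising-to-falling transition is only conclusive when the motion direction does not reverse.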
  • Case 2: the imaging device moves in multiple directions.
  • when the imaging device moves in multiple directions, the sharpness keeps increasing while the device moves toward the target object and decreases once it moves away, or the sharpness keeps decreasing while the device moves toward the target object and increases once it moves away. Therefore, when the imaging device moves in multiple directions, the moment at which its sharpness state changes from the increased state to the reduced state does not by itself establish that the sharpness state is in the highest range state.
  • the sharpness state of the imaging device is in the highest range state.
  • FIG. 6 is a schematic diagram of a clarity state provided by an embodiment of the present application as being in the highest range state. See Figure 6, including the coordinate system for the relationship between sharpness and object distance.
  • the horizontal axis of the coordinate system is the object distance between the imaging device and the target object, and the vertical axis of the coordinate system is the corresponding definition of the image acquired by the imaging device.
  • as the object distance changes, the sharpness of the image obtained by the imaging device increases, and after the sharpness reaches the highest point, it decreases.
  • the sharpness state of the camera equipment is in the highest range state.
  • the sharpness of the image captured by the imaging device is higher.
  • the camera device is controlled to shoot the target object.
  • the imaging device may be controlled to take images of the target object according to the following feasible implementation manner: according to the current sharpness state of the imaging device, it is judged whether there is a target image in at least two images.
  • the sharpness of the target image is the preset sharpness.
  • the preset sharpness can be the highest sharpness.
  • that is, the sharpness of the target image is the highest sharpness that can be captured by the imaging device.
  • the historical sharpness state is the sharpness state of the imaging device before the current moment.
  • the sharpness states of the imaging device before the current moment include an increased state and a state in the highest range,
  • that is, the historical sharpness states include an increased state and a state in the highest range.
  • if the current sharpness state of the imaging device is a reduced state, the sharpness of the image acquired at the current moment is lower than that of the image acquired at the previous moment; if the historical sharpness states also include a state in the highest range, the multiple images acquired by the imaging device during this shooting process include the target image with the highest sharpness.
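The judgment above reduces to a simple predicate; a minimal sketch, with the state labels chosen here for illustration:

```python
def target_image_exists(current_state, history):
    """The sharpest (target) image exists once sharpness is falling and the
    history of states for this shooting pass already reached the highest range."""
    return current_state == "reduced" and "in_highest_range" in history

print(target_image_exists("reduced", ["increased", "in_highest_range"]))  # True
print(target_image_exists("reduced", ["increased"]))                      # False
```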
  • the target image is stored in the imaging device.
  • the imaging device acquires multiple images during this shooting, and if the multiple images include the target image with the highest definition, the target image is stored in the imaging device.
  • An embodiment of the present application provides a camera control method.
  • the imaging lens of the imaging device is a fixed-focus lens; at least two images of the target object continuously collected by the imaging device are acquired, and the sharpness of the at least two images is obtained according to a preset model,
  • and the current sharpness state of the imaging device is determined based on the sharpness of the at least two images.
  • the sharpness state is one of an increased state, a reduced state, and a state in the highest range; according to the current sharpness state of the imaging device, it is judged whether a target image with a preset sharpness exists in the at least two images, and when the target image is determined to exist in the at least two images, the target image is stored in the imaging device.
  • when an imaging device with a fixed-focus lens is used to capture images, the device can accurately obtain the sharpness of multiple images and determine its sharpness state from that sharpness; it can then accurately judge whether the image with the highest sharpness exists among the multiple images, so that the highest-sharpness image can be stored in the device, thereby improving the shooting effect of fixed-focus imaging devices.
  • the present application also includes a method for displaying the sharpness state of the imaging device on the shooting page of the imaging device.
  • next, with reference to FIG. 7, the process of displaying the sharpness state of the imaging device on the shooting page will be described.
  • FIG. 7 is a schematic flow chart of displaying a sharpness state of an imaging device provided by an embodiment of the present application. See Figure 7, the method includes:
  • the sharpness state can be an increased state, a reduced state, or a state in the highest range.
  • for step S71, reference may be made to the execution of step S13, which is not repeated in this embodiment of the present application.
  • the indication information is used to indicate the sharpness state of the imaging device.
  • the indication information may be an up arrow, a down arrow or a shooting frame.
  • the indication information may be displayed on the shooting page of the imaging device according to the following feasible implementation manner: if the current sharpness state of the imaging equipment is in an increased state, then determine that the indication information is an upward arrow. For example, if the sharpness status of the imaging device is increased, an upward arrow may be generated in a preset window on the shooting page of the imaging device, so that the user can determine that the current sharpness status of the imaging device is increased through the upward arrow.
  • the indication information is a downward arrow.
  • a downward arrow may be generated in a preset window on the shooting page of the imaging device, so that the user can determine that the current sharpness status of the imaging device is reduced through the downward arrow.
  • if the sharpness state of the imaging device is in the highest range state, the indication information is a shooting frame. For example, a green shooting frame can be generated in the preset window on the shooting page of the imaging device, so that the user can determine through the green shooting frame that the current sharpness state of the imaging device is in the highest range state.
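The three indication rules above amount to a lookup from state to indicator; a minimal sketch, where the state and indicator names are illustrative labels rather than identifiers from the embodiment:

```python
INDICATORS = {
    "increased": "up_arrow",          # sharpness rising
    "reduced": "down_arrow",          # sharpness falling
    "in_highest_range": "shooting_frame",  # e.g. rendered as a green frame
}

def indicator_for(state):
    """Map the current sharpness state to the indication shown on the page."""
    return INDICATORS[state]

print(indicator_for("in_highest_range"))  # shooting_frame
```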
  • FIG. 8A is a schematic diagram of a process of displaying indication information provided by an embodiment of the present application. See Figure 8A, including camera equipment and a target object. An image including a target object may be displayed on the photographing page of the imaging device. When the camera device is moving towards the target object, the sharpness state of the camera device is an increased state.
  • FIG. 8B is a schematic diagram of another process for displaying indication information provided by the embodiment of the present application. See Figure 8B, including camera equipment and target objects. An image including a target object may be displayed on the photographing page of the imaging device. When the imaging device moves in a direction away from the target object, the sharpness state of the imaging device is a reduced state.
  • a downward arrow is generated in the preset area of the shooting page of the imaging device, so that the user can accurately determine through the downward arrow that the current sharpness state of the imaging device is a reduced state, improving the shooting effect.
  • FIG. 8C is a schematic diagram of another process for displaying indication information provided by the embodiment of the present application. See Figure 8C, including the imaging device and the target object. An image including the target object may be displayed on the shooting page of the imaging device. When the imaging device moves toward the target object, the sharpness state of the imaging device is in the highest range state.
  • An embodiment of the present application provides a camera control method, which determines the current sharpness state of the imaging device and displays indication information corresponding to that state on the shooting page of the imaging device.
  • the indication information includes an up arrow, a down arrow, and a shooting frame.
  • the corresponding indication information can be displayed in the preset area of the shooting page of the imaging device, so that the user can accurately determine the sharpness state of the imaging device from the indication information on the shooting page, thereby improving the shooting effect.
  • the user can also actively capture an image, and the present application also includes a process for handling the captured image.
  • the process of processing captured images will be described in detail.
  • FIG. 9 is a schematic diagram of a process for processing captured images provided by an embodiment of the present application. See Figure 9, the method includes:
  • the shooting instruction may be the user's click operation on the shooting button on the shooting page.
  • the captured image corresponding to the capturing instruction may be an image displayed on the capturing page when the imaging device receives the capturing instruction.
  • the photographed image corresponding to the photographing instruction is the image displayed on the photographing page of the photographing device when the user clicks the photographing button on the photographing page.
  • FIG. 10 is a schematic diagram of acquiring a captured image provided by an embodiment of the present application. See Figure 10, including camera equipment.
  • the imaging device includes a shooting page, and the shooting page includes an image of the target object and a shooting button.
  • the user clicks the shooting button, and the captured image acquired by the camera device is the image displayed on the shooting page.
  • the captured image may be processed according to the following feasible implementation methods:
  • the captured image is deleted. For example, if the sharpness of the target image acquired by the imaging device is the same as that of the image obtained when the user clicks the shooting button, the imaging device only needs to save one of the two images; at this time, the imaging device can delete the captured image corresponding to the shooting instruction, or alternatively delete the target image.
  • the captured image is stored in the imaging device.
  • if the sharpness of the target image acquired by the imaging device differs from that of the image obtained by the user clicking the shooting button, then since the target image has the highest sharpness, the sharpness of the captured image corresponding to the shooting instruction is lower than that of the target image.
  • in this case, the captured image corresponding to the shooting instruction can still be stored in the imaging device, which preserves the user's ability to shoot with the device, improves the flexibility of the imaging device, and thus improves the shooting effect.
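The handling above can be sketched as one branch on the two sharpness values; a minimal illustration, where the return labels and the list used as storage are assumptions for the example:

```python
def handle_capture(target_sharpness, captured_sharpness, store):
    """Delete the user's shot when it duplicates the stored target image;
    otherwise keep it alongside the target image."""
    if captured_sharpness == target_sharpness:
        return "deleted"                  # duplicate of the target image
    store.append(captured_sharpness)      # keep the user's own shot
    return "stored"

store = []
print(handle_capture(87.5, 87.5, store))  # deleted
print(handle_capture(87.5, 60.0, store))  # stored
print(store)                              # [60.0]
```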
  • An embodiment of the present application provides a camera control method. After the imaging device acquires the target image, it responds to a shooting instruction, obtains the captured image corresponding to the instruction, and processes the captured image according to the sharpness of the target image and the sharpness of the captured image. In this way, when the two sharpness values are the same, the imaging device deletes the captured image, saving memory space; when they differ, the captured image is saved in the imaging device, improving shooting flexibility and the shooting effect without affecting the active shooting function.
  • FIG. 11 is a schematic diagram of a process of an imaging control method provided by an embodiment of the present application. See Figure 11, including camera equipment and target object. When the camera device moves towards the target object, the camera device acquires at least two images continuously, and determines the sharpness state of the camera device according to the sharpness of the at least two images.
  • an upward arrow is generated in the preset window of the camera device.
  • when the sharpness of the image acquired by the imaging device rises into the preset interval of the highest sharpness,
  • a shooting frame is generated in the preset window, and when the sharpness of the acquired image reaches the highest sharpness, the imaging device stores the image with the highest sharpness.
  • the sharpness of the image captured by the camera device decreases.
  • the sharpness state of the imaging device is then a reduced state, and a downward arrow is generated in the preset window of the shooting page of the imaging device.
  • in this way, the imaging device can obtain the sharpness of images during its movement and determine its sharpness state from that sharpness; it can then, according to the sharpness state, determine the target image with the highest sharpness and save it, which improves the shooting effect of the imaging device.
  • FIG. 12 is a schematic structural diagram of a camera control device provided by an embodiment of the present application. Please refer to Fig. 12, the camera control device 10 can be set in a mobile terminal, the camera control device 10 includes a first acquisition module 11, a second acquisition module 12 and a control module 13, wherein:
  • the first acquiring module 11 is configured to acquire at least two images of the target object continuously captured by the imaging device, optionally, the imaging lens of the imaging device is a fixed-focus lens;
  • the second acquiring module 12 is configured to acquire the clarity of the at least two images
  • the control module 13 is configured to control the imaging device to capture images of the target object according to the clarity of the at least two images.
  • control module 13 is specifically used for:
  • determine, according to the sharpness of the at least two images, the current sharpness state of the imaging device, the sharpness state being an increased state, a reduced state, or a state in the highest range;
  • the imaging device is controlled to capture an image of the target object.
  • control module 13 is specifically used for:
  • judge, according to the current sharpness state of the imaging device, whether a target image exists in the at least two images, the sharpness of the target image being a preset sharpness;
  • control module 13 is specifically used for:
  • if the current sharpness state of the imaging device is a reduced state and the historical sharpness states of the imaging device include a state in the highest range, determine that the target image exists in the at least two images.
  • An imaging control device provided in an embodiment of the present application can implement the technical solutions shown in the above method embodiments, and its implementation principles and beneficial effects are similar, and will not be repeated here.
  • FIG. 13 is a schematic structural diagram of another camera control device provided by an embodiment of the present application.
  • the camera control device 10 further includes a display module 14, and the display module 14 is used for:
  • Indicating information corresponding to the current sharpness state of the imaging device is displayed on the shooting interface of the imaging device.
  • the current sharpness state of the imaging device is an increased state, then determine that the indication information is an upward arrow;
  • the current sharpness state of the imaging device is a reduced state, then determine that the indication information is a downward arrow;
  • if the current sharpness state of the imaging device is in the highest range state, then determine that the indication information is a shooting frame.
  • the camera control device further includes a third acquisition module 15, and the third acquisition module 15 is configured to: in response to a shooting instruction, acquire the captured image corresponding to the shooting instruction;
  • process the captured image according to the sharpness of the target image and the sharpness of the captured image.
  • the third acquiring module 15 is specifically configured to:
  • if the sharpness of the target image is the same as that of the captured image, delete the captured image; and/or, if the sharpness of the target image differs from that of the captured image, store the captured image in the imaging device.
  • An imaging control device provided in an embodiment of the present application can implement the technical solutions shown in the above method embodiments, and its implementation principles and beneficial effects are similar, and will not be repeated here.
  • the present application also provides a mobile terminal device.
  • the terminal device includes a memory and a processor, and a camera control program is stored in the memory.
  • the camera control program is executed by the processor, the steps of the camera control method in any of the above embodiments are implemented.
  • the present application also provides a computer-readable storage medium on which a camera control program is stored.
  • when the camera control program is executed by a processor, the steps of the camera control method in any of the above embodiments are implemented.
  • the embodiments of the mobile terminal and the computer-readable storage medium provided in this application include all the technical features of the embodiments of the camera control method described above, and details are not repeated here.
  • An embodiment of the present application further provides a computer program product, the computer program product includes computer program code, and when the computer program code is run on the computer, the computer is made to execute the methods in the above various possible implementation manners.
  • the embodiment of the present application also provides a chip, including a memory and a processor.
  • the memory is used to store a computer program
  • the processor is used to call and run the computer program from the memory, so that the device installed with the chip executes the methods in the above various possible implementations.
  • the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
  • the technical solution of the present application, in essence or in the part contributing to the prior art, can be embodied in the form of a software product stored in a storage medium as described above (such as ROM/RAM, a magnetic disk, or an optical disc), including several instructions to make a terminal device (which may be a mobile phone, computer, server, controlled terminal, or network device, etc.) execute the method of each embodiment of the present application.


Abstract

The present application provides a camera control method, a mobile terminal and a storage medium. The method includes: acquiring at least two images of a target object continuously collected by an imaging device; acquiring the sharpness of the at least two images; and controlling, according to the sharpness of the at least two images, the imaging device to capture an image of the target object, thereby improving the shooting effect.

Description

Camera control method, mobile terminal and storage medium
This application claims priority to the Chinese patent application No. 202110791430.X, entitled "Camera control method, mobile terminal and storage medium", filed with the China National Intellectual Property Administration on July 13, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of imaging technology, and in particular to a camera control method, a mobile terminal and a storage medium.
Background
When a user shoots with a mobile terminal equipped with a camera, the user can often only judge the sharpness of the captured image by experience. For example, the user adjusts the object distance between the mobile phone and the subject while observing the sharpness of the image on the phone screen, and captures the image when the sharpness appears high.
In conceiving and implementing the present application, the applicant found at least the following problem: because the user judges image sharpness by experience, the sharpness of the captured image is low, resulting in a poor shooting effect.
The foregoing description is intended to provide general background information and does not necessarily constitute prior art.
Summary
In view of the above technical problem, the present application provides a camera control method, a mobile terminal and a storage medium to solve the technical problem of poor shooting effect.
To solve the above technical problem, the present application provides a camera control method applied to a mobile terminal, including:
acquiring at least two images of a target object continuously collected by an imaging device, optionally, the imaging lens of the imaging device being a fixed-focus lens;
acquiring the sharpness of the at least two images;
controlling, according to the sharpness of the at least two images, the imaging device to capture an image of the target object.
Optionally, step S13 includes:
determining the sharpness state of the imaging device according to the sharpness of the at least two images, optionally, the sharpness state being an increased state, a reduced state, or a state in the highest range;
controlling, according to the sharpness state of the imaging device, the imaging device to capture an image of the target object.
Optionally, controlling, according to the sharpness state of the imaging device, the imaging device to capture an image of the target object includes:
judging, according to the sharpness state of the imaging device, whether a target image exists in the at least two images, the sharpness of the target image being a preset sharpness;
when it is determined that the target image exists in the at least two images, storing the target image in the imaging device.
Optionally, judging, according to the sharpness state of the imaging device, whether a target image exists in the at least two images includes:
if the sharpness state of the imaging device is a reduced state and the historical sharpness states of the imaging device include a state in the highest range, determining that the target image exists in the at least two images.
Optionally, the method further includes:
displaying, on the shooting interface of the imaging device, indication information corresponding to the sharpness state of the imaging device.
Optionally, the indication information corresponding to the sharpness state of the imaging device includes at least one of the following: if the sharpness state of the imaging device is an increased state, determining that the indication information is an up arrow;
if the sharpness state of the imaging device is a reduced state, determining that the indication information is a down arrow;
if the sharpness state of the imaging device is a state in the highest range, determining that the indication information is a shooting frame.
Optionally, the method further includes:
in response to a shooting instruction, acquiring a captured image corresponding to the shooting instruction;
processing the captured image according to the sharpness of the target image and the sharpness of the captured image.
Optionally, processing the captured image according to the sharpness of the target image and the sharpness of the captured image includes:
if the sharpness of the target image is the same as the sharpness of the captured image, deleting the captured image; and/or,
if the sharpness of the target image is different from the sharpness of the captured image, storing the captured image in the imaging device.
The present application also provides a camera control apparatus, including a first acquisition module, a second acquisition module and a control module, wherein:
the first acquisition module is configured to acquire at least two images of a target object continuously collected by an imaging device, optionally, the imaging lens of the imaging device being a fixed-focus lens;
the second acquisition module is configured to acquire the sharpness of the at least two images;
the control module is configured to control, according to the sharpness of the at least two images, the imaging device to capture an image of the target object.
Optionally, the control module is specifically configured to:
determine the sharpness state of the imaging device according to the sharpness of the at least two images, optionally, the sharpness state being an increased state, a reduced state, or a state in the highest range;
control, according to the sharpness state of the imaging device, the imaging device to capture an image of the target object.
Optionally, the control module is specifically configured to:
judge, according to the sharpness state of the imaging device, whether a target image exists in the at least two images, the sharpness of the target image being a preset sharpness;
when it is determined that the target image exists in the at least two images, store the target image in the imaging device.
Optionally, the control module is specifically configured to:
if the sharpness state of the imaging device is a reduced state and the historical sharpness states of the imaging device include a state in the highest range, determine that the target image exists in the at least two images.
Optionally, the camera control apparatus further includes a display module configured to:
display, on the shooting interface of the imaging device, indication information corresponding to the sharpness state of the imaging device.
Optionally, the indication information corresponding to the sharpness state of the imaging device includes at least one of the following: if the sharpness state of the imaging device is an increased state, determining that the indication information is an up arrow;
if the sharpness state of the imaging device is a reduced state, determining that the indication information is a down arrow;
if the sharpness state of the imaging device is a state in the highest range, determining that the indication information is a shooting frame.
Optionally, the camera control apparatus further includes a third acquisition module configured to:
in response to a shooting instruction, acquire a captured image corresponding to the shooting instruction;
process the captured image according to the sharpness of the target image and the sharpness of the captured image.
Optionally, the third acquisition module is specifically configured to:
if the sharpness of the target image is the same as the sharpness of the captured image, delete the captured image; and/or,
if the sharpness of the target image is different from the sharpness of the captured image, store the captured image in the imaging device.
The present application also provides a mobile terminal, including a memory and a processor, wherein a camera control program is stored in the memory, and when the camera control program is executed by the processor, the steps of the above method are implemented.
The present application also provides a computer storage medium storing a computer program which, when executed by a processor, implements the steps of the above method.
As described above, the present application provides a camera control method, a mobile terminal and a storage medium: at least two images of a target object continuously collected by an imaging device are acquired, the sharpness of the at least two images is obtained, and the imaging device is controlled, according to that sharpness, to capture an image of the target object. In the above method, the imaging device can accurately determine the sharpness of multiple continuously acquired images and, based on that sharpness, control the image capture of the target object, so that the captured image has high sharpness, thereby improving the shooting effect.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the present application. To describe the technical solutions of the embodiments more clearly, the drawings needed in the description of the embodiments are briefly introduced below; obviously, those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present application;
FIG. 2 is an architecture diagram of a communication network system provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of an application scenario provided by an embodiment of the present application;
FIG. 4 is a schematic flowchart of a camera control method provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of the relationship between image sharpness and object distance provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a sharpness state being a state in the highest range provided by an embodiment of the present application;
FIG. 7 is a schematic flowchart of displaying the sharpness state of an imaging device provided by an embodiment of the present application;
FIG. 8A is a schematic diagram of a process of displaying indication information provided by an embodiment of the present application;
FIG. 8B is a schematic diagram of another process of displaying indication information provided by an embodiment of the present application;
FIG. 8C is a schematic diagram of another process of displaying indication information provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a process of processing a captured image provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of acquiring a captured image provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of a process of a camera control method provided by an embodiment of the present application;
FIG. 12 is a schematic structural diagram of a camera control apparatus provided by an embodiment of the present application;
FIG. 13 is a schematic structural diagram of another camera control apparatus provided by an embodiment of the present application.
The realization of the objectives, functional features and advantages of the present application will be further described with reference to the accompanying drawings in conjunction with the embodiments. The above drawings show specific embodiments of the present application, which will be described in more detail below. These drawings and the textual description are not intended to limit the scope of the concept of the present application in any way, but to explain the concept of the present application to those skilled in the art with reference to specific embodiments.
Detailed Description
Exemplary embodiments will be described in detail here, examples of which are shown in the accompanying drawings. When the following description refers to the drawings, unless otherwise indicated, the same numerals in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present application as detailed in the appended claims.
It should be noted that, herein, the terms "include", "comprise" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or apparatus including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of additional identical elements in the process, method, article or apparatus including that element. In addition, components, features and elements with the same name in different embodiments of the present application may have the same meaning or different meanings; the specific meaning shall be determined by its interpretation in the specific embodiment or further in combination with the context of that embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various pieces of information, the information should not be limited to these terms, which are only used to distinguish information of the same type from one another. For example, without departing from the scope hereof, first information may also be called second information, and similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "while" or "in response to determining". Furthermore, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It should be further understood that the terms "comprise" and "include" indicate the presence of the stated features, steps, operations, elements, components, items, categories and/or groups, but do not exclude the presence, occurrence or addition of one or more other features, steps, operations, elements, components, items, categories and/or groups. The terms "or", "and/or", "including at least one of the following", etc. used in the present application may be interpreted as inclusive, or as meaning any one or any combination. For example, "including at least one of the following: A, B, C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C"; as another example, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C". An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that although the steps in the flowcharts of the embodiments of the present application are displayed in sequence as indicated by the arrows, these steps are not necessarily executed in that order. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments, and their execution order is not necessarily sequential; they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Depending on the context, the words "if" and "in case" as used herein may be interpreted as "when", "while", "in response to determining" or "in response to detecting". Similarly, depending on the context, the phrases "if it is determined" or "if (the stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (the stated condition or event) is detected" or "in response to detecting (the stated condition or event)".
It should be noted that step designations such as S11 and S12 are used herein for the purpose of describing the corresponding content more clearly and concisely, and do not constitute a substantive limitation on the order; those skilled in the art may, in specific implementation, execute S12 first and then S11, etc., all of which fall within the protection scope of the present application.
It should be understood that the specific embodiments described herein are only used to explain the present application and are not intended to limit it.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are only for facilitating the description of the present application and have no specific meaning in themselves. Therefore, "module", "component" or "unit" may be used interchangeably.
Mobile terminals may be implemented in various forms. For example, the mobile terminals described in the present application may include mobile terminals such as mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (PDA), portable media players (PMP), navigation devices, wearable devices, smart bracelets and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The following description takes a mobile terminal as an example. Those skilled in the art will understand that, apart from elements specifically used for mobile purposes, the construction according to the embodiments of the present application can also be applied to fixed-type terminals.
Referring to FIG. 1, which is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present application, the mobile terminal 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110 and a power supply 111. Those skilled in the art will understand that the mobile terminal structure shown in FIG. 1 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than shown, combine certain components, or arrange the components differently.
The components of the mobile terminal are described in detail below with reference to FIG. 1:
The radio frequency unit 101 may be used for receiving and sending signals during information transmission and reception or during a call; specifically, it receives downlink information from a base station and passes it to the processor 110 for processing, and sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with networks and other devices through wireless communication. The above wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System of Mobile communication), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution) and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi属于短距离无线传输技术,移动终端通过WiFi模块102可以帮助用户收发电子邮件、浏览网页和访问流式媒体等,它为用户提供了无线的宽带互联网访问。虽然图1示出了WiFi模块102,但是可以理解的是,其并不属于移动终端的必须构成,完全可以根据需要在不改变申请的本质的范围内而省略。
音频输出单元103可以在移动终端100处于呼叫信号接收模式、通话模式、记录模式、语音识别模式、广播接收模式等等模式下时,将射频单元101或WiFi模块102接收的或者在存储器109中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元103还可以提供与移动终端100执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元103可以包括扬声器、蜂鸣器等等。
A/V输入单元104用于接收音频或视频信号。A/V输入单元104可以包括图形处理器(Graphics Processing Unit,GPU)1041和麦克风1042,图形处理器1041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元106上。经图形处理器1041处理后的图像帧可以存储在存储器109(或其它存储介质)中或者经由射频单元101或WiFi模块102进行发送。麦克风1042可以在电话通话模式、记录模式、语音识别模式等等运行模式中经由麦克风1042接收声音(音频数据),并且能够将这样的声音处理为音频数据。处理后的音频(语音)数据可以在电话通话模式的情况下转换为可经由射频单元101发送到移动通信基站的格式输出。麦克风1042可以实施各种类型的噪声消除(或抑制)算法以消除(或抑制)在接收和发送音频信号的过程中产生的噪声或者干扰。
移动终端100还包括至少一种传感器105,比如光传感器、运动传感器以及其他传感器。可选地,光传感器包括环境光传感器及接近传感器,可选地,环境光传感器可根据环境光线的明暗来调节显示面板1061的亮度,接近传感器可在移动终端100移动到耳边时,关闭显示面板1061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;至于手机还可配置的指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
显示单元106用于显示由用户输入的信息或提供给用户的信息。显示单元106可包括显示面板1061,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板1061。
用户输入单元107可用于接收输入的数字或字符信息,以及产生与移动终端的用户设 置以及功能控制有关的键信号输入。可选地,用户输入单元107可包括触控面板1071以及其他输入设备1072。触控面板1071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板1071上或在触控面板1071附近的操作),并根据预先设定的程式驱动相应的连接装置。触控面板1071可包括触摸检测装置和触摸控制器两个部分。可选地,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器110,并能接收处理器110发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板1071。除了触控面板1071,用户输入单元107还可以包括其他输入设备1072。可选地,其他输入设备1072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆等中的一种或多种,具体此处不做限定。
可选地,触控面板1071可覆盖显示面板1061,当触控面板1071检测到在其上或附近的触摸操作后,传送给处理器110以确定触摸事件的类型,随后处理器110根据触摸事件的类型在显示面板1061上提供相应的视觉输出。虽然在图1中,触控面板1071与显示面板1061是作为两个独立的部件来实现移动终端的输入和输出功能,但是在某些实施例中,可以将触控面板1071与显示面板1061集成而实现移动终端的输入和输出功能,具体此处不做限定。
接口单元108用作至少一个外部装置与移动终端100连接可以通过的接口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。接口单元108可以用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到移动终端100内的一个或多个元件或者可以用于在移动终端100和外部装置之间传输数据。
存储器109可用于存储软件程序以及各种数据。存储器109可主要包括存储程序区和存储数据区,可选地,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器109可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
处理器110是移动终端的控制中心,利用各种接口和线路连接整个移动终端的各个部分,通过运行或执行存储在存储器109内的软件程序和/或模块,以及调用存储在存储器109内的数据,执行移动终端的各种功能和处理数据,从而对移动终端进行整体监控。处理器110可包括一个或多个处理单元;优选的,处理器110可集成应用处理器和调制解调处理器,可选地,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器110中。
移动终端100还可以包括给各个部件供电的电源111(比如电池),优选的,电源111可以通过电源管理系统与处理器110逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
尽管图1未示出,移动终端100还可以包括蓝牙模块等,在此不再赘述。
To facilitate understanding of the embodiments of this application, the communication network system on which the mobile terminal of this application is based is described below.
Referring to FIG. 2, which is an architecture diagram of a communication network system provided by an embodiment of this application, the communication network system is an LTE system of universal mobile telecommunications technology. The LTE system includes, connected in communication in sequence, a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP services 204.
Optionally, the UE 201 may be the terminal 100 described above, which is not repeated here.
The E-UTRAN 202 includes eNodeB 2021, other eNodeBs 2022, and the like. Optionally, eNodeB 2021 may connect to the other eNodeBs 2022 via backhaul (for example, an X2 interface); eNodeB 2021 is connected to the EPC 203 and can provide the UE 201 with access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME 2031 is the control node that handles signaling between the UE 201 and the EPC 203, providing bearer and connection management. The HSS 2032 provides registers to manage functions such as a home location register (not shown) and holds user-specific information about service characteristics, data rates, and the like. All user data may be sent through the SGW 2034; the PGW 2035 may provide IP-address allocation for the UE 201 as well as other functions; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, selecting and providing available policy and charging control decisions for the policy and charging enforcement function (not shown).
The IP services 204 may include the Internet, intranets, IMS (IP Multimedia Subsystem), or other IP services.
Although the above takes the LTE system as an example, a person skilled in the art should know that this application is applicable not only to the LTE system but also to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems, without limitation here.
The embodiments of this application are presented below on the basis of the above mobile-terminal hardware structure and communication network system.
In the related art, the camera of a terminal device uses a fixed-focus lens to reduce the cost of the device, and a terminal device equipped with a fixed-focus lens has no autofocus capability. When shooting with such a device, the user judges the sharpness of the captured image by experience: for example, the user may adjust the object distance between the phone and the subject to change the sharpness of the image on the screen, and tap the shutter button when the image looks sharp enough. However, judging sharpness by eye yields images of relatively low sharpness (in macro shooting, the image the user perceives as sharp may not be the sharpest one available), which degrades the shooting result.
To solve the technical problem of poor shooting results in the related art, embodiments of this application provide a camera control method: obtain at least two images of a target object captured consecutively by an imaging device (optionally, the lens of the imaging device is a fixed-focus lens); obtain the sharpness of the at least two images; determine, from the sharpness of the at least two images, the current sharpness state of the imaging device (optionally, a rising state, a falling state, or a within-highest-range state); and control, according to the current sharpness state of the imaging device, the imaging device to capture an image of the target object. In this method, when shooting with a fixed-focus imaging device, the device can accurately obtain the sharpness of multiple images and can therefore accurately identify, from those sharpness values, an image of good sharpness, thereby improving the shooting result.
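The overall flow described above can be sketched as a small control loop: feed consecutively captured frames with their sharpness values through a controller that tracks whether sharpness has risen and then started to fall, and keep the sharpest frame seen once the peak has been passed. This is only an illustrative sketch of the idea; the function and variable names are assumptions, not part of the patent text, and the sharpness computation itself is left abstract here.

```python
def control_capture(frames_with_sharpness):
    """frames_with_sharpness: iterable of (frame, sharpness) in capture order.
    Returns the frame kept as the shot, or None if sharpness never peaked."""
    prev_sharp = None
    best_frame, best_sharp = None, float("-inf")
    rising_seen = False
    for frame, sharp in frames_with_sharpness:
        if sharp > best_sharp:
            best_frame, best_sharp = frame, sharp   # remember sharpest so far
        if prev_sharp is not None:
            if sharp > prev_sharp:
                rising_seen = True                  # sharpness state: rising
            elif sharp < prev_sharp and rising_seen:
                return best_frame                   # state turned to falling after a peak
        prev_sharp = sharp
    return None

print(control_capture([("a", 30), ("b", 60), ("c", 90), ("d", 75)]))  # c
```

Frame "c" is returned because the sharpness rose through "a"-"c" and fell at "d", so the peak has been passed and the sharpest frame is kept.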
For ease of understanding, the application scenario of this application is introduced below with reference to FIG. 3.
FIG. 3 is a schematic diagram of an application scenario provided by an embodiment of this application. Referring to FIG. 3, the scenario includes an imaging device and a target object; optionally, the shooting function of the imaging device is already on. When the imaging device is far from the target object, the image of the target object shown on the device's screen is blurry. As the imaging device moves toward the target object, the object distance between them changes and the sharpness of the image of the target object shown on the screen changes accordingly. While moving toward the target object, the imaging device obtains multiple images of the target object and selects the sharpest one among them as the captured image. In this way, when the lens of the imaging device is a fixed-focus lens, the device can determine the captured image according to image sharpness, thereby improving the shooting result.
The technical solutions of this application are described in detail below through specific embodiments. Optionally, the following embodiments may exist independently or be combined with one another; identical or similar content is not repeated across different embodiments.
FIG. 4 is a schematic flowchart of a camera control method provided by an embodiment of this application. Referring to FIG. 4, the method may include:
S11: Obtain at least two images of a target object captured consecutively by an imaging device.
The executing entity of this embodiment may be a mobile terminal, or a camera control apparatus provided in the mobile terminal. Optionally, the camera control apparatus may be implemented in software or in a combination of software and hardware.
Optionally, the imaging device is any device with an imaging function, for example, a camera, a mobile phone, or a tablet computer. The lens of the imaging device is a fixed-focus lens; optionally, a fixed-focus lens is a lens with only one fixed focal length. In practice, if the lens of the imaging device is a fixed-focus lens, the device cannot autofocus.
The target object is the subject photographed by the imaging device. For example, if the device is photographing a cup, the target object is the cup; if it is photographing a table, the target object is the table.
Optionally, when its shooting function is on, the imaging device can obtain images containing the target object in real time; for example, when a phone's shooting function is on, the phone can obtain every frame captured by the lens. Optionally, with the shooting function on, the imaging device can consecutively obtain multiple images containing the target object.
S12: Obtain the sharpness of the at least two images.
Sharpness indicates the magnitude of an image's focus value: the larger the focus value, the sharper the image; the smaller the focus value, the blurrier the image. Optionally, the sharpness of the at least two images may be obtained through a preset model. Optionally, the preset model is learned from multiple groups of samples, each group including a sample image and a sample sharpness.
The groups of samples may be pre-labeled. For example, for sample image 1, obtain its corresponding sample sharpness 1 to form one group of samples, the group including sample image 1 and sample sharpness 1. In this way, multiple groups of samples can be obtained, for example as shown in Table 1:
Table 1
Sample group    Sample image    Sample sharpness
Group 1         Sample image 1  Sample sharpness 1
Group 2         Sample image 2  Sample sharpness 2
Group 3         Sample image 3  Sample sharpness 3
…               …               …
It should be noted that Table 1 merely illustrates the groups of samples by way of example and does not limit them.
For example, if an image input to the preset model is identical to sample image 1, the sharpness of that image is sample sharpness 1; if it is identical to sample image 2, its sharpness is sample sharpness 2; if it is identical to sample image 3, its sharpness is sample sharpness 3.
Optionally, the sharpness of the image is displayed in real time on the shooting page of the imaging device. For example, while an image is shown on the shooting page, the sharpness of that image is displayed in a preset area of the page, so that the user can accurately determine the sharpness of the image currently being captured.
Optionally, when the lens of the imaging device is a fixed-focus lens, the sharpness of the images captured by the device is correlated with the object distance between the device and the target object.
The relationship between image sharpness and object distance is described below with reference to FIG. 5.
FIG. 5 is a schematic diagram of the relationship between image sharpness and object distance provided by an embodiment of this application. Referring to FIG. 5, it shows a coordinate system of sharpness versus object distance; optionally, the horizontal axis is the object distance between the imaging device and the target object, and the vertical axis is the sharpness of the images obtained by the device.
Referring to FIG. 5, as the object distance between the imaging device and the target object increases, the sharpness of the images obtained by the device first rises and then falls; the highest point of the curve corresponds to the sharpest image the device obtains. In practice, the imaging device can obtain every frame captured by the lens and determine the sharpness of each frame.
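The patent leaves the concrete sharpness metric open (it mentions a preset model trained on labeled samples). As a stand-in, a common classical focus measure is the variance of the image Laplacian: a well-focused frame has strong edges and therefore a high Laplacian variance, while a defocused frame is smooth and scores low. The sketch below is an illustrative assumption, not the patent's method, operating on a grayscale frame given as a 2D list of pixel intensities.

```python
def laplacian_variance(img):
    """Return the variance of the 4-neighbour Laplacian of `img` (2D list)."""
    h, w = len(img), len(img[0])
    values = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # discrete Laplacian: sum of the four neighbours minus 4x centre
            lap = (img[y - 1][x] + img[y + 1][x] +
                   img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])
            values.append(lap)
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

# A flat (defocused) patch scores lower than one containing an edge.
flat  = [[10] * 5 for _ in range(5)]
edged = [[10] * 5, [10] * 5, [10, 10, 200, 10, 10], [10] * 5, [10] * 5]
print(laplacian_variance(flat) < laplacian_variance(edged))  # True
```

In practice a library such as OpenCV (`cv2.Laplacian(...).var()`) would be used on real frames instead of this pure-Python loop.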
S13: Control, according to the sharpness of the at least two images, the imaging device to capture an image of the target object.
This can be implemented as follows: determine, from the sharpness of the at least two images, the sharpness state of the imaging device. Optionally, the sharpness state is a rising state, a falling state, or a within-highest-range state.
The rising state indicates that the sharpness of the images currently captured by the imaging device is increasing. For example, while the user moves the device, if the sharpness of the image shown on the device's screen is increasing, the device's current sharpness state is the rising state.
The falling state indicates that the sharpness of the images currently captured by the imaging device is decreasing. For example, while the user moves the device, the object distance between the device and the photographed target object changes; if the sharpness of the image shown on the screen is decreasing, the device's current sharpness state is the falling state.
The within-highest-range state indicates that the sharpness of the images currently captured by the device is within a preset interval of the highest sharpness. Optionally, the preset interval is a preset value. For example, if the highest image sharpness is 100 and the preset interval is 10, the device is in the within-highest-range state when the sharpness of the captured images is 90-100. For example, while the user moves the device and the object distance to the target object changes, if the sharpness of the image shown on the screen is within the preset range of the highest sharpness, the device's current sharpness state is the within-highest-range state.
Optionally, the imaging device may be in multiple sharpness states at once: for example, in the rising state and the within-highest-range state simultaneously, or in the falling state and the within-highest-range state simultaneously.
Optionally, taking four images consecutively obtained by the imaging device as an example, there are the following two cases when determining the sharpness state:
Case 1: the imaging device moves in one direction.
While the device moves continuously in one direction, if the sharpness of the first image is lower than that of the second, the second lower than the third, and the third lower than the fourth, the device's sharpness state is determined to be the rising state. For example, if the sharpness of four consecutively obtained images rises in sequence as the device moves toward the target object, the device's current sharpness state is the rising state.
While the device moves continuously in one direction, if the sharpness of the first image is higher than that of the second, the second higher than the third, and the third higher than the fourth, the device's sharpness state is determined to be the falling state. For example, if the sharpness of four consecutively obtained images falls in sequence as the device moves toward the target object, the device's current sharpness state is the falling state.
Since the sharpness of the images obtained by the device first rises and then falls, while the device moves toward the target object, sharpness values within the preset interval of the highest sharpness correspond to the within-highest-range state. For example, as the device moves away from or toward the target object, if the sharpness of the first image is lower than that of the second, the device is in the rising state while obtaining the first and second images; if the sharpness of the second image is higher than that of the third, and the third higher than that of the fourth, the device is in the falling state while obtaining the third and fourth images. In that case, the second image obtained by the device can be determined to have the highest sharpness, and sharpness values within the preset interval of the second image's sharpness correspond to the within-highest-range state.
Case 2: the imaging device moves in multiple directions.
When the device moves in multiple directions, if sharpness keeps rising while the device moves toward the target object, it falls while the device moves away from the target object; and if sharpness keeps falling while the device moves toward the target object, it rises while the device moves away. Therefore, when the device moves in multiple directions, the sharpness state changing from rising to falling does not by itself establish that the device is in the within-highest-range state.
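The state decision above can be sketched as a comparison over a sliding window of recent sharpness values. The window of four frames mirrors the example in the text; the state names, the 90% "peak range" bound, and the function signature are illustrative assumptions of this sketch, not values fixed by the patent.

```python
RISING, FALLING, PEAK_RANGE = "rising", "falling", "peak_range"

def sharpness_states(window, best_so_far, peak_fraction=0.9):
    """Return the set of sharpness states for the newest frame in `window`.

    window: sharpness values of recent consecutive frames, oldest first.
    best_so_far: highest sharpness observed in this capture session.
    """
    states = set()
    if all(a < b for a, b in zip(window, window[1:])):
        states.add(RISING)            # strictly increasing across the window
    if all(a > b for a, b in zip(window, window[1:])):
        states.add(FALLING)           # strictly decreasing across the window
    # "within the highest range": current sharpness lies inside a preset
    # interval below the best sharpness seen so far
    if window[-1] >= peak_fraction * best_so_far:
        states.add(PEAK_RANGE)
    return states

print(sharpness_states([10, 20, 30, 40], best_so_far=40))  # rising + peak_range
print(sharpness_states([40, 30, 20, 10], best_so_far=40))  # falling only
```

Note that, as in the text, the states are not mutually exclusive: a frame can be rising and within the highest range at the same time.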
The within-highest-range state of the imaging device is described in detail below with reference to FIG. 6.
FIG. 6 is a schematic diagram of the within-highest-range sharpness state provided by an embodiment of this application. Referring to FIG. 6, it shows a coordinate system of sharpness versus object distance; optionally, the horizontal axis is the object distance between the imaging device and the target object, and the vertical axis is the sharpness of the images obtained by the device.
Referring to FIG. 6, as the object distance between the imaging device and the target object increases, the sharpness of the images obtained by the device rises; once it reaches the highest point, it falls. Within the preset range of that highest point, the device's sharpness state is the within-highest-range state, and the images captured while the device is in this state have relatively high sharpness.
The imaging device is controlled to photograph the target object according to its current sharpness state. Optionally, this can be implemented as follows: according to the current sharpness state of the imaging device, determine whether a target image exists among the at least two images, the sharpness of the target image being a preset sharpness. Optionally, the preset sharpness may be the highest sharpness; for example, the sharpness of the target image is the highest sharpness the device can capture.
Optionally, whether a target image exists among the at least two images can be determined as follows: if the device's current sharpness state is the falling state and its historical sharpness states include the within-highest-range state, determine that a target image exists among the at least two images. Optionally, the historical sharpness states are the device's sharpness states before the current moment. For example, if before the current moment the device's sharpness states included the rising state and the within-highest-range state, the historical sharpness states include the rising state and the within-highest-range state.
In practice, the current sharpness state being the falling state means that the image obtained by the device at the current moment is less sharp than the one obtained at the previous moment; if, during this shooting session, the device has been in the within-highest-range state, the multiple images obtained in this session include the target image with the highest sharpness.
When it is determined that a target image exists among the at least two images, the target image is stored in the imaging device. For example, if the multiple images obtained by the device during this shooting session include the target image with the highest sharpness, that target image is stored in the device.
An embodiment of this application provides a camera control method: optionally, the lens of the imaging device is a fixed-focus lens; obtain at least two images of a target object captured consecutively by the device; obtain the sharpness of the at least two images according to a preset model; determine from those sharpness values the device's current sharpness state (optionally a rising state, a falling state, or a within-highest-range state); according to the current sharpness state, determine whether the at least two images include a target image whose sharpness is the preset sharpness; and, when such a target image exists, store it in the device. In this method, when shooting with a fixed-focus imaging device, the device can accurately obtain the sharpness of multiple images and determine its sharpness state from them, and can therefore accurately judge whether the sharpest image exists among the multiple images, so that the sharpest image can be stored in the device, improving the shooting result of the fixed-focus device.
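The existence check described above (current state is falling, history contains the within-highest-range state) can be written as a small predicate over the per-frame state sets. The representation of states as Python sets and the state names are assumptions of this sketch.

```python
def target_image_exists(state_history):
    """Decide whether the captured frames include the target image.

    state_history: list of per-frame state sets, oldest first; the last
    entry is the device's current sharpness state.
    """
    if not state_history:
        return False
    current = state_history[-1]
    history = state_history[:-1]  # states before the current moment
    # target image exists iff the state is now falling and the session has
    # already passed through the within-highest-range ("peak_range") state
    return "falling" in current and any("peak_range" in s for s in history)

print(target_image_exists([{"rising"}, {"rising", "peak_range"}, {"falling"}]))  # True
print(target_image_exists([{"rising"}, {"falling"}]))                            # False
```

Once the predicate is true, the sharpest frame buffered during the session would be the one stored as the target image.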
On the basis of the embodiment shown in FIG. 4, this application further includes a method of displaying the sharpness state of the imaging device on its shooting page. The process of displaying the sharpness state on the shooting page is described below with reference to FIG. 7.
FIG. 7 is a schematic flowchart of displaying the sharpness state of the imaging device provided by an embodiment of this application. Referring to FIG. 7, the method includes:
S71: Determine the sharpness state of the imaging device.
The sharpness state may be the rising state, the falling state, or the within-highest-range state.
It should be noted that step S71 may be performed as described for step S13, which is not repeated here.
S72: Display, on the shooting page of the imaging device, indication information corresponding to the device's current sharpness state.
The indication information indicates the sharpness state of the imaging device. Optionally, the indication information may be an up arrow, a down arrow, or a capture frame.
Optionally, the indication information can be displayed on the shooting page as follows: if the device's current sharpness state is the rising state, the indication information is determined to be an up arrow. For example, if the device's sharpness state is the rising state, an up arrow may be generated in a preset window of the shooting page, so that the user can tell from the up arrow that the device's current sharpness state is the rising state.
If the device's current sharpness state is the falling state, the indication information is determined to be a down arrow. For example, a down arrow may be generated in a preset window of the shooting page, so that the user can tell from the down arrow that the current sharpness state is the falling state.
If the device's current sharpness state is the within-highest-range state, the indication information is determined to be a capture frame. For example, a green capture frame may be generated in a preset window of the shooting page, so that the user can tell from the green capture frame that the current sharpness state is the within-highest-range state.
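The indicator logic above is a direct mapping from sharpness state to on-screen hint. The mapping below is a minimal sketch; the state keys and the placeholder strings standing in for the actual UI widgets (arrows, green capture frame) are illustrative assumptions.

```python
INDICATORS = {
    "rising": "up arrow",           # sharpness improving: keep moving this way
    "falling": "down arrow",        # sharpness degrading: move back
    "peak_range": "capture frame",  # near-best sharpness: good moment to shoot
}

def indicator_for(state):
    """Return the on-screen indication for a sharpness state, or None."""
    return INDICATORS.get(state)

print(indicator_for("peak_range"))  # capture frame
```

In a real UI layer, each value would select a widget drawn in the preset window of the shooting page rather than a string.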
The process by which the imaging device displays the indication information is detailed below with reference to FIGS. 8A-8C.
FIG. 8A is a schematic diagram of a process of displaying indication information provided by an embodiment of this application. Referring to FIG. 8A, it includes the imaging device and the target object; the device's shooting page can display an image containing the target object. While the device moves toward the target object, the device's sharpness state is the rising state.
Referring to FIG. 8A, when the device's sharpness state is the rising state, an up arrow is generated in a preset area of the shooting page, so that the user can accurately tell from the up arrow that the device's current sharpness state is the rising state, improving the shooting result.
FIG. 8B is a schematic diagram of another process of displaying indication information provided by an embodiment of this application. Referring to FIG. 8B, it includes the imaging device and the target object; the device's shooting page can display an image containing the target object. While the device moves away from the target object, the device's sharpness state is the falling state.
Referring to FIG. 8B, when the device's sharpness state is the falling state, a down arrow is generated in a preset area of the shooting page, so that the user can accurately tell from the down arrow that the device's current sharpness state is the falling state, improving the shooting result.
FIG. 8C is a schematic diagram of another process of displaying indication information provided by an embodiment of this application. Referring to FIG. 8C, it includes the imaging device and the target object; the device's shooting page can display an image containing the target object. While the device moves toward the target object, the device's sharpness state includes the within-highest-range state.
Referring to FIG. 8C, when the device's sharpness state is the within-highest-range state, a capture frame is generated in a preset area of the shooting page, so that the user can accurately tell from the capture frame that the device's current sharpness state is the within-highest-range state, improving the shooting result.
An embodiment of this application provides a camera control method: determine the device's current sharpness state and display the corresponding indication information on the device's shooting page, the indication information optionally including an up arrow, a down arrow, and a capture frame. In this method, when the device's sharpness state changes, the corresponding indication information can be displayed in a preset area of the shooting page, so the user can accurately determine the device's sharpness state from the indication information on the page, improving the shooting result.
On the basis of any of the embodiments above, after the imaging device obtains the target image, the user may also actively capture an image; this application further includes a process of handling that captured image, detailed below with reference to FIG. 9.
FIG. 9 is a schematic diagram of a process of handling a captured image provided by an embodiment of this application. Referring to FIG. 9, the method includes:
S91: In response to a shooting instruction, obtain the captured image corresponding to the shooting instruction.
Optionally, the shooting instruction may be the user's tap on the shutter button on the shooting page. The captured image corresponding to the shooting instruction may be the image shown on the shooting page when the device receives the instruction; for example, the captured image corresponding to the shooting instruction is the image displayed on the device's shooting page when the user taps the shutter button on that page.
The process by which the imaging device obtains the captured image corresponding to the shooting instruction is detailed below with reference to FIG. 10.
FIG. 10 is a schematic diagram of obtaining a captured image provided by an embodiment of this application. Referring to FIG. 10, it includes the imaging device; optionally, the device includes a shooting page, the page containing the image of the target object and a shutter button. While the image of the target object is displayed on the shooting page, the user taps the shutter button, and the captured image obtained by the device is the image displayed on the page.
S92: Process the captured image according to the sharpness of the target image and the sharpness of the captured image.
Optionally, the captured image can be processed as follows:
If the sharpness of the target image is the same as that of the captured image, delete the captured image. For example, if the target image obtained by the device and the image obtained when the user taps the shutter button have the same sharpness, the device only needs to keep one of them; in that case, the device may delete the captured image corresponding to the shooting instruction, or it may delete the target image.
And/or, if the sharpness of the target image differs from that of the captured image, store the captured image in the device. For example, when the sharpness of the target image and that of the image obtained by tapping the shutter button differ, since the target image has the highest sharpness, the sharpness of the captured image corresponding to the shooting instruction is lower than that of the target image; the captured image can then be stored in the device. This preserves the user's ability to shoot actively with the device, increases shooting flexibility, and thereby improves the shooting result.
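The deduplication rule in S92 can be sketched as follows: if the manually captured frame has the same sharpness as the auto-stored target image, keeping both is redundant and the manual shot is dropped; otherwise it is kept alongside the target. The `Gallery` class is a hypothetical stand-in for the device's image storage, and the `(image_id, sharpness)` tuple representation is an assumption of this sketch.

```python
class Gallery:
    def __init__(self):
        self.saved = []

    def handle_manual_shot(self, target, shot):
        """target/shot: (image_id, sharpness); target is already saved."""
        if shot[1] == target[1]:
            return "discarded"       # same sharpness: drop the duplicate shot
        self.saved.append(shot)      # different sharpness: keep the manual shot
        return "saved"

g = Gallery()
g.saved.append(("target", 95))
print(g.handle_manual_shot(("target", 95), ("manual", 95)))  # discarded
print(g.handle_manual_shot(("target", 95), ("manual", 80)))  # saved
```

The text notes that on equal sharpness the device may equally well delete the target image instead; this sketch arbitrarily drops the manual shot.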
An embodiment of this application provides a camera control method: after the imaging device obtains the target image, obtain, in response to a shooting instruction, the captured image corresponding to the instruction, and process the captured image according to the sharpness of the target image and the sharpness of the captured image. In this way, when the target image and the captured image have the same sharpness, the device deletes the captured image, saving memory; when they differ, the captured image is saved in the device, increasing shooting flexibility and improving the shooting result without affecting the active shooting function.
On the basis of any of the embodiments above, the process of the camera control method is described below with reference to FIG. 11.
FIG. 11 is a schematic diagram of the process of a camera control method provided by an embodiment of this application. Referring to FIG. 11, it includes the imaging device and the target object. As the device moves toward the target object, it consecutively obtains at least two images and determines its sharpness state from the sharpness of the at least two images.
Referring to FIG. 11, when the device's sharpness state is the rising state, an up arrow is generated in the device's preset window. As the device continues to move toward the target object and the sharpness of the obtained images rises into the preset interval of the highest sharpness, the device's sharpness state is the within-highest-range state and a capture frame is generated in the preset window of the shooting page; when the sharpness of an obtained image rises to the highest sharpness, the device stores that image of highest sharpness.
Referring to FIG. 11, as the device continues to move toward the target object, the sharpness of the obtained images falls; the device's sharpness state is then the falling state, and a down arrow is generated in the preset window of the shooting page. Thus, when a device fitted with a fixed-focus camera captures images, the device can, while moving, obtain the sharpness of the images, determine its sharpness state from that sharpness, identify the target image with the highest sharpness accordingly, and save that target image, improving the device's shooting result.
FIG. 12 is a schematic structural diagram of a camera control apparatus provided by an embodiment of this application. Referring to FIG. 12, the camera control apparatus 10 may be provided in a mobile terminal and includes a first obtaining module 11, a second obtaining module 12, and a control module 13, where:
the first obtaining module 11 is configured to obtain at least two images of a target object captured consecutively by an imaging device, optionally, the lens of the imaging device being a fixed-focus lens;
the second obtaining module 12 is configured to obtain the sharpness of the at least two images; and
the control module 13 is configured to control, according to the sharpness of the at least two images, the imaging device to capture an image of the target object.
Optionally, the control module 13 is specifically configured to:
determine, from the sharpness of the at least two images, the current sharpness state of the imaging device, the sharpness state being a rising state, a falling state, or a within-highest-range state; and
control, according to the current sharpness state of the imaging device, the imaging device to capture an image of the target object.
Optionally, the control module 13 is specifically configured to:
determine, according to the current sharpness state of the imaging device, whether a target image exists among the at least two images, the sharpness of the target image being a preset sharpness; and
when it is determined that a target image exists among the at least two images, store the target image in the imaging device.
Optionally, the control module 13 is specifically configured to:
if the current sharpness state of the imaging device is the falling state, and the historical sharpness states of the imaging device include the within-highest-range state, determine that a target image exists among the at least two images.
The camera control apparatus provided by the embodiments of this application can execute the technical solutions shown in the above method embodiments; its implementation principles and beneficial effects are similar and are not repeated here.
FIG. 13 is a schematic structural diagram of another camera control apparatus provided by an embodiment of this application. On the basis of the embodiment shown in FIG. 12, referring to FIG. 13, the camera control apparatus 10 further includes a display module 14, the display module 14 being configured to:
display, on the shooting interface of the imaging device, indication information corresponding to the current sharpness state of the imaging device.
Optionally, if the current sharpness state of the imaging device is the rising state, the indication information is determined to be an up arrow;
if the current sharpness state of the imaging device is the falling state, the indication information is determined to be a down arrow; and
if the current sharpness state of the imaging device is the within-highest-range state, the indication information is determined to be a capture frame.
Optionally, the camera control apparatus further includes a third obtaining module 15, the third obtaining module 15 being configured to:
obtain, in response to a shooting instruction, the captured image corresponding to the shooting instruction; and
process the captured image according to the sharpness of the target image and the sharpness of the captured image.
Optionally, the third obtaining module 15 is specifically configured to:
if the sharpness of the target image is the same as that of the captured image, delete the captured image; and/or,
if the sharpness of the target image differs from that of the captured image, store the captured image in the imaging device.
The camera control apparatus provided by the embodiments of this application can execute the technical solutions shown in the above method embodiments; its implementation principles and beneficial effects are similar and are not repeated here.
This application further provides a mobile terminal device, the terminal device including a memory and a processor, the memory storing a camera control program which, when executed by the processor, implements the steps of the camera control method in any of the above embodiments.
This application further provides a computer-readable storage medium, the computer-readable storage medium storing a camera control program which, when executed by a processor, implements the steps of the camera control method in any of the above embodiments.
The embodiments of the mobile terminal and the computer-readable storage medium provided by this application contain all the technical features of the above embodiments of the camera control method; the expansions and explanations of the specification are essentially the same as for the above method embodiments and are not repeated here.
Embodiments of this application further provide a computer program product, the computer program product including computer program code which, when run on a computer, causes the computer to execute the methods in the various possible implementations above.
Embodiments of this application further provide a chip, including a memory and a processor, the memory being used to store a computer program and the processor being used to invoke and run the computer program from the memory, so that a device equipped with the chip executes the methods in the various possible implementations above.
The serial numbers of the above embodiments of this application are for description only and do not represent the merits of the embodiments.
In this application, identical or similar term concepts, technical solutions, and/or descriptions of application scenarios are generally described in detail only at their first occurrence; for brevity, later occurrences are generally not elaborated again, and when understanding the technical solutions of this application, the earlier detailed descriptions may be consulted for identical or similar term concepts, technical solutions, and/or application-scenario descriptions not detailed later.
In this application, the description of each embodiment has its own emphasis; for parts not detailed or recorded in one embodiment, refer to the related descriptions of other embodiments.
The technical features of the technical solutions of this application may be combined arbitrarily. For conciseness of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features is not contradictory, it should be considered within the scope recorded by this application.
From the description of the above implementations, a person skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, though in many cases the former is the better implementation. Based on this understanding, the technical solution of this application, in essence or in the part contributing to the prior art, can be embodied in the form of a software product: the computer software product is stored in a storage medium as above (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to cause a terminal device (which may be a mobile phone, a computer, a server, a controlled terminal, a network device, or the like) to execute the method of each embodiment of this application.
The above are only preferred embodiments of this application and do not thereby limit its patent scope; any equivalent structural or process transformation made using the contents of the specification and drawings of this application, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of this application.

Claims (10)

  1. A camera control method, comprising:
    S11: obtaining at least two images of a target object captured consecutively by an imaging device;
    S12: obtaining the sharpness of the at least two images; and
    S13: controlling, according to the sharpness of the at least two images, the imaging device to capture an image of the target object.
  2. The method according to claim 1, wherein S13 comprises:
    determining, from the sharpness of the at least two images, the sharpness state of the imaging device; and
    controlling, according to the sharpness state of the imaging device, the imaging device to capture an image of the target object.
  3. The method according to claim 2, wherein controlling, according to the sharpness state of the imaging device, the imaging device to capture an image of the target object comprises:
    determining, according to the sharpness state of the imaging device, whether a target image exists among the at least two images, the sharpness of the target image being a preset sharpness; and
    when it is determined that a target image exists among the at least two images, storing the target image in the imaging device.
  4. The method according to claim 3, wherein determining, according to the sharpness state of the imaging device, whether a target image exists among the at least two images comprises:
    if the sharpness state of the imaging device is a falling state, and the historical sharpness states of the imaging device include a within-highest-range state, determining that a target image exists among the at least two images.
  5. The method according to claim 3 or 4, further comprising:
    displaying, on a shooting interface of the imaging device, indication information corresponding to the sharpness state of the imaging device.
  6. The method according to claim 5, wherein the indication information corresponding to the sharpness state of the imaging device includes at least one of the following:
    if the sharpness state of the imaging device is a rising state, the indication information is determined to be an up arrow;
    if the sharpness state of the imaging device is a falling state, the indication information is determined to be a down arrow; and
    if the sharpness state of the imaging device is a within-highest-range state, the indication information is determined to be a capture frame.
  7. The method according to claim 5 or 6, further comprising:
    in response to a shooting instruction, obtaining a captured image corresponding to the shooting instruction; and
    processing the captured image according to the sharpness of the target image and the sharpness of the captured image.
  8. The method according to claim 7, wherein processing the captured image according to the sharpness of the target image and the sharpness of the captured image comprises:
    if the sharpness of the target image is the same as the sharpness of the captured image, deleting the captured image; and/or,
    if the sharpness of the target image is different from the sharpness of the captured image, storing the captured image in the imaging device.
  9. A mobile terminal, comprising a memory and a processor, wherein the memory stores a camera control program which, when executed by the processor, implements the steps of the camera control method according to any one of claims 1 to 8.
  10. A readable storage medium, storing a computer program which, when executed by a processor, implements the steps of the camera control method according to any one of claims 1 to 8.
PCT/CN2021/132163 2021-07-13 2021-11-22 Camera control method, mobile terminal and storage medium WO2023284218A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110791430.X 2021-07-13
CN202110791430.XA CN113542605A (zh) 2021-07-13 2021-07-13 Camera control method, mobile terminal and storage medium

Publications (1)

Publication Number Publication Date
WO2023284218A1 true WO2023284218A1 (zh) 2023-01-19

Family

ID=78098913

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/132163 WO2023284218A1 (zh) 2021-07-13 2021-11-22 摄像控制方法、移动终端及存储介质

Country Status (2)

Country Link
CN (1) CN113542605A (zh)
WO (1) WO2023284218A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113542605A (zh) * 2021-07-13 2021-10-22 深圳传音控股股份有限公司 摄像控制方法、移动终端及存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120086847A1 (en) * 2010-10-12 2012-04-12 Research In Motion Limited Convergence feedback indicator, provided when taking a picture in a camera application
US9549125B1 (en) * 2015-09-01 2017-01-17 Amazon Technologies, Inc. Focus specification and focus stabilization
CN106550183A (zh) * 2015-09-18 2017-03-29 维沃移动通信有限公司 Photographing method and apparatus
CN106851112A (zh) * 2017-03-21 2017-06-13 惠州Tcl移动通信有限公司 Photographing method and system for a mobile terminal
CN110035216A (zh) * 2018-01-11 2019-07-19 浙江宇视科技有限公司 Visualized semi-automatic focusing method and apparatus for a manual zoom lens
CN111314608A (zh) * 2020-02-24 2020-06-19 珠海市它物云科技有限公司 Image fixed-focus prompting method, computer apparatus and computer-readable storage medium
CN111970435A (zh) * 2020-08-03 2020-11-20 广东小天才科技有限公司 Macro photographing method and apparatus
CN113542605A (zh) * 2021-07-13 2021-10-22 深圳传音控股股份有限公司 Camera control method, mobile terminal and storage medium

Also Published As

Publication number Publication date
CN113542605A (zh) 2021-10-22


Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE