CN108156378B - Photographing method, mobile terminal and computer-readable storage medium - Google Patents


Info

Publication number: CN108156378B
Application number: CN201711455917.0A
Authority: CN (China)
Prior art keywords: aperture, camera, shape, photographing, depth
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN108156378A
Inventor: Liu Lei (刘磊)
Original and current assignee: Nubia Technology Co Ltd
Application filed by Nubia Technology Co Ltd, with priority to CN201711455917.0A; published as CN108156378A and granted as CN108156378B.

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; studio devices; studio equipment; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/225: Television cameras; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/232: Devices for controlling television cameras, e.g. remote control; control of cameras comprising an electronic image sensor
    • H04N 5/23229: Devices for controlling television cameras comprising an electronic image sensor, comprising further processing of the captured image without influencing the image pickup process
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; studio devices; studio equipment; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; cameras specially adapted for the electronic generation of special effects
    • H04N 5/2621: Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability

Abstract

The invention discloses a photographing method, a mobile terminal and a computer-readable storage medium. The photographing method comprises the following steps: while an object is photographed through a first camera, controlling the scaling of the aperture blades of a second camera to obtain a scaled aperture shape; controlling, through the scaled aperture blades, the amount of light passing through the lens, so that blurred light spots in the shape of the aperture are formed; collecting shooting data during the shooting process through the second camera, and calculating depth-of-field information from the shooting data; and distinguishing the subject from the background in the image captured by the first camera according to the depth-of-field information, to obtain an image with the aperture-shaped blurred light spots. The invention makes it more convenient, and less costly, to capture images with blurred light spots of a specific shape.

Description

Photographing method, mobile terminal and computer-readable storage medium
Technical Field
The present invention relates to the field of photography, and in particular, to a photographing method, a mobile terminal, and a computer-readable storage medium.
Background
With the popularization of mobile terminals such as smartphones, more and more photography enthusiasts shoot with their phone cameras, and the functions of single-lens reflex cameras and post-processing effects are increasingly concentrated on phones. For example, a user who wants a portrait with heart-shaped blurred light spots (bokeh) must either use a dedicated camera with a special lens, or achieve the effect through Photoshop-style post-processing. Because the existing approaches require either a camera with a specific lens or post-processing of the photos, the operation is cumbersome and the cost is high.
Disclosure of Invention
The main object of the present invention is to provide a photographing method, a mobile terminal and a computer-readable storage medium, aiming to solve the technical problems that the existing photographing approaches are cumbersome to operate and costly.
To achieve the above object, the present invention provides a photographing method applied to a mobile terminal having two cameras arranged on its back, the photographing method comprising:
while an object is photographed through the first camera, controlling the scaling of the aperture blades of the second camera to obtain a scaled aperture shape;
controlling, through the scaled aperture blades, the amount of light passing through the lens, so that blurred light spots in the shape of the aperture are formed;
collecting shooting data during the shooting process through the second camera, and calculating depth-of-field information from the shooting data;
and distinguishing the subject from the background in the image captured by the first camera according to the depth-of-field information, to obtain an image with the aperture-shaped blurred light spots.
Optionally, the step of controlling the scaling of the aperture blades of the second camera while an object is photographed through the first camera, to obtain a scaled aperture shape, comprises:
while an object is photographed through the first camera, popping up an aperture-shape selection window in the camera application, the selection window displaying selection controls for various shapes;
and when a touch on a selection control is detected in the selection window, determining the shape corresponding to that control, and controlling the scaling of the aperture blades of the second camera according to the determined shape to obtain the scaled aperture shape.
Optionally, the step of determining the shape corresponding to the selection control when a touch on it is detected in the selection window, and controlling the scaling of the aperture blades of the second camera according to the determined shape to obtain the scaled aperture shape, comprises:
when a touch on a selection control is detected in the selection window, determining the shape corresponding to that control and popping up an aperture-value selection frame;
and when a selected aperture value is received in the aperture-value selection frame, controlling the scaling of the aperture blades of the second camera according to the determined shape and the determined aperture value, to obtain the scaled aperture shape.
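For reference, the blade-scaling target follows from standard optics: the aperture value (f-number) N, the focal length f and the effective aperture diameter D are related by N = f / D, so the controller contracts the blades to an opening of D = f / N while holding the selected shape. A minimal sketch (the helper name is hypothetical):

```python
def target_aperture_diameter(focal_len_mm, f_number):
    """Effective opening D (in mm) for a given aperture value: D = f / N."""
    if f_number <= 0:
        raise ValueError("aperture value must be positive")
    return focal_len_mm / f_number

# A 4 mm phone lens at f/2.0 needs a 2.0 mm opening;
# stopping down to f/4.0 halves it to 1.0 mm.
```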
Optionally, the step of calculating the depth-of-field information from the shooting data comprises:
acquiring the aperture value, the focal length, the circle-of-confusion diameter and the shooting distance from the shooting data;
calculating the depth-of-field information from the aperture value, the focal length, the circle-of-confusion diameter and the shooting distance, where the calculation formulas are:
ΔL1 = FδL² / (f² + FδL)
ΔL2 = FδL² / (f² − FδL)
ΔL = ΔL1 + ΔL2 = 2f²FδL² / (f⁴ − F²δ²L²)
where ΔL denotes the total depth of field, ΔL1 the front depth of field, ΔL2 the rear depth of field, F the aperture value, f the focal length, δ the circle-of-confusion diameter, and L the shooting distance.
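The standard depth-of-field formulas referenced above can be written and checked with a short Python function (a sketch; the function name is illustrative, and all lengths share one unit):

```python
def depth_of_field(f_number, focal_len, coc, distance):
    """Front, rear and total depth of field per the standard formulas:
    dL1 = F*d*L^2 / (f^2 + F*d*L),  dL2 = F*d*L^2 / (f^2 - F*d*L).

    All lengths (focal_len, coc, distance and the results) are in mm.
    Valid while f^2 > F*d*L, i.e. the subject is inside the hyperfocal
    distance; beyond it the rear depth of field is effectively infinite.
    """
    F, f, d, L = f_number, focal_len, coc, distance
    front = F * d * L**2 / (f**2 + F * d * L)
    rear = F * d * L**2 / (f**2 - F * d * L)
    return front, rear, front + rear

# Example: f/2.0, a 4 mm focal length, a 0.005 mm circle of confusion and
# a subject at 500 mm give roughly 119 mm of focus in front and 227 mm behind.
```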
Optionally, the step of distinguishing the subject from the background in the image captured by the first camera according to the depth-of-field information, to obtain an image with the aperture-shaped blurred light spots, comprises:
distinguishing the subject from the background in the image captured by the first camera according to the depth values in the depth-of-field information;
and, in the distinguished background, fusing the aperture-shaped blurred light spots into the background to obtain an image with the aperture-shaped blurred light spots.
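Under simplifying assumptions (a per-pixel depth map from the second camera, a grayscale image, and blurred spots produced by convolving the background with an aperture-shaped kernel), these two steps can be sketched with NumPy; every name here is illustrative, not the patent's implementation:

```python
import numpy as np


def shaped_blur(image, kernel):
    """Naive 2-D convolution of the image with an aperture-shaped kernel."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(image, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + image.shape[0], j:j + image.shape[1]]
    return out


def apply_shaped_bokeh(image, depth_map, focus_depth, dof, kernel):
    """Keep the in-focus subject sharp; give the background shaped bokeh.

    Pixels whose depth lies within dof/2 of focus_depth count as subject;
    all others are replaced by the shaped-kernel blur of the image.
    """
    kernel = kernel / kernel.sum()                 # conserve brightness
    background = np.abs(depth_map - focus_depth) > dof / 2
    blurred = shaped_blur(image.astype(float), kernel)
    out = image.astype(float)
    out[background] = blurred[background]          # fuse into background only
    return out
```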
Optionally, the step of controlling the scaling of the aperture blades of the second camera while an object is photographed through the first camera, to obtain a scaled aperture shape, further comprises:
while an object is photographed through the first camera, judging whether the aperture is scalable;
and, if the aperture is scalable, controlling the scaling of the aperture blades of the second camera to obtain the scaled aperture shape.
Optionally, after the step of judging whether the aperture is scalable while an object is photographed through the first camera, the method further comprises:
if the aperture is not scalable, displaying a selection interface for the user to select the shape of the blurred light spots;
when a selected shape is received in the display interface, acquiring the pre-stored blurred light spot corresponding to that shape;
replacing the circular light spots formed by the current aperture with the pre-stored blurred light spots corresponding to the shape;
collecting shooting data during the shooting process through the second camera, and calculating depth-of-field information from the shooting data;
and distinguishing the subject from the background in the image captured by the first camera according to the depth-of-field information, to obtain an image with the pre-stored blurred light spots corresponding to the shape.
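This software fallback amounts to swapping the blur kernel: instead of the round spot the fixed aperture yields, the background is blurred with a pre-stored binary mask of the chosen shape. A toy sketch of the lookup (the 5x5 masks and names are illustrative, not the patent's stored assets):

```python
import numpy as np

# Pre-stored binary masks for the selectable spot shapes (toy 5x5 examples).
PRESTORED_SPOTS = {
    "circle": np.array([[0, 1, 1, 1, 0],
                        [1, 1, 1, 1, 1],
                        [1, 1, 1, 1, 1],
                        [1, 1, 1, 1, 1],
                        [0, 1, 1, 1, 0]], dtype=float),
    "heart":  np.array([[0, 1, 0, 1, 0],
                        [1, 1, 1, 1, 1],
                        [1, 1, 1, 1, 1],
                        [0, 1, 1, 1, 0],
                        [0, 0, 1, 0, 0]], dtype=float),
}


def spot_kernel(shape_name):
    """Return the normalised blur kernel for the selected spot shape,
    falling back to the circular spot the physical aperture produces."""
    mask = PRESTORED_SPOTS.get(shape_name, PRESTORED_SPOTS["circle"])
    return mask / mask.sum()
```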
Optionally, a component is installed inside the lens cover of the second camera, with a hole of a specific shape passing through the component, and the photographing method further comprises:
while an object is photographed through the first camera, controlling, through the aperture of the second camera, the amount of light passing through the lens, so that blurred light spots of the specific shape are formed;
collecting shooting data during the shooting process through the second camera, and calculating depth-of-field information from the shooting data;
and distinguishing the subject from the background in the image captured by the first camera according to the depth-of-field information, to obtain an image with the blurred light spots of the specific shape.
In addition, to achieve the above object, the present invention further provides a mobile terminal comprising a memory, a processor, and a photographing program stored in the memory and executable on the processor, the photographing program, when executed by the processor, implementing the steps of the photographing method described above.
Further, to achieve the above object, the present invention also provides a computer-readable storage medium storing a photographing program which, when executed by a processor, implements the steps of the photographing method described above.
In the photographing method of the invention, two cameras are arranged on the back of a mobile terminal. While an object is photographed through the first camera, the scaling of the aperture blades of the second camera is first controlled to obtain a scaled aperture shape; the scaled blades then control the amount of light passing through the lens so that blurred light spots in the aperture shape are formed; shooting data is collected during the shooting process through the second camera, and depth-of-field information is calculated from it; finally, the subject and the background in the image captured by the first camera are distinguished according to the depth-of-field information, to obtain an image with the aperture-shaped blurred light spots. By using the mobile terminal's dual cameras and aperture-blurring function to control the scaling of the aperture blades, the invention captures blurred-light-spot images with a specific aperture shape without post-processing, improving convenience, and without requiring a dedicated camera and a specific lens, reducing cost.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention;
FIG. 2 is a block diagram of the electrical structure of the camera of FIG. 1;
FIG. 3 is a schematic flowchart of a first embodiment of a photographing method according to the present invention;
FIG. 4 is a schematic diagram of the aperture blades according to the present invention;
FIG. 5 is a diagram illustrating the shooting data generated during the shooting process in the present invention;
FIG. 6 is a schematic diagram illustrating a detailed flow of step S40 in FIG. 3;
FIG. 7 is a flowchart illustrating a second embodiment of a photographing method according to the present invention;
FIG. 8 is a schematic diagram illustrating a detailed flow of step S12 in FIG. 7;
FIG. 9 is a schematic diagram of the contraction of the aperture blades corresponding to different aperture values in the present invention;
FIG. 10 is a schematic view of the component of the present invention with a hole of a specific shape formed through it;
FIG. 11 is a circular blurred-light-spot image photographed by a mobile terminal in the prior art;
FIG. 12 is a heart-shaped blurred-light-spot image captured using the dual cameras and the aperture-blurring function of the present invention.
The implementation, functional features and advantages of the present invention will be described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are used only to facilitate the description of the present invention and have no specific meaning in themselves. Thus "module", "component" and "unit" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the mobile terminal described in the present invention may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a personal digital assistant (PDA), a portable media player (PMP), a navigation device, a wearable device, a smart band and a pedometer, as well as fixed terminals such as a digital TV and a desktop computer.
The following description takes a mobile terminal as an example; those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to fixed-type terminals.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
The radio frequency unit 101 may be used to receive and transmit signals during information transmission and reception or during a call; specifically, it receives downlink information from a base station and delivers it to the processor 110 for processing, and transmits uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplex Long Term Evolution) and TDD-LTE (Time Division Duplex Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user receive and send e-mail, browse web pages, access streaming media and the like, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is understood that it is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals and may include a camera 1041 and a microphone 1042. The camera 1041 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on the display unit 106. The image frames processed by the camera 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In a phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, can collect touch operations performed by a user on or near it (e.g., operations performed with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. The touch panel 1071 may include a touch detection device and a touch controller: the touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information, converts it into touch-point coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys and a switch key), a trackball, a mouse, and a joystick.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, application programs required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the phone (such as audio data and a phonebook). Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
Further, in the mobile terminal shown in fig. 1, the memory 109 stores a photographing program that runs on the processor 110; the mobile terminal calls the photographing program stored in the memory 109 through the processor 110 and performs the following operations:
while an object is photographed through the first camera, controlling the scaling of the aperture blades of the second camera to obtain a scaled aperture shape;
controlling, through the scaled aperture blades, the amount of light passing through the lens, so that blurred light spots in the shape of the aperture are formed;
collecting shooting data during the shooting process through the second camera, and calculating depth-of-field information from the shooting data;
and distinguishing the subject from the background in the image captured by the first camera according to the depth-of-field information, to obtain an image with the aperture-shaped blurred light spots.
Further, the mobile terminal calls the photographing program stored in the memory 109 through the processor 110, and further implements the following steps of controlling the scaling of the aperture blades of the second camera while an object is photographed through the first camera, to obtain a scaled aperture shape:
while an object is photographed through the first camera, popping up an aperture-shape selection window in the camera application, the selection window displaying selection controls for various shapes;
and when a touch on a selection control is detected in the selection window, determining the shape corresponding to that control, and controlling the scaling of the aperture blades of the second camera according to the determined shape to obtain the scaled aperture shape.
Further, the mobile terminal calls the photographing program stored in the memory 109 through the processor 110, and further implements the following steps of determining the shape corresponding to the selection control when a touch on it is detected in the selection window, and controlling the scaling of the aperture blades of the second camera according to the determined shape to obtain the scaled aperture shape:
when a touch on a selection control is detected in the selection window, determining the shape corresponding to that control and popping up an aperture-value selection frame;
and when a selected aperture value is received in the aperture-value selection frame, controlling the scaling of the aperture blades of the second camera according to the determined shape and the determined aperture value, to obtain the scaled aperture shape.
Further, the mobile terminal calls the photographing program stored in the memory 109 through the processor 110, and further implements the following steps of calculating the depth-of-field information from the shooting data:
acquiring the aperture value, the focal length, the circle-of-confusion diameter and the shooting distance from the shooting data;
calculating the depth-of-field information from the aperture value, the focal length, the circle-of-confusion diameter and the shooting distance, where the calculation formulas are:
ΔL1 = FδL² / (f² + FδL)
ΔL2 = FδL² / (f² − FδL)
ΔL = ΔL1 + ΔL2 = 2f²FδL² / (f⁴ − F²δ²L²)
where ΔL denotes the total depth of field, ΔL1 the front depth of field, ΔL2 the rear depth of field, F the aperture value, f the focal length, δ the circle-of-confusion diameter, and L the shooting distance.
Further, the mobile terminal calls the photographing program stored in the memory 109 through the processor 110, and further implements the following steps of distinguishing the subject from the background in the image captured by the first camera according to the depth-of-field information, to obtain an image with the aperture-shaped blurred light spots:
distinguishing the subject from the background in the image captured by the first camera according to the depth values in the depth-of-field information;
and, in the distinguished background, fusing the aperture-shaped blurred light spots into the background to obtain an image with the aperture-shaped blurred light spots.
Further, the mobile terminal calls the photographing program stored in the memory 109 through the processor 110, and further implements the following steps of controlling the scaling of the aperture blades of the second camera while an object is photographed through the first camera, to obtain a scaled aperture shape:
while an object is photographed through the first camera, judging whether the aperture is scalable;
and, if the aperture is scalable, controlling the scaling of the aperture blades of the second camera to obtain the scaled aperture shape.
Further, after the step of determining whether the aperture is scalable when the first camera captures the object, the mobile terminal calls the photographing program stored in the memory 109 through the processor 110, and further implements the following steps:
if the aperture is not scalable, displaying a selection interface for the user to select the shape of the blurring light spot;
when the selected shape is received in a display interface, acquiring a pre-stored blurring light spot corresponding to the shape;
replacing the circular light spot formed by the current aperture with the pre-stored virtual light spot corresponding to the shape;
acquiring shooting data in a shooting process through a second camera, and calculating depth-of-field information according to the shooting data;
and distinguishing a main body and a background in the image shot by the first camera according to the depth of field information to obtain an image with the pre-stored blurring light spots corresponding to the shape.
Further, a component is installed inside the lens cover of the second camera, a hole with a specific shape is formed in the component, the mobile terminal calls the photographing program stored in the memory 109 through the processor 110, and the following steps are also implemented:
when an object is shot through the first camera, the light passing amount of light passing through the lens is controlled through the second camera aperture so as to form a virtual light spot in a specific shape;
acquiring shooting data in a shooting process through a second camera, and calculating depth-of-field information according to the shooting data;
and distinguishing a main body and a background in the image shot by the first camera according to the depth of field information to obtain an image with the blurring light spot with the specific shape.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
Referring to fig. 2, fig. 2 is a block diagram of an electrical structure of the camera of fig. 1.
The photographing lens 10411 is composed of a plurality of optical lenses for forming an object image, and is a single focus lens or a zoom lens. The photographing lens 10411 is movable in the optical axis direction under the control of a lens driver 10421, and the lens driver 10421 controls the focal position of the photographing lens 10411 based on a control signal from a lens driving control circuit 10422, and in the case of a zoom lens, the focal distance may be controlled. The lens drive control circuit 10422 performs drive control of the lens driver 10421 in accordance with a control command from the microcomputer 10417.
An imaging element 10412 is disposed on the optical axis of the photographing lens 10411 in the vicinity of the position of the object image formed by the photographing lens 10411. The imaging element 10412 is used to capture an object image and acquire captured image data. Photodiodes constituting each pixel are two-dimensionally arranged in a matrix on the imaging element 10412. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and the photoelectric conversion current is charged by a capacitor connected to each photodiode. A bayer RGB color filter is disposed on the front surface of each pixel.
The image pickup device 10412 is connected to an image pickup circuit 10413, and the image pickup circuit 10413 performs charge accumulation control and image signal reading control in the image pickup device 10412, reduces reset noise in the read image signal (analog image signal), performs waveform shaping, and further performs gain improvement or the like so as to obtain an appropriate signal level. The image pickup circuit 10413 is connected to an a/D converter 10414, and the a/D converter 10414 performs analog-to-digital conversion on an analog image signal and outputs a digital image signal (hereinafter referred to as image data) to the bus 10427.
The bus 10427 is a transfer path for transferring various data read out or generated inside the camera. The a/D converter 10414, an image processor 10415, a JPEG processor 10416, a microcomputer 10417, an SDRAM (Synchronous Dynamic random access memory) 10418, a memory interface (hereinafter referred to as memory I/F)10419, and an LCD (Liquid Crystal Display) driver 10420 are connected to the bus 10427.
The image processor 10415 performs various image processes such as OB subtraction processing, white balance adjustment, color matrix operation, gamma conversion, color difference signal processing, noise removal processing, synchronization processing, and edge processing on image data output from the image pickup device 10412. When the JPEG processor 10416 records image data in the recording medium 10425, the image data read out from the SDRAM10418 is compressed in accordance with the JPEG compression method. The JPEG processor 10416 decompresses JPEG image data for image reproduction and display. When decompression is performed, a file recorded in the recording medium 10425 is read out, decompression processing is performed in the JPEG processor 10416, and the decompressed image data is temporarily stored in the SDRAM10418 and displayed on the LCD 10426. In the present embodiment, the JPEG system is used as the image compression/decompression system, but the compression/decompression system is not limited to this, and other compression/decompression systems such as MPEG, TIFF, and h.264 may be used.
The microcomputer 10417 functions as a control unit for the entire camera, and collectively controls various processing sequences of the camera. The microcomputer 10417 is connected to an operation unit 10423 and a flash memory 10424.
The operation unit 10423 includes, but is not limited to, physical or virtual keys, such as a power button, a photographing key, an editing key, a moving image button, a reproduction button, a menu button, a cross key, an OK button, a delete button, an enlargement button and other input controls, and detects the operation states of these controls.
The detection result is output to the microcomputer 10417. A touch panel is provided on the front surface of the LCD10426 as a display, and detects a touch position of the user and outputs the touch position to the microcomputer 10417. The microcomputer 10417 executes various processing sequences corresponding to the user's operation according to the detection result of the operation position from the operation unit 10423.
The flash memory 10424 stores a program for executing various processing sequences of the microcomputer 10417. The microcomputer 10417 controls the entire camera according to the program. The flash memory 10424 stores various adjustment values of the camera, and the microcomputer 10417 reads the adjustment values and controls the camera in accordance with the adjustment values.
The SDRAM10418 is an electrically rewritable volatile memory for temporarily storing image data and the like. The SDRAM10418 temporarily stores the image data output from the a/D converter 10414 and the image data processed by the image processor 10415, JPEG processor 10416, and the like.
The memory interface 10419 is connected to the recording medium 10425, and performs control for writing image data and data such as a header added to the image data to the recording medium 10425 and reading the image data from the recording medium 10425. The recording medium 10425 is, for example, a recording medium such as a memory card that can be attached to and detached from the camera body, but is not limited thereto, and may be a hard disk or the like built in the camera body.
The LCD driver 10420 is connected to the LCD 10426. The image data processed by the image processor 10415 is stored in the SDRAM 10418 and, when display is necessary, is read out and displayed on the LCD 10426. Alternatively, the image data compressed by the JPEG processor 10416 is stored in the SDRAM 10418; when display is necessary, the JPEG processor 10416 reads the compressed image data from the SDRAM 10418, decompresses it, and the decompressed image data is displayed on the LCD 10426.
The LCD 10426 is disposed on the back surface of the camera body and displays images. The display is not limited to an LCD; various other display panels, such as organic EL panels, may be used instead.
Based on the above mobile terminal hardware structure and communication network system, the present invention provides various embodiments of the photographing method.
Referring to fig. 3, fig. 3 is a flowchart illustrating a first embodiment of a photographing method according to the present invention.
In this embodiment, the photographing method is applied to a mobile terminal, two cameras are arranged on the back of the mobile terminal, and the photographing method includes:
step S10, when the object is shot by the first camera, the zooming of the grating blade in the aperture of the second camera is controlled to obtain the zoomed aperture shape;
step S20, controlling the light transmission amount of the light through the lens by the zoomed grating blades to form the blurring light spot in the aperture shape;
step S30, collecting shooting data in the shooting process through a second camera, and calculating depth of field information according to the shooting data;
and step S40, distinguishing a main body and a background in the image shot by the first camera according to the depth of field information to obtain an image with the blurred light spots in the aperture shape.
In this embodiment, the photographing method may optionally be applied to a mobile terminal, which may be the mobile terminal illustrated in fig. 1. Two cameras are arranged on the back of the mobile terminal, and optionally one camera on the front. With the two rear cameras, the mobile terminal shoots an object through the first camera and collects shooting data during the shooting process through the second camera to calculate depth of field information. Which camera serves which role is not limited: either of the two rear cameras may be used as the first camera, with the other serving as the second camera. The distance between the two rear cameras is set according to actual conditions and is not limited here. Because the positions of the two cameras are adjacent, the images actually shot by the two cameras differ little; therefore, the depth of field information calculated from the shooting data of the second camera can be used to adjust the image shot by the first camera, and the adjustment result is accurate.
The following are the specific steps of implementing photographing in this embodiment:
step S10, when the object is shot by the first camera, the zooming of the grating blade in the aperture of the second camera is controlled to obtain the zoomed aperture shape;
in this embodiment, when shooting is performed through the first camera that is turned on, if no object is captured in the lens of the first camera, shooting continues until an object is captured; the mobile terminal then controls the zooming of the aperture in the second camera so that the grating blades in the aperture zoom. The objects include, but are not limited to, human bodies, animals and scenes. It should be noted that, in the aperture according to the embodiment of the present invention, the shapes and numbers of the grating blades are variable; that is, the number and shape of the grating blades can be set according to actual needs, and apertures with different aperture shapes can be obtained with grating blades of different shapes and numbers. The shape of a grating blade may be a regular sector or an irregular sector, and the number of blades may be, for example, 6 or 7; the final aperture shapes differ accordingly. For example, referring to fig. 4, when the aperture has fan-shaped grating blades and the number of blades is 7, the aperture is formed in a hexagonal shape.
Step S20, controlling the light transmission amount of the light through the lens by the zoomed grating blades to form the blurring light spot in the aperture shape;
in this embodiment, after the mobile terminal controls the zooming of the grating blade in the second camera aperture, the amount of light passing through the lens can be controlled by the zoomed grating blade to form the aperture-shaped blurred light spot. Specifically, the step S20 includes:
step a, controlling light rays to enter an inner light sensing surface of a second camera through a lens by virtue of a zoomed grating blade;
and b, obtaining a virtual light spot in the shape of the aperture based on the shape formed by the light sensed by the inner light sensing surface.
In this embodiment, when the mobile terminal controls the light to pass through by the zoomed grating blade, the light is controlled to pass through the lens and enter the inner light sensing surface of the second camera, and then the shape formed by the light is sensed on the inner light sensing surface.
As described in detail above, grating blades of different shapes and numbers form different aperture shapes. Therefore, after the zooming of the grating blades is controlled, the light transmission amount of the light passing through the lens can be controlled by the zoomed grating blades, so that the blurring light spot in the aperture shape is formed by the transmitted light. Taking fig. 4 as an example, light cannot pass through the positions of the 7 grating blades, but can pass through the hexagonal opening formed by them; therefore, when light passes the grating blades, a hexagonal blurring light spot is formed. It is understood that when the grating blades retract toward the center of the aperture, less light passes through the lens and the aperture, and the aperture shape formed is smaller; when the grating blades expand toward the edge of the aperture, more light passes through the lens and the aperture.
Step S30, collecting shooting data in the shooting process through a second camera, and calculating depth of field information according to the shooting data;
in this embodiment, after controlling the light transmission amount through the zoomed grating blades to form the aperture-shaped blurring light spot, the mobile terminal collects shooting data during the shooting process through the second camera and calculates the depth of field information according to the collected shooting data. Specifically, the step S30 includes:
c, acquiring shooting data in the shooting process through a second camera, and caching the shooting data;
and d, calculating the depth of field information according to the cached shooting data.
The mobile terminal collects shooting data during the shooting process through the second camera and caches the shooting data, optionally in a preset storage area, which is not limited here; the depth of field information is then calculated according to the cached shooting data. In this embodiment, the step of calculating depth of field information from the shooting data includes:
step e, acquiring an aperture value, a focal length, a circle of confusion diameter and a shooting distance in the shooting data;
f, calculating the depth of field information according to the aperture value, the focal length, the circle of confusion diameter and the shooting distance, wherein the calculation formulas are as follows:

ΔL1 = FδL² / (f² + FδL)
ΔL2 = FδL² / (f² − FδL)
ΔL = ΔL1 + ΔL2 = 2f²FδL² / (f⁴ − F²δ²L²)

where ΔL indicates the depth of field, ΔL1 indicates the front depth of field, ΔL2 indicates the rear depth of field, F indicates the aperture value, f indicates the focal length, δ indicates the circle of confusion diameter, and L indicates the shooting distance.
It should be noted that the aperture value refers to the current aperture value of the second camera; when the aperture of the second camera is adjusted, the aperture value changes accordingly, so when the mobile terminal controls the zooming of the aperture of the second camera, the aperture value can be determined.
In this embodiment, during the shooting process, the mobile terminal obtains the aperture value, the focal length, the circle of confusion diameter and the shooting distance in the shooting data through the second camera; for the focal length, the circle of confusion diameter, the shooting distance and other parameters, refer to fig. 5.
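As a minimal sketch (not part of the patent), the classical depth-of-field relations among the aperture value F, focal length f, circle of confusion diameter δ and shooting distance L can be computed directly; the 50 mm / f/2 lens parameters below are illustrative assumptions:

```python
def depth_of_field(F, f, delta, L):
    """Classical depth-of-field relations.

    F     -- aperture value (f-number)
    f     -- focal length (same length unit as delta and L, e.g. mm)
    delta -- circle of confusion diameter
    L     -- shooting (focus) distance
    Returns (front_dof, rear_dof, total_dof).
    """
    front = F * delta * L ** 2 / (f ** 2 + F * delta * L)  # front depth of field
    rear = F * delta * L ** 2 / (f ** 2 - F * delta * L)   # rear depth of field
    return front, rear, front + rear                       # total depth of field

# Illustrative example: 50 mm lens at f/2, CoC 0.03 mm, subject at 3 m (3000 mm)
dl1, dl2, dl = depth_of_field(F=2.0, f=50.0, delta=0.03, L=3000.0)
```

As expected, the rear depth of field exceeds the front one, and raising F (stopping down) widens both.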
And step S40, distinguishing a main body and a background in the image shot by the first camera according to the depth of field information to obtain an image with the blurred light spots in the aperture shape.
In the present embodiment, after the depth of field information is calculated, the subject and the background in the image shot by the first camera are distinguished according to the depth of field information to obtain an image with the blurring light spots in the aperture shape. Specifically, referring to fig. 6, the step S40 includes:
step S41, distinguishing a main body and a background in the image shot by the first camera according to the depth of field value of the depth of field information;
and step S42, fusing the blurred light spots in the aperture shape into the background in the distinguished background to obtain an image with the blurred light spots in the aperture shape.
In this embodiment, after the depth of field information is calculated from the shooting data collected by the second camera, the depth of field value of the depth of field information is determined, and then the main body and the background are distinguished in the image shot by the first camera according to the depth of field value of the depth of field information.
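The two steps above (separating subject from background by depth, then fusing aperture-shaped spots into the background) can be sketched in pure NumPy. The function name, the depth-threshold rule and the kernel normalization are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def shaped_bokeh(image, depth_map, focus_depth, dof, kernel):
    """Split subject/background by depth, then render aperture-shaped
    bokeh in the background by convolving it with a shaped kernel.

    image       -- H x W x 3 float array (shot by the first camera)
    depth_map   -- H x W per-pixel depth (from the second camera's data)
    focus_depth -- depth of the focused subject
    dof         -- depth-of-field range; pixels outside it count as background
    kernel      -- small 2-D mask of the aperture shape (e.g. a hexagon)
    """
    kernel = kernel / kernel.sum()                     # conserve brightness
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    background = np.abs(depth_map - focus_depth) > dof / 2.0
    padded = np.pad(image, ((ph, ph), (pw, pw), (0, 0)), mode="edge")
    out = image.copy()
    for y in range(depth_map.shape[0]):
        for x in range(depth_map.shape[1]):
            if background[y, x]:                       # blur background only
                patch = padded[y:y + kh, x:x + kw]
                out[y, x] = np.tensordot(kernel, patch, axes=([0, 1], [0, 1]))
    return out
```

Subject pixels are passed through untouched; each background pixel becomes a kernel-weighted average, so a defocused point light takes on the kernel's shape.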
According to the photographing method provided by this embodiment, two cameras are arranged on the back of the mobile terminal. When an object is shot through the first camera, the zooming of the grating blades in the aperture of the second camera is first controlled to obtain the zoomed aperture shape; then the light transmission amount of the light passing through the lens is controlled by the zoomed grating blades to form the blurring light spot in the aperture shape; shooting data during the shooting process are collected through the second camera, and depth of field information is calculated according to the shooting data; finally, the subject and the background in the image shot by the first camera are distinguished according to the depth of field information to obtain an image with the blurring light spot in the aperture shape. The invention utilizes the dual cameras and the aperture blurring function of the mobile terminal to control the zooming of the grating blades in the aperture, so as to shoot blurring light spot images with a specific aperture shape. No post-processing is needed, which improves the convenience of shooting blurring light spot images of a specific shape; nor is a dedicated camera or specific lens required, which reduces the cost.
Further, a second embodiment of the photographing method of the present invention is proposed based on the first embodiment.
The second embodiment of the photographing method is different from the first embodiment of the photographing method in that, in the present embodiment, referring to fig. 7, the step S10 includes:
step S11, when an object is shot through the first camera, popping up a selection window for the aperture shape in the camera application, wherein selection controls of various shapes are displayed in the selection window;
step S12, when a touched selection control is detected in the selection window, determining the shape corresponding to the selection control, and controlling the zooming of the grating blades in the aperture of the second camera according to the determined shape to obtain the zoomed aperture shape.
In this embodiment, the various shapes include, but are not limited to: circle, square, heart, pentagram and tree. All of these are realized by adjusting the shapes and the number of the grating blades; that is, by adjusting the shapes and the number of the grating blades, different aperture shapes can be presented. When an object is photographed through the first camera, the mobile terminal pops up a selection window for the aperture shape in the camera application, in which the various aperture shapes are displayed through selection controls, each selection control corresponding to one aperture shape. When the selection control touched by the user is detected in the selection window, the shape corresponding to that selection control is determined, then the preset aperture value corresponding to the shape is determined, and finally the zooming of the grating blades in the aperture of the second camera is controlled according to the preset aperture value corresponding to the shape. That is, in this embodiment, the aperture value corresponding to each shape is set in advance, so that when the zooming of the grating blades in the aperture is controlled according to the determined shape, the zooming of the grating blades can be controlled according to the preset aperture value, so as to obtain the aperture shape corresponding to that aperture value.
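The shape-to-aperture-value presets described above amount to a simple lookup table. The shape names and values below are purely illustrative stand-ins, not values from the patent:

```python
# Hypothetical preset table: each selectable shape maps to a preset
# aperture value; the names and values here are illustrative only.
SHAPE_PRESETS = {
    "circle": 2.0,
    "hexagon": 2.8,
    "heart": 4.0,
    "pentagram": 5.6,
}

def on_shape_selected(shape):
    """Return the preset aperture value used to drive the blade zoom."""
    if shape not in SHAPE_PRESETS:
        raise ValueError(f"no preset aperture value for shape {shape!r}")
    return SHAPE_PRESETS[shape]
```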
Further, in order to improve the accuracy of the grating blade control, referring to fig. 8, the step S12 includes:
step S121, when a touched selection control is detected in the selection window, determining the shape corresponding to the selection control, and popping up a selection frame of aperture values;
and S122, when the selected aperture value is received in the aperture value selection frame, controlling the zooming of the grating blade in the aperture of the second camera according to the determined shape and the determined aperture value so as to obtain the zoomed aperture shape.
That is, in this embodiment, the mobile terminal sets in advance the aperture values corresponding to the grating blades of different shapes, and displays the set aperture values in the selection frame for the user to select.
For better understanding, the following are exemplified:
referring to fig. 9, the aperture inside the second camera is originally circular, and the mobile terminal sets seven fan-shaped grating blades in the aperture in advance. When the mobile terminal controls the seven fan-shaped grating blades to be fully expanded, the aperture is circular; when the mobile terminal controls the seven fan-shaped grating blades to retract, the seven fan-shaped blades begin to gather toward the center of the aperture, and a hexagon is formed in the aperture. As can be seen from fig. 9, the further the seven fan-shaped blades retract toward the center of the aperture, the smaller the hexagonal aperture shape becomes. Note that different degrees of blade zooming also produce different aperture values: in fig. 9, the aperture value of the first aperture is 2, the aperture value of the second aperture is 2.8, and the aperture value of the last aperture is 22.
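The aperture-value series in fig. 9 follows the usual f-number definition N = f / D, where D is the effective opening diameter left by the blades. A minimal sketch, with illustrative lens values not taken from the patent:

```python
def aperture_value(focal_length_mm, opening_diameter_mm):
    # f-number N = f / D: as the blades retract toward the center, the
    # opening D shrinks and the aperture value rises (2 -> 2.8 -> ... -> 22)
    return focal_length_mm / opening_diameter_mm

# Illustrative 50 mm lens: a 25 mm opening gives f/2,
# while a roughly 2.3 mm opening gives about f/22.
```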
Therefore, when the mobile terminal shoots an object through the first camera, it first pops up a selection window for the aperture shape in the camera application, in which selection controls of various shapes are displayed. If a touched selection control is detected in the selection window, the shape corresponding to the selection control is determined and a selection frame of aperture values is popped up; if a selected aperture value is received in the selection frame, the zooming of the grating blades in the aperture of the second camera is controlled according to the determined shape and the determined aperture value, so as to obtain the zoomed aperture shape.
In this embodiment, by selecting both the aperture shape and the aperture value, the mobile terminal controls the zooming of the grating blades in the aperture of the second camera according to the selected aperture shape and aperture value, so as to obtain the aperture shape at that aperture value. When light is subsequently controlled to pass through the aperture, a blurring light spot corresponding to the aperture shape at that aperture value can be obtained, yielding a blurring light spot image with that aperture shape; this improves the convenience and intelligence of shooting blurring light spot images.
Further, a third embodiment of the photographing method of the present invention is proposed based on the first or second embodiment.
The third embodiment of the photographing method is different from the first or second embodiment of the photographing method in that, in this embodiment, the step S10 further includes:
step g, when the object is shot through the first camera, judging whether the aperture is scalable;
and h, if the aperture is scalable, controlling the scaling of the grating blades in the aperture of the second camera to obtain the scaled aperture shape.
In this embodiment, when an object is initially captured through the first camera, it is first identified whether the aperture built into the mobile terminal is scalable. If the aperture is scalable, the mobile terminal controls the zooming of the grating blades in the aperture of the second camera; the manner in which the mobile terminal controls this zooming is consistent with the above embodiments and is not described again here.
In addition, the mobile terminal may optionally, when shooting an object through the first camera, judge whether the aperture is scalable;
if the aperture is scalable, popping up a selection window for the aperture shape in the camera application, wherein selection controls of various shapes are displayed in the selection window;
and when a touched selection control is detected in the selection window, determining the shape corresponding to the selection control, and controlling the zooming of the grating blades in the aperture of the second camera according to the determined shape to obtain the zoomed aperture shape.
In this embodiment, when an object is photographed by the first camera, it is determined whether the aperture is scalable, if the aperture is scalable, the scaling of the grating blade in the aperture of the second camera is controlled, and if the aperture is not scalable, the light transmission amount of the light is controlled according to the fixed aperture, so that the mobile terminal displays the circular blurring light spot in the image, thereby improving the accuracy of the circular blurring light spot.
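The branch described in the third and fourth embodiments (use blade zoom when the aperture is scalable, otherwise fall back to a pre-stored spot shape) can be sketched as control flow. The `FakeCamera` driver and the pre-stored spot table are hypothetical stand-ins, not the patent's API:

```python
# Stand-in bitmaps for pre-stored blurring light spots (fourth embodiment)
PRESTORED_SPOTS = {"heart": "<heart bitmap>", "star": "<star bitmap>"}

class FakeCamera:
    """Hypothetical stand-in for the second-camera driver."""
    def __init__(self, scalable):
        self.scalable = scalable
        self.blade_shape = None
        self.replacement_spot = None
    def aperture_is_scalable(self):
        return self.scalable
    def zoom_blades(self, shape):
        self.blade_shape = shape
    def set_replacement_spot(self, spot):
        self.replacement_spot = spot

def prepare_bokeh(camera, shape):
    if camera.aperture_is_scalable():
        camera.zoom_blades(shape)                  # shaped optical bokeh
        return "optical"
    # fixed aperture: keep the circular optical spot, substitute in software
    camera.set_replacement_spot(PRESTORED_SPOTS[shape])
    return "software"
```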
Further, a fourth embodiment of the photographing method of the present invention is proposed based on the third embodiment.
The fourth embodiment of the photographing method differs from the third embodiment of the photographing method in that after the step g, the method further comprises:
step i, if the aperture is not scalable, displaying a selection interface for the user to select the shape of the blurring light spot;
step j, when the selected shape is received in a display interface, acquiring a pre-stored blurring light spot corresponding to the shape;
step k, replacing the circular light spot formed by the current aperture with the pre-stored virtual light spot corresponding to the shape;
step l, acquiring shooting data in a shooting process through a second camera, and calculating depth-of-field information according to the shooting data;
and m, distinguishing a main body and a background in the image shot by the first camera according to the depth of field information to obtain an image with the pre-stored blurring light spots corresponding to the shape.
In this embodiment, when the mobile terminal detects that the aperture is not scalable, this indicates that the aperture built into the camera is fixed, so the blurring light spot formed in the captured image can only be a circular light spot. To display blurring light spots of other shapes in subsequently photographed images, a selection interface is displayed in which the shapes of various blurring light spots are shown; after the user selects any one of the shapes in the interface, the mobile terminal obtains the pre-stored blurring light spot of the selected shape from the database. It should be noted that, in this embodiment, the mobile terminal stores blurring light spots of various shapes in advance; therefore, after the pre-stored blurring light spot is obtained, it can be used to replace the circular light spot formed by the current aperture. In the subsequent shooting process, the depth of field information is calculated according to the shooting data collected by the second camera, the subject and the background of the image shot by the first camera are distinguished according to the depth of field information, and an image with the pre-stored blurring light spot corresponding to the selected shape is obtained.
In this embodiment, when the aperture inside the camera is not adjustable, the circular blurring light spot in subsequently shot images is replaced by one of the blurring light spots of various shapes prepared in advance, so that the shot image can have blurring light spots of different shapes rather than only circular ones. This improves the shooting flexibility of the mobile terminal, as well as the convenience and intelligence of shooting blurring light spot images of different shapes.
Further, a fifth embodiment of the photographing method of the present invention is proposed based on the first to fourth embodiments.
The fifth embodiment of the photographing method differs from the first to fourth embodiments in that a component is installed inside the lens cover of the second camera, and a hole with a specific shape is formed through the component. The photographing method further includes:
step n, when an object is shot through the first camera, controlling the light transmission amount of the light passing through the lens through the aperture of the second camera to form a blurring light spot of the specific shape;
step o, acquiring shooting data in the shooting process through a second camera, and calculating depth-of-field information according to the shooting data;
and p, distinguishing a main body and a background in the image shot by the first camera according to the depth of field information to obtain an image with the blurring light spot with the specific shape.
In this embodiment, besides controlling the mobile terminal to capture blurred-spot images of different shapes in the two manners described above, the lens cover of the second camera may be prepared in advance. Specifically, a component provided with a hole of a specific shape is set inside the lens cover of the second camera; the component may be an aluminum sheet, a light-blocking sheet, or black card paper. Taking black card paper as an example, a hole of a specific shape is cut in its center; as shown in fig. 10, the center of the black card paper carries a heart-shaped cutout. The black card paper is placed inside the lens cover of the second camera, so that when light passes through the lens of the second camera, the light passing through the black card paper forms a heart-shaped blurring light spot. The mobile terminal then collects shooting data with the second camera, calculates depth-of-field information from the shooting data, and finally distinguishes the subject and the background in the image captured by the first camera according to the depth-of-field information, obtaining an image with the blurring light spot of the specific shape.
In this embodiment, capturing blurred-spot images of different shapes is achieved on the hardware side: there is no need to adjust the zooming of the grating blades in the built-in aperture, nor to build in blurring light spots of various shapes, yet the captured images still carry blurring light spots of various shapes. This improves the flexibility and convenience of image capture and also reduces cost.
For a better understanding of this embodiment, refer to figs. 11-12: fig. 11 shows the circular blurred-spot image captured by a conventional mobile terminal, while fig. 12 shows how, in the embodiment of the present invention, the zooming of the grating blades in the aperture is controlled through the dual cameras and the aperture-blurring function, so as to capture a blurred-spot image with a specific heart shape.
In addition, the embodiment of the invention also provides a computer readable storage medium.
The computer-readable storage medium is built into the mobile terminal shown in fig. 1 and stores a photographing program which, when executed by a processor, implements the steps of the photographing method described above.
The specific implementation of the computer-readable storage medium in this embodiment is substantially the same as that of the above embodiments of the photographing method, and details are not described here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a mobile terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A photographing method, characterized in that it is applied to a mobile terminal having two cameras arranged on its back, the photographing method comprising:
when an object is photographed through the first camera, controlling the zooming of the grating blades in the aperture of the second camera to obtain a zoomed aperture shape;
controlling, through the zoomed grating blades, the amount of light passing through the lens, so as to form a blurring light spot in the aperture shape;
acquiring shooting data in the shooting process through the second camera, and calculating depth-of-field information according to the shooting data;
distinguishing a subject and a background in the image shot by the first camera according to the depth-of-field information, to obtain an image with the aperture-shaped blurring light spot;
wherein the step of controlling, through the zoomed grating blades, the amount of light passing through the lens to form the aperture-shaped blurring light spot comprises:
controlling, through the zoomed grating blades, light to enter an inner photosensitive surface of the second camera through the lens;
and obtaining the aperture-shaped blurring light spot based on the shape formed by the light sensed on the inner photosensitive surface.
2. The photographing method according to claim 1, wherein the step of controlling the zoom of the grating blade in the aperture of the second camera to obtain the zoomed aperture shape when the object is photographed by the first camera comprises:
when an object is photographed through the first camera, popping up an aperture-shape selection window in the camera application, wherein selection controls of various shapes are displayed in the selection window;
and when a touch on a selection control is detected in the selection window, determining the shape corresponding to the selection control, and controlling the zooming of the grating blades in the aperture of the second camera according to the determined shape, to obtain the zoomed aperture shape.
3. The photographing method of claim 2, wherein the step of, when a touch on a selection control is detected in the selection window, determining the shape corresponding to the selection control and controlling the zooming of the grating blades in the aperture of the second camera according to the determined shape to obtain the zoomed aperture shape comprises:
when a touch on a selection control is detected in the selection window, determining the shape corresponding to the selection control, and popping up an aperture-value selection frame;
and when a selected aperture value is received in the aperture-value selection frame, controlling the zooming of the grating blades in the aperture of the second camera according to the determined shape and the selected aperture value, so as to obtain the zoomed aperture shape.
4. The photographing method according to claim 1, wherein the step of calculating depth information from the photographing data includes:
acquiring an aperture value, a focal length, a circle of confusion diameter and a shooting distance in shooting data;
calculating the depth-of-field information according to the aperture value, the focal length, the circle-of-confusion diameter and the shooting distance, wherein the calculation formulas are:

ΔL1 = FδL² / (f² + FδL),  ΔL2 = FδL² / (f² − FδL),  ΔL = ΔL1 + ΔL2 = 2f²FδL² / (f⁴ − F²δ²L²)

where ΔL denotes the depth of field, ΔL1 denotes the front depth of field, ΔL2 denotes the rear depth of field, F denotes the aperture value, f denotes the focal length, δ denotes the circle-of-confusion diameter, and L denotes the shooting distance.
5. The photographing method according to claim 1, wherein the step of distinguishing the subject and the background in the image captured by the first camera according to the depth information to obtain the image with the blurred spot having the aperture shape comprises:
distinguishing a main body and a background in an image shot by a first camera according to the depth of field value of the depth of field information;
and in the distinguished background, fusing the blurring light spots in the aperture shapes into the background to obtain images with the blurring light spots in the aperture shapes.
6. The photographing method according to claim 1, wherein the step of controlling the zoom of the grating blade in the aperture of the second camera to obtain the zoomed aperture shape when the object is photographed by the first camera further comprises:
when an object is shot through the first camera, judging whether the aperture is scalable;
and if the aperture is scalable, controlling the scaling of the grating blade in the aperture of the second camera so as to obtain the scaled aperture shape.
7. The photographing method according to claim 6, wherein after the step of determining whether the aperture is scalable when the object is photographed by the first camera, the method further comprises:
if the aperture is not scalable, displaying a selection interface for the user to select the shape of the blurring light spot;
when the selected shape is received in the selection interface, acquiring a pre-stored blurring light spot corresponding to the shape;
replacing the circular light spot formed by the current aperture with the pre-stored blurring light spot corresponding to the shape;
acquiring shooting data in a shooting process through a second camera, and calculating depth-of-field information according to the shooting data;
and distinguishing a main body and a background in the image shot by the first camera according to the depth of field information to obtain an image with the pre-stored blurring light spots corresponding to the shape.
8. The photographing method according to any one of claims 1 to 7, wherein a component is provided inside the lens cover of the second camera, the component being provided with a hole of a specific shape, the photographing method further comprising:
when an object is photographed through the first camera, controlling, through the aperture of the second camera, the amount of light passing through the lens, so as to form a blurring light spot of the specific shape;
acquiring shooting data in a shooting process through a second camera, and calculating depth-of-field information according to the shooting data;
and distinguishing a main body and a background in the image shot by the first camera according to the depth of field information to obtain an image with the blurring light spot with the specific shape.
9. A mobile terminal, characterized in that the mobile terminal comprises a memory, a processor and a photographing program stored on the memory and operable on the processor, the photographing program, when executed by the processor, implementing the steps of the photographing method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a photographing program which, when executed by a processor, implements the steps of the photographing method according to any one of claims 1 to 8.
CN201711455917.0A 2017-12-27 2017-12-27 Photographing method, mobile terminal and computer-readable storage medium Active CN108156378B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711455917.0A CN108156378B (en) 2017-12-27 2017-12-27 Photographing method, mobile terminal and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711455917.0A CN108156378B (en) 2017-12-27 2017-12-27 Photographing method, mobile terminal and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN108156378A CN108156378A (en) 2018-06-12
CN108156378B true CN108156378B (en) 2020-12-18

Family

ID=62463728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711455917.0A Active CN108156378B (en) 2017-12-27 2017-12-27 Photographing method, mobile terminal and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN108156378B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109559272A (en) * 2018-10-30 2019-04-02 深圳市商汤科技有限公司 A kind of image processing method and device, electronic equipment, storage medium
CN111311482A (en) * 2018-12-12 2020-06-19 Tcl集团股份有限公司 Background blurring method and device, terminal equipment and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009110137A (en) * 2007-10-29 2009-05-21 Ricoh Co Ltd Image processor, image processing method, and image processing program
US8810691B2 (en) * 2010-09-03 2014-08-19 Olympus Imaging Corp. Imaging apparatus, imaging method and computer-readable recording medium
CN104035263A (en) * 2013-03-08 2014-09-10 Jvc建伍株式会社 Light quantity adjusting device and imaging apparatus
CN203365877U (en) * 2013-06-09 2013-12-25 上海大学 Lens cover for shooting heart-shaped fuzzy light spot
CN104793910B (en) * 2014-01-20 2018-11-09 联想(北京)有限公司 A kind of method and electronic equipment of information processing
CN103945210B (en) * 2014-05-09 2015-08-05 长江水利委员会长江科学院 A kind of multi-cam image pickup method realizing shallow Deep Canvas
CN107016639A (en) * 2017-03-30 2017-08-04 努比亚技术有限公司 A kind of image processing method and device
CN107018331A (en) * 2017-04-19 2017-08-04 努比亚技术有限公司 A kind of imaging method and mobile terminal based on dual camera
CN107197146B (en) * 2017-05-31 2020-06-30 Oppo广东移动通信有限公司 Image processing method and device, mobile terminal and computer readable storage medium

Also Published As

Publication number Publication date
CN108156378A (en) 2018-06-12

Similar Documents

Publication Publication Date Title
CN107613191B (en) Photographing method, photographing equipment and computer readable storage medium
CN109361865B (en) Shooting method and terminal
KR101839569B1 (en) Method and terminal for acquiring panoramic image
CN108063859B (en) Automatic photographing control method, terminal and computer storage medium
CN107730460B (en) Image processing method and mobile terminal
CN108038825B (en) Image processing method and mobile terminal
CN107040723B (en) Imaging method based on double cameras, mobile terminal and storage medium
CN108156378B (en) Photographing method, mobile terminal and computer-readable storage medium
CN111182205B (en) Photographing method, electronic device, and medium
CN111601032A (en) Shooting method and device and electronic equipment
CN110913139A (en) Photographing method and electronic equipment
CN109639970B (en) Shooting method and terminal equipment
CN107707821B (en) Distortion parameter modeling method and device, correction method, terminal and storage medium
CN107633475B (en) Picture processing method, terminal and computer readable storage medium
CN110944114B (en) Photographing method and electronic equipment
CN111147752B (en) Zoom factor adjusting method, electronic device, and medium
CN110266957B (en) Image shooting method and mobile terminal
CN110881105B (en) Shooting method and electronic equipment
CN111314621A (en) Photographing method and electronic equipment
CN111050069A (en) Shooting method and electronic equipment
CN110913130A (en) Shooting method and electronic equipment
CN108156392B (en) Shooting method, terminal and computer readable storage medium
CN108600623B (en) Refocusing display method and terminal device
CN107613204B (en) Focusing area adjusting method, terminal and computer storage medium
CN110830721B (en) Image processing method, electronic device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant