CN112866564A - Photographing method, terminal and computer readable medium - Google Patents

Photographing method, terminal and computer readable medium

Info

Publication number
CN112866564A
CN112866564A (application CN202011633272.7A)
Authority
CN
China
Prior art keywords
image
photographing
current
silhouette
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011633272.7A
Other languages
Chinese (zh)
Other versions
CN112866564B (en)
Inventor
徐爱辉
崔小辉
李风光
王琦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN202011633272.7A priority Critical patent/CN112866564B/en
Publication of CN112866564A publication Critical patent/CN112866564A/en
Application granted granted Critical
Publication of CN112866564B publication Critical patent/CN112866564B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/62: Control of parameters via user interfaces
    • H04N 23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/632: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters, for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N 23/80: Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a photographing method, a terminal and a computer readable medium. The method comprises the following steps: when a photographing instruction is monitored, judging whether the conditions for photographing in a silhouette mode are satisfied; if so, executing the photographing instruction in the silhouette mode; if not, executing the photographing instruction normally. With this scheme, whenever a photographing instruction is monitored the terminal first checks whether the silhouette-mode conditions are satisfied and, if they are, executes the instruction in the silhouette mode, so that a silhouette image is captured directly and no post-shot retouching is required. This greatly improves the convenience of obtaining silhouette images and effectively improves user satisfaction.

Description

Photographing method, terminal and computer readable medium
Technical Field
The present invention relates to photographing technology, and more particularly to a photographing method, a terminal and a computer readable medium capable of directly capturing an image in a silhouette mode.
Background
With the popularization of mobile terminals such as smart phones, user demands based on the mobile terminals are also increasingly diversified. For example, there is an increasing demand for a photographing function of a mobile terminal, specifically, for example, capturing an image with a silhouette effect.
However, an existing mobile terminal can only obtain an image with a silhouette effect by retouching a conventionally captured image afterwards; the operation is cumbersome and user satisfaction is low.
Disclosure of Invention
The main object of the present invention is to provide a photographing method, a terminal and a computer readable medium that allow an image with a silhouette effect to be captured directly.
In order to achieve the above object, the present invention provides a photographing method, comprising the steps of:
when a photographing instruction is monitored, judging whether a condition of photographing in a silhouette mode is met;
if yes, executing the photographing instruction in the silhouette mode;
and if not, executing the photographing instruction.
Optionally, the step of determining whether the condition for taking a picture in the silhouette mode is satisfied includes:
judging whether the current viewing picture contains an image to be subjected to silhouette processing;
if yes, judging that the condition of taking a picture in a silhouette mode is met;
if not, the condition for photographing in the silhouette mode is determined to be not satisfied.
Optionally, the step of determining whether the current viewfinder frame includes an image to be silhouetted includes:
judging whether the current viewfinder picture contains images of people and/or animals;
if yes, judging that the current framing picture contains an image to be subjected to silhouette processing;
and if not, judging that the current framing picture does not contain the image to be subjected to silhouette processing.
Optionally, after the step of determining that the current viewfinder frame includes an image to be subjected to silhouette processing, the method further includes:
judging whether the current viewing picture contains the image of the sea or not;
if yes, judging that the condition of taking a picture in a silhouette mode is met;
if not, the condition for photographing in the silhouette mode is determined to be not satisfied.
Optionally, the step of determining whether the current viewfinder frame includes an image of the sea includes:
judging whether the current framing picture contains an image of the sea or not by an image identification method;
if yes, judging whether the current framing picture contains a coastline or not by a straight line detection method;
if yes, the current framing picture is judged to contain the image of the sea.
Optionally, after the step of determining that the current viewfinder frame includes an image to be subjected to silhouette processing, the method further includes:
judging whether the current viewing picture contains an image of sunrise or sunset;
if yes, judging that the condition of taking a picture in a silhouette mode is met;
if not, the condition for photographing in the silhouette mode is determined to be not satisfied.
Optionally, the step of determining whether the current viewfinder frame includes an image of sunrise or sunset includes:
judging whether the current framing picture contains sunrise or sunset images through an image identification method;
if so, acquiring the current time, and judging whether the current time accords with the sunrise time or the sunset time;
if yes, the current framing picture is judged to contain the image of sunrise or sunset.
Optionally, the step of executing the photographing instruction in the silhouette mode includes:
acquiring the image to be subjected to silhouette processing in the current view-finding picture;
adjusting the brightness of the image to be subjected to silhouette processing to be 0;
and executing the photographing instruction.
In addition, the present invention further provides a terminal, which includes a memory, a processor, and an implementation program of the photographing method stored in the memory and operable on the processor, wherein the implementation program of the photographing method, when executed by the processor, implements the following steps:
when a photographing instruction is monitored, judging whether a condition of photographing in a silhouette mode is met;
if yes, executing the photographing instruction in the silhouette mode;
and if not, executing the photographing instruction.
Furthermore, the present invention also provides a computer readable medium, on which an implementation program of a photographing method is stored, the implementation program of the photographing method implementing the following steps when executed:
when a photographing instruction is monitored, judging whether a condition of photographing in a silhouette mode is met;
if yes, executing the photographing instruction in the silhouette mode;
and if not, executing the photographing instruction.
According to the technical scheme of the invention, when a photographing instruction is monitored, the terminal judges whether the conditions for photographing in the silhouette mode are satisfied and, when they are, executes the photographing instruction in the silhouette mode. A silhouette image is therefore captured directly, no post-shot retouching is required, the convenience of obtaining silhouette images is greatly improved, and user satisfaction is effectively improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from the structures shown in these drawings without creative effort.
Fig. 1 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention;
fig. 2 is a diagram of a communication network system architecture according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a photographing method according to an embodiment of the present invention;
FIG. 4 is a flowchart of one embodiment of the photographing method shown in FIG. 3;
FIG. 5 is a flowchart of one embodiment of the step of executing the photographing instruction in the silhouette mode shown in FIG. 4;
FIG. 6 is a flowchart illustrating a first embodiment of the photographing method shown in FIG. 4;
FIG. 7 is a flowchart illustrating a second embodiment of the photographing method shown in FIG. 4;
FIG. 8 is a flowchart illustrating an embodiment of the step of determining whether the current viewfinder frame contains a sea image shown in FIG. 7;
FIG. 9 is a flowchart illustrating a third embodiment of the photographing method shown in FIG. 4;
FIG. 10 is a flowchart illustrating an embodiment of the step of determining whether the current viewfinder frame contains sunrise or sunset images shown in FIG. 9;
fig. 11 is a flowchart of a fourth embodiment of the photographing method shown in fig. 4.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning by themselves. Thus, "module", "component" and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and a fixed terminal such as a Digital TV, a desktop computer, and the like.
The following description takes a mobile terminal as an example, and it will be understood by those skilled in the art that, except for elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to fixed terminals.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
The radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call; specifically, it receives downlink information of a base station and forwards it to the processor 110 for processing, and transmits uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplex Long Term Evolution) and TDD-LTE (Time Division Duplex Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user to send and receive e-mails, browse web pages, access streaming media and the like, providing the user with wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is understood that it is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The a/V input unit 104 is used to receive audio or video signals. The a/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processing unit 1041 processes image data of still images or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode; the processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode or the like, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations performed by the user on or near it (for example, operations performed on or near the touch panel 1071 with a finger, a stylus or any other suitable object or accessory) and drive the corresponding connection device according to a predetermined program. The touch panel 1071 may include a touch detection device and a touch controller. The touch detection device detects the position of the user's touch and the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and can receive and execute commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys), a trackball, a mouse, a joystick and the like.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present invention, a communication network system on which the mobile terminal of the present invention is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication Network system according to an embodiment of the present invention, where the communication Network system is an LTE system of a universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are in communication connection in sequence.
Specifically, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Among them, the eNodeB2021 may be connected with other eNodeB2022 through backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving gateway) 2034, a PGW (PDN gateway) 2035, and a PCRF (Policy and Charging Rules Function) 2036, and the like. The MME2031 is a control node that handles signaling between the UE201 and the EPC203, and provides bearer and connection management. HSS2032 is used to provide registers to manage functions such as home location register (not shown) and holds subscriber specific information about service characteristics, data rates, etc. All user data may be sent through SGW2034, PGW2035 may provide IP address assignment for UE201 and other functions, and PCRF2036 is a policy and charging control policy decision point for traffic data flow and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present invention is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above mobile terminal hardware structure and communication network system, the present invention provides various embodiments of the method.
As shown in fig. 3, fig. 3 is a flowchart of a photographing method according to an embodiment of the invention.
In this embodiment, the photographing method includes the following steps:
step S310, a photographing instruction is monitored.
Step S320, judging whether the conditions for taking a picture in a silhouette mode are met; if yes, go to step S330; if not, step S340 is performed.
Step S330, executing the photographing instruction in the silhouette mode;
in step S340, the photographing instruction is executed.
The technical scheme of the embodiment is mainly applied to a mobile terminal (such as a smart phone, a tablet computer and the like) and is used for realizing the photographing operation in the silhouette mode of the mobile terminal.
In this embodiment, the user starts the photographing function of the mobile terminal, and the mobile terminal displays the current framing picture on the display interface. When the user presses a photographing key (which may be a physical key or a virtual key) or triggers a photographing instruction in another manner, the prior art would directly execute the photographing instruction to complete the photographing operation. In this embodiment, when the user triggers the photographing instruction, a judging step is performed first to determine whether the conditions for photographing in the silhouette mode are satisfied. When the conditions are satisfied, the silhouette mode is started and the photographing instruction is executed in the silhouette mode to capture a silhouette-mode image. When the conditions are not satisfied, the photographing instruction is executed directly in the conventional manner to capture an image in the normal mode.
Specifically, in this embodiment, whether the conditions for photographing in the silhouette mode are satisfied may be determined by judging whether the current framing picture contains a preset image, such as an image of a person and/or an animal, or by judging whether a trigger instruction for enabling the silhouette mode has been received, for example the user selecting the silhouette mode on the operation interface.
The photographing instruction may be executed in the silhouette mode as follows: acquiring the image of the person and/or the animal from the current framing picture (for example by image recognition and extraction), adjusting the brightness of that image to 0, and then executing the photographing instruction.
According to this technical scheme, when a photographing instruction is monitored, the terminal judges whether the conditions for photographing in the silhouette mode are satisfied and, when they are, executes the photographing instruction in the silhouette mode to capture a silhouette-mode image. The silhouette image is therefore captured directly, no post-shot retouching is required, the convenience of obtaining silhouette images is greatly improved, and user satisfaction is effectively improved.
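To make the branching of fig. 3 concrete, a minimal Python sketch follows. The helper names (meets_silhouette_conditions, capture_silhouette, capture_normal) are hypothetical stand-ins for the terminal's internal routines and are not taken from the patent; the placeholder bodies are refined by the embodiment sketches further below.

```python
def meets_silhouette_conditions(frame_bgr):
    """Placeholder for step S320; the embodiments below refine this into
    person/animal, sea and sunrise/sunset checks."""
    return False  # assume the ordinary path unless a real check is plugged in

def capture_silhouette(frame_bgr):
    """Placeholder for step S330; fig. 5 sketches the silhouette capture."""
    return frame_bgr

def capture_normal(frame_bgr):
    """Placeholder for step S340: the ordinary capture path."""
    return frame_bgr

def on_photo_instruction(frame_bgr):
    """Steps S310-S340: when a photographing instruction is monitored,
    test the silhouette-mode conditions and branch accordingly."""
    if meets_silhouette_conditions(frame_bgr):   # step S320: conditions met?
        return capture_silhouette(frame_bgr)     # step S330: silhouette mode
    return capture_normal(frame_bgr)             # step S340: normal mode
```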
Referring to fig. 4, fig. 4 is a flowchart illustrating an embodiment of the photographing method shown in fig. 3. In this embodiment, the photographing method includes:
step S410, a photographing instruction is monitored.
Step S420, judging whether the current framing picture contains an image to be subjected to silhouette processing; if yes, go to step S430; if not, step S440 is performed.
Step S430, executing the photographing instruction in the silhouette mode.
In step S440, the photographing instruction is executed.
In this embodiment, the user starts the photographing function of the mobile terminal, and the mobile terminal displays the current framing picture on the display interface. When the user presses a photographing key (which may be a physical key or a virtual key) or triggers a photographing instruction in another manner, the prior art would directly execute the photographing instruction to complete the photographing operation. In this embodiment, when the user triggers the photographing instruction, a judging step is performed first to determine whether the current framing picture contains an image to be subjected to silhouette processing. When it does, the silhouette mode is started and the photographing instruction is executed in the silhouette mode to capture a silhouette-mode image. When it does not, the photographing instruction is executed directly in the conventional manner to capture an image in the normal mode.
According to this technical scheme, when a photographing instruction is monitored, the terminal first judges whether the current framing picture contains an image to be subjected to silhouette processing and, when it does, executes the photographing instruction in the silhouette mode to capture a silhouette-mode image. The silhouette image is therefore captured directly, no post-shot retouching is required, the convenience of obtaining silhouette images is greatly improved, and user satisfaction is effectively improved.
Further, as shown in fig. 5, fig. 5 is a flowchart of an embodiment of the step of executing the photographing instruction in the silhouette mode shown in fig. 4.
In this embodiment, the step of executing the photographing instruction in the silhouette mode includes:
in step S431, the image to be silhouetted is acquired in the current finder screen.
And step S432, adjusting the brightness of the image to be subjected to silhouette processing to be 0.
And step S433, executing the photographing instruction.
In this embodiment, image recognition is used to identify the image to be subjected to silhouette processing in the current framing picture; the brightness parameter at the position of that image is then adjusted to 0, so that the image becomes a silhouette; finally, the photographing instruction is executed, so that a silhouette-mode image is captured directly.
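A rough illustration of steps S431-S433, assuming OpenCV is available (the patent does not name any library): given a binary mask marking the region to be silhouetted, produced by whatever recognition step the terminal uses, the masked pixels are driven to brightness 0 and the frame is saved. The mask argument and the output file name are assumptions made for this sketch.

```python
import cv2

def capture_in_silhouette_mode(frame_bgr, subject_mask):
    """Steps S431-S433 sketched: zero the brightness of the region to be
    silhouetted, then 'execute the photographing instruction' (here, save)."""
    result = frame_bgr.copy()
    result[subject_mask > 0] = 0                 # brightness 0 -> pure black silhouette
    cv2.imwrite("silhouette_shot.jpg", result)   # stand-in for the actual capture step
    return result
```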
Further, as shown in fig. 6, fig. 6 is a flowchart of the first embodiment of the photographing method shown in fig. 4. In this embodiment, the photographing method includes:
step S510, a photographing instruction is monitored.
Step S520, judging whether the current framing picture contains images of people and/or animals; if so, go to step S530; if not, step S540 is performed.
Step S530, executing the photographing instruction in the silhouette mode;
in step S540, the photographing instruction is executed.
In this embodiment, the user starts the photographing function of the mobile terminal, and the mobile terminal displays the current framing picture on the display interface. When the user presses a photographing key (which may be a physical key or a virtual key) or triggers a photographing instruction in another manner, the prior art would directly execute the photographing instruction to complete the photographing operation. In this embodiment, when the user triggers the photographing instruction, a judging step is performed first to determine whether the current framing picture contains an image of a person and/or an animal. When it does, the silhouette mode is started and the photographing instruction is executed in the silhouette mode to capture a silhouette-mode image. When it does not, the photographing instruction is executed directly in the conventional manner to capture an image in the normal mode.
According to this technical scheme, when a photographing instruction is monitored, the terminal first judges whether the current framing picture contains an image of a person and/or an animal and, when it does, executes the photographing instruction in the silhouette mode to capture a silhouette-mode image. The silhouette image is therefore captured directly, no post-shot retouching is required, the convenience of obtaining silhouette images is greatly improved, and user satisfaction is effectively improved.
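As a rough stand-in for step S520, the sketch below uses OpenCV's stock HOG pedestrian detector to decide whether the framing picture contains a person. The patent does not specify the recognizer; a DNN-based person/animal detector could equally be used, and the detector choice and parameters here are assumptions.

```python
import cv2

def frame_contains_person(frame_bgr):
    """Return True if the built-in HOG pedestrian detector finds at least one
    person in the frame (approximating the person/animal check of step S520)."""
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    boxes, _ = hog.detectMultiScale(frame_bgr, winStride=(8, 8))
    return len(boxes) > 0
```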
Further, as shown in fig. 7, fig. 7 is a flowchart of a second embodiment of the photographing method shown in fig. 4. In this embodiment, the photographing method includes:
step S610, a photographing instruction is monitored.
Step S620, judging whether the current framing picture contains images of people and/or animals; if so, go to step S630; if not, step S650 is performed.
Step S630, determining whether the current finder screen includes an image of the sea; if so, go to step S640; if not, step S650 is performed.
Step S640, executing the photographing instruction in the silhouette mode;
in step S650, the photographing instruction is executed.
In this embodiment, the user starts the photographing function of the mobile terminal, and the mobile terminal displays the current framing picture on the display interface. When the user presses a photographing key (which may be a physical key or a virtual key) or triggers a photographing instruction in another manner, the prior art would directly execute the photographing instruction to complete the photographing operation. In this embodiment, when the user triggers the photographing instruction, a judging step is performed first to determine whether the current framing picture contains an image of a person and/or an animal. When it does, it is further judged whether the current framing picture contains an image of the sea. When the current framing picture contains both an image of a person and/or an animal and an image of the sea, the silhouette mode is started and the photographing instruction is executed in the silhouette mode to capture a silhouette-mode image. When either image is missing, the photographing instruction is executed directly in the conventional manner to capture an image in the normal mode.
According to this technical scheme, when a photographing instruction is monitored, the terminal judges whether the current framing picture contains both an image of a person and/or an animal and an image of the sea and, when it does, executes the photographing instruction in the silhouette mode to capture a silhouette-mode image. The silhouette image is therefore captured directly, no post-shot retouching is required, the convenience of obtaining silhouette images is greatly improved, and user satisfaction is effectively improved.
It should be noted that, in the present embodiment, the sequence of the step S620 and the step S630 may be interchanged without affecting the technical effect of the present embodiment.
Further, as shown in fig. 8, fig. 8 is a flowchart of an embodiment of the step of determining whether the current viewfinder frame includes the image of the sea shown in fig. 7.
In this embodiment, the step of determining whether the current finder screen includes an image of the sea includes:
step S631, determining whether the current finder screen includes an image of the sea or not by an image recognition method; if so, go to step S632; if not, step S634 is performed.
Step S632 of determining whether the current finder screen includes a coastline or not by a straight line detection method; if so, go to step S633; if not, step S634 is performed.
In step S633, it is determined that the current finder screen includes an image of the sea.
In step S634, it is determined that the current finder screen does not include an image of the sea.
Specifically, in this embodiment, image recognition is first performed on the current framing picture to determine whether it contains an image of the sea. After the image of the sea has been recognized, in order to ensure recognition accuracy, a straight line detection method is further used to check whether the recognized sea image contains the straight line of a coastline. Only when the image of the sea is recognized by image recognition and the coastline is detected by straight line detection is the current framing picture judged to contain an image of the sea.
According to this technical scheme, the sea image is recognized by combining image recognition with straight line detection, which effectively improves the accuracy of sea-image recognition and avoids false recognition.
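One way to realize the straight-line check of steps S631-S634 is sketched below: after the sea has been recognized, Canny edges plus a probabilistic Hough transform look for a long, nearly horizontal line that can serve as the coastline. The edge thresholds, minimum length and tilt tolerance are illustrative assumptions.

```python
import cv2
import numpy as np

def contains_coastline(frame_bgr, max_tilt_deg=10):
    """Return True if a long, nearly horizontal straight line (candidate
    coastline/horizon) is found in the framing picture."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    min_len = int(frame_bgr.shape[1] * 0.6)        # must span most of the frame width
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 100,
                            minLineLength=min_len, maxLineGap=20)
    if lines is None:
        return False
    for x1, y1, x2, y2 in lines[:, 0]:
        tilt = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
        if tilt < max_tilt_deg or tilt > 180 - max_tilt_deg:
            return True                            # nearly horizontal line found
    return False
```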
Further, as shown in fig. 9, fig. 9 is a flowchart of a third embodiment of the photographing method shown in fig. 4. In this embodiment, the photographing method includes:
step S710, a photographing instruction is monitored.
Step S720, judging whether the current framing picture contains images of people and/or animals; if yes, go to step S730; if not, step S750 is performed.
Step S730, judging whether the current framing picture contains an image of sunrise or sunset; if yes, go to step S740; if not, step S750 is performed.
Step S740, executing the photographing instruction in the silhouette mode;
in step S750, the photographing instruction is executed.
In this embodiment, the user starts the photographing function of the mobile terminal, and the mobile terminal displays the current framing picture on the display interface. When the user presses a photographing key (which may be a physical key or a virtual key) or triggers a photographing instruction in another manner, the prior art would directly execute the photographing instruction to complete the photographing operation. In this embodiment, when the user triggers the photographing instruction, a judging step is performed first to determine whether the current framing picture contains an image of a person and/or an animal. When it does, it is further judged whether the current framing picture contains an image of sunrise or sunset. When the current framing picture contains both an image of a person and/or an animal and an image of sunrise or sunset, the silhouette mode is started and the photographing instruction is executed in the silhouette mode to capture a silhouette-mode image. When either image is missing, the photographing instruction is executed directly in the conventional manner to capture an image in the normal mode.
According to this technical scheme, when a photographing instruction is monitored, the terminal judges whether the current framing picture contains both an image of a person and/or an animal and an image of sunrise or sunset and, when it does, executes the photographing instruction in the silhouette mode to capture a silhouette-mode image. The silhouette image is therefore captured directly, no post-shot retouching is required, the convenience of obtaining silhouette images is greatly improved, and user satisfaction is effectively improved.
It should be noted that, in the present embodiment, the sequence of the step S720 and the step S730 can be interchanged without affecting the technical effect of the present embodiment.
Further, as shown in fig. 10, fig. 10 is a flowchart of an embodiment of the step of determining whether the current viewfinder frame includes sunrise or sunset images shown in fig. 9.
In this embodiment, the step of determining whether the current framing picture contains an image of sunrise or sunset includes:
step S731 of determining whether or not the current finder screen includes an image of sunrise or sunset by an image recognition method; if so, go to step S732; if not, step S735 is executed.
In step S732, the current time is acquired.
Step S733, judging whether the current time accords with the sunrise time or the sunset time; if yes, go to step S734; if not, step S735 is executed.
In step S734, it is determined that the current finder screen includes an image of sunrise or sunset.
In step S735, it is determined that the current finder screen does not include an image of sunrise or sunset.
Specifically, in this embodiment, image recognition is first performed on the current framing picture to determine whether it contains an image of sunrise or sunset. After the sunrise or sunset image has been recognized, in order to ensure recognition accuracy, the current time of the mobile terminal is acquired and it is judged whether the current time corresponds to the time of sunrise or sunset. Only when the sunrise or sunset image is recognized by image recognition and the current time matches the sunrise or sunset time is the current framing picture judged to contain an image of sunrise or sunset.
According to this technical scheme, the sunrise or sunset image is recognized by combining image recognition with a time comparison (comparing the current time with the time at which sunrise or sunset occurs), which effectively improves the accuracy of sunrise/sunset recognition and avoids false recognition.
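The time comparison of step S733 can be sketched as below. The sunrise and sunset windows are fixed, illustrative assumptions; an actual terminal would derive them from its location and date (for example from an ephemeris library or a network service), which the patent does not prescribe.

```python
from datetime import datetime, time

# Illustrative windows only; real sunrise/sunset times depend on location and date.
SUNRISE_WINDOW = (time(5, 0), time(7, 30))
SUNSET_WINDOW = (time(17, 0), time(19, 30))

def time_matches_sunrise_or_sunset(now=None):
    """Return True if the current clock time falls inside an assumed sunrise
    or sunset window (approximating step S733)."""
    t = (now or datetime.now()).time()
    return (SUNRISE_WINDOW[0] <= t <= SUNRISE_WINDOW[1]
            or SUNSET_WINDOW[0] <= t <= SUNSET_WINDOW[1])
```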
Further, as shown in fig. 11, fig. 11 is a flowchart of a fourth embodiment of the photographing method shown in fig. 4. In this embodiment, the photographing method includes:
step S810, monitoring a photographing instruction.
Step S820, judging whether the current framing picture contains an image of a person and/or an animal; if yes, go to step S830; if not, step S860 is performed.
Step S830, judging whether the current framing picture contains an image of the sea; if so, go to step S840; if not, step S860 is performed.
Step S840, judging whether the current framing picture contains an image of sunrise or sunset; if yes, go to step S850; if not, step S860 is performed.
Step S850, executing the photographing instruction in the silhouette mode;
in step S860, the photographing instruction is executed.
In this embodiment, the user starts the photographing function of the mobile terminal, and the mobile terminal displays the current framing picture on the display interface. When the user presses a photographing key (which may be a physical key or a virtual key) or triggers a photographing instruction in another manner, the prior art would directly execute the photographing instruction to complete the photographing operation. In this embodiment, when the user triggers the photographing instruction, it is first judged whether the current framing picture contains an image of a person and/or an animal. When it does, it is further judged whether the current framing picture contains an image of the sea, and when it does, it is further judged whether the current framing picture contains an image of sunrise or sunset. Only when the current framing picture contains an image of a person and/or an animal, an image of the sea, and an image of sunrise or sunset is the silhouette mode started and the photographing instruction executed in the silhouette mode to capture a silhouette-mode image. The specific steps for judging whether the current framing picture contains an image of the sea may refer to the steps shown in fig. 8, and the specific steps for judging whether it contains an image of sunrise or sunset may refer to the steps shown in fig. 10; they are not described again here.
According to this technical scheme, when a photographing instruction is monitored, the terminal judges whether the current framing picture simultaneously contains an image of a person and/or an animal, an image of the sea and an image of sunrise or sunset and, when it does, executes the photographing instruction in the silhouette mode to capture a silhouette-mode image. The silhouette image is therefore captured directly, no post-shot retouching is required, the convenience of obtaining silhouette images is greatly improved, and user satisfaction is effectively improved.
It should be noted that, in the present embodiment, the sequence of the step S820, the step S830, and the step S840 may be interchanged without affecting the technical effect of the present embodiment.
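Tying the fourth embodiment together, a combined condition check might look like the sketch below, reusing the hypothetical helpers from the earlier sketches (frame_contains_person, contains_coastline, time_matches_sunrise_or_sunset). The sea and sunrise/sunset image-recognition steps themselves are not reproduced here, only their confirmation checks; as noted above, the order of the three tests can be changed freely.

```python
def meets_silhouette_conditions(frame_bgr):
    """Fourth-embodiment sketch (fig. 11): enable silhouette mode only when the
    frame contains a person/animal, a sea with a visible coastline, and the
    current time is plausible for sunrise or sunset."""
    return (frame_contains_person(frame_bgr)
            and contains_coastline(frame_bgr)
            and time_matches_sunrise_or_sunset())
```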
The invention also provides a terminal, which comprises a memory, a processor and an implementation program of the photographing method stored in the memory and executable on the processor; when executed by the processor, the implementation program implements all the steps of the photographing method embodiments described above. Since the terminal can execute all the steps of any of the above embodiments, it has at least all the beneficial effects brought by the technical solutions of the above method embodiments, and details are not repeated here.
The present invention further provides a computer readable medium, wherein an implementation program of the photographing method is stored on the computer readable medium, and when the implementation program of the photographing method is executed, all the steps in any of the above embodiments can be implemented. Since the computer-readable medium can execute all the steps in any of the above embodiments, the computer-readable medium at least has all the advantages brought by the technical solutions of the above method embodiments, and details are not repeated herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A method of taking a picture, comprising the steps of:
when a photographing instruction is monitored, judging whether a condition of photographing in a silhouette mode is met;
if yes, executing the photographing instruction in the silhouette mode;
and if not, executing the photographing instruction.
2. The photographing method according to claim 1, wherein the step of determining whether the condition for photographing in the silhouette mode is satisfied comprises:
judging whether the current viewing picture contains an image to be subjected to silhouette processing;
if yes, judging that the condition of taking a picture in a silhouette mode is met;
if not, the condition for photographing in the silhouette mode is determined to be not satisfied.
3. A photographing method as defined in claim 2, wherein the step of determining whether the current viewfinder picture contains an image to be subjected to silhouette processing comprises:
judging whether the current viewfinder picture contains images of people and/or animals;
if yes, judging that the current framing picture contains an image to be subjected to silhouette processing;
and if not, judging that the current framing picture does not contain the image to be subjected to silhouette processing.
4. A photographing method as defined in claim 3, wherein after the step of determining that the current viewfinder picture contains an image to be subjected to silhouette processing, the method further comprises:
judging whether the current viewing picture contains the image of the sea or not;
if yes, judging that the condition of taking a picture in a silhouette mode is met;
if not, the condition for photographing in the silhouette mode is determined to be not satisfied.
5. The photographing method according to claim 4, wherein the step of determining whether the current finder picture includes an image of the sea comprises:
judging whether the current framing picture contains an image of the sea or not by an image identification method;
if yes, judging whether the current framing picture contains a coastline or not by a straight line detection method;
if yes, the current framing picture is judged to contain the image of the sea.
6. A photographing method as defined in claim 3, wherein after the step of determining that the current viewfinder picture contains an image to be subjected to silhouette processing, the method further comprises:
judging whether the current viewing picture contains an image of sunrise or sunset;
if yes, judging that the condition of taking a picture in a silhouette mode is met;
if not, the condition for photographing in the silhouette mode is determined to be not satisfied.
7. A photographing method as defined in claim 6, wherein the step of determining whether the current finder picture includes an image of sunrise or sunset includes:
judging whether the current framing picture contains sunrise or sunset images through an image identification method;
if so, acquiring the current time, and judging whether the current time accords with the sunrise time or the sunset time;
if yes, the current framing picture is judged to contain the image of sunrise or sunset.
8. The photographing method of claim 2, wherein the step of executing the photographing instruction in the silhouette mode comprises:
acquiring the image to be subjected to silhouette processing in the current view-finding picture;
adjusting the brightness of the image to be subjected to silhouette processing to be 0;
and executing the photographing instruction.
9. A terminal, characterized by comprising a memory, a processor and a program for implementing the photographing method stored on the memory and operable on the processor, wherein the program for implementing the photographing method, when executed by the processor, implements the steps of the photographing method according to any one of claims 1 to 8.
10. A computer-readable medium, characterized in that the computer-readable medium has stored thereon a program for implementing the photographing method, which when executed implements the steps of the photographing method according to any one of claims 1 to 8.
CN202011633272.7A 2020-12-31 2020-12-31 Photographing method, terminal and computer readable medium Active CN112866564B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011633272.7A CN112866564B (en) 2020-12-31 2020-12-31 Photographing method, terminal and computer readable medium

Publications (2)

Publication Number Publication Date
CN112866564A (en) 2021-05-28
CN112866564B CN112866564B (en) 2024-03-22

Family

ID=76000026

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011633272.7A Active CN112866564B (en) 2020-12-31 2020-12-31 Photographing method, terminal and computer readable medium

Country Status (1)

Country Link
CN (1) CN112866564B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05232570A (en) * 1992-02-21 1993-09-10 Nikon Corp Setting display device for camera and setting device for camera
CN103747184A (en) * 2014-01-24 2014-04-23 惠州Tcl移动通信有限公司 Method and system for automatically switching shooting scene mode by mobile terminal
CN104202524A (en) * 2014-09-02 2014-12-10 三星电子(中国)研发中心 Method and device for backlight filming
CN107959795A (en) * 2017-11-30 2018-04-24 努比亚技术有限公司 A kind of information collecting method, equipment and computer-readable recording medium
CN108156381A (en) * 2017-12-28 2018-06-12 北京小米移动软件有限公司 Photographic method and device
JP2020091745A (en) * 2018-12-06 2020-06-11 凸版印刷株式会社 Imaging support device and imaging support method

Also Published As

Publication number Publication date
CN112866564B (en) 2024-03-22

Legal Events

Code  Description
PB01  Publication
SE01  Entry into force of request for substantive examination
GR01  Patent grant