CN115623323A - Shooting method and electronic equipment - Google Patents

Shooting method and electronic equipment

Info

Publication number
CN115623323A
Authority
CN
China
Prior art keywords
shooting
electronic device
user
weather
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211386048.1A
Other languages
Chinese (zh)
Inventor
邢一博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211386048.1A
Publication of CN115623323A

Landscapes

  • Telephone Function (AREA)

Abstract

An embodiment of the present application provides a shooting method and an electronic device. The method can be applied to an electronic device including a camera and includes: determining the weather scene category of an image to be captured in a preview interface; presenting a plurality of shooting effect options under that weather scene category; determining, according to a selection operation on a target shooting effect option among the plurality of shooting effect options, the preset shooting parameters corresponding to the target option; and, in response to a trigger operation on the shooting control, capturing an image with those shooting parameters. Because the electronic device determines the preset shooting parameters from the user's selection of the target shooting effect option, the shooting parameters are adjusted automatically and the user does not need to manually adjust exposure in a professional mode. This reduces the complex operations the user performs during shooting and lowers the threshold for photographic creation.

Description

Shooting method and electronic equipment
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a shooting method and an electronic device.
Background
At present, electronic devices such as mobile phones are equipped with cameras, and a user can capture images through the camera, for example, photographing a person or a landscape.
In a specific weather scene, some users want to express an emotion by photographing the state of the weather elements in that scene. For example, in a rainy scene, a user may want to convey the mood of the rain by capturing rain as streaks or as frozen raindrops; in a snowy scene, a user may want to convey the mood of the snow by capturing snowflakes or snow streaks.
However, because weather elements fall quickly, the ordinary photographing mode of electronic devices such as smartphones cannot currently capture them well, so the user has to switch to a professional mode and manually adjust parameters such as exposure time in order to photograph the weather elements.
Disclosure of Invention
The shooting method and electronic device provided by the present application solve the following problems: during shooting, the user must manually adjust parameters, the operation is complex, a certain amount of photographic experience is required, and the threshold for photographic creation is therefore high.
To this end, the present application adopts the following technical solutions:
In a first aspect, the present application provides a shooting method that can be applied to an electronic device including a camera. First, the weather scene category of the image to be captured in the preview interface is determined, for example, a rainy day or a snowy day. Then, a plurality of shooting effect options under that weather scene category are presented, from which the user can freely choose a target shooting effect option according to need. The preset shooting parameters corresponding to the target shooting effect option are determined according to the selection operation on that option, and if it is determined that the user triggers the shooting control, shooting is performed with the determined shooting parameters.
In some embodiments, the shooting parameters may include an exposure time. The electronic device may preset a correspondence between the plurality of shooting effect options and exposure times, and may then determine the preset exposure time corresponding to a target shooting effect option according to the selection operation on that option.
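As a minimal sketch of such a correspondence (the option names and exposure values below are illustrative assumptions for this editorial example, not figures from the application):

```python
# Hypothetical option-to-exposure lookup; names and values are assumptions.
EXPOSURE_PRESETS = {
    "raindrops": 1 / 1000,   # short exposure freezes falling drops (seconds)
    "rain_streaks": 1 / 30,  # longer exposure blurs drops into streaks
    "snowflakes": 1 / 500,
    "snow_streaks": 1 / 15,
}

def exposure_for_option(option: str) -> float:
    """Return the preset exposure time (in seconds) for a shooting effect option."""
    try:
        return EXPOSURE_PRESETS[option]
    except KeyError:
        raise ValueError(f"no preset exposure for option: {option!r}")
```

The one design decision the claim fixes is that the user's single selection, not a manual control, resolves to a concrete exposure time.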
It should be noted that the longer the exposure time, the more light enters the lens and the brighter the corresponding frame. To prevent an overly long exposure time from making the picture too bright, the sensitivity may be adjusted according to the exposure time. In this case the shooting parameters may include both exposure time and sensitivity: the electronic device determines the exposure time from the target shooting effect option and, based on a preset adjustment relation between exposure time and sensitivity, automatically lowers the sensitivity as the exposure time increases. The user therefore does not need to adjust the sensitivity manually, which reduces user operations.
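One simple adjustment relation of the kind described (an assumption for illustration; the application does not specify the formula) is to hold the product of exposure time and ISO roughly constant at a fixed aperture, clamped to the sensor's sensitivity range:

```python
def compensated_iso(base_iso: float, base_exposure: float,
                    new_exposure: float,
                    iso_min: float = 50, iso_max: float = 6400) -> float:
    """Lower the sensitivity as exposure time grows so total exposure
    (roughly exposure_time * ISO at a fixed aperture) stays constant.
    The clamp bounds are illustrative sensor limits, not patent values."""
    iso = base_iso * base_exposure / new_exposure
    return max(iso_min, min(iso_max, iso))
```

For example, metering at ISO 400 and 1/250 s and then stretching the exposure to 1/125 s for streaks would halve the sensitivity to ISO 200.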
In some embodiments, after shooting with the target shooting parameters, the electronic device may obtain a first picture and perform segmentation on it to separate the background image from the weather elements. The electronic device may then adjust the brightness of the background according to a user operation, and may also automatically adjust the contrast between the background and the weather elements according to a contrast algorithm, to obtain a second picture.
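A toy sketch of that post-capture step, assuming a grayscale frame and a precomputed weather-element mask (a real implementation would obtain the mask from a segmentation model; the gain values here are illustrative assumptions):

```python
def adjust_photo(pixels, mask, background_gain=1.2, element_gain=0.9):
    """Return a new frame with the background brightened and the weather
    elements slightly darkened, increasing their mutual contrast.
    `pixels` and `mask` are equally sized 2-D lists; mask[r][c] is True
    where a weather element (e.g. a raindrop) was detected."""
    out = []
    for row_px, row_mask in zip(pixels, mask):
        out_row = []
        for value, is_element in zip(row_px, row_mask):
            gain = element_gain if is_element else background_gain
            out_row.append(min(255, max(0, round(value * gain))))
        out.append(out_row)
    return out
```

The per-group gains stand in for the user-driven brightness adjustment and the automatic contrast algorithm the text mentions.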
In some embodiments, different filters may be preset for different shooting effect options. The preset filter corresponding to the target shooting effect option may be invoked according to the user's selection of that option among the plurality of shooting effect options, and in response to a trigger operation on the shooting control, the electronic device may shoot using both the shooting parameters and the filter corresponding to the target option. The user therefore does not need to pick a filter manually, which reduces user operations and helps the user create better photographic works.
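Bundling the preset filter with the preset exposure, so one selection configures the whole capture, could be sketched as follows (filter names and the preset table shape are assumptions):

```python
# Hypothetical filter table; names are illustrative, not from the patent.
OPTION_FILTERS = {
    "raindrops": "cool_tone",
    "rain_streaks": "soft_haze",
    "snowflakes": "bright_white",
}

def capture_config(option, exposure_presets, default_filter="none"):
    """Bundle the preset exposure time and filter for one effect option,
    so a single user selection configures the whole capture."""
    return {
        "option": option,
        "exposure_time": exposure_presets[option],
        "filter": OPTION_FILTERS.get(option, default_filter),
    }
```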
In other embodiments, to prevent weather elements such as raindrops, rain streaks, snowflakes, or snow streaks from obstructing the composition of the scene, the user may select a "remove weather elements" shooting effect option, and in response to that selection the electronic device may invoke an intelligent algorithm to remove the weather elements from the first picture.
In some embodiments, determining the weather scene category of the image to be captured in the preview interface may proceed as follows:
first weather information is obtained, where the first weather information is the weather information for the geographic location of the electronic device at the moment the image is to be captured, and the weather scene category is determined based on it. On the one hand, the weather scene category of the image to be captured (for example, light rain or heavy rain) can be determined from the first weather information alone. On the other hand, a scene detection algorithm can detect the category of the scene in the image, and a comprehensive judgment can be made together with the first weather information, so that the time information and the user's geographic location assist the scene-category result and improve the accuracy of scene judgment.
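The two decision paths above can be sketched as a small fusion rule; the confidence threshold and the "prefer the detector when confident" policy are assumptions for illustration, not a rule stated in the application:

```python
def decide_weather_scene(report_category, detector_category=None,
                         detector_confidence=0.0, threshold=0.6):
    """Return the final weather scene category for the preview image.
    When the on-device scene detector is confident, prefer its visual
    evidence; otherwise fall back to the weather report for the device's
    location at the current time."""
    if detector_category is not None and detector_confidence >= threshold:
        return detector_category
    return report_category
```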
In some embodiments, when it is determined that the weather scene category is a rainy scene and that the weather element in that category is rain, the plurality of shooting effect options may be: an option to shoot raindrops and an option to shoot rain streaks.
In some embodiments, when it is determined that the weather scene category is a rainy scene and the weather element is rain, the plurality of shooting effect options may also be finer-grained: different raindrop shooting effect options under the raindrop option, or different rain-streak shooting effect options under the rain-streak option.
In a second aspect, the present application provides a shooting method that can be applied to an electronic device including a camera. The weather scene category of the image to be captured in the preview interface is first determined, for example, a rainy day or a snowy day. The level of the weather element in that category is then detected, for example, light rain, moderate rain, light snow, or heavy snow. Based on a preset correspondence between weather-element levels and weather-element atmosphere levels, the atmosphere level in the image to be captured is determined from the detected level; then, based on a preset correspondence between atmosphere levels and shooting parameters, the shooting parameters corresponding to that atmosphere level are determined. If it is determined that the user triggers the shooting control, shooting is performed with the determined parameters. In this way the corresponding atmosphere level is determined automatically by detecting the level of the weather element in the image, the user does not need to select a shooting effect option, and user operations are reduced.
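The second-aspect pipeline chains two preset correspondences; both lookup tables below are illustrative assumptions, not values from the application:

```python
# Detected element level -> atmosphere level (illustrative).
LEVEL_TO_ATMOSPHERE = {
    "light_rain": 1,
    "moderate_rain": 2,
    "heavy_rain": 3,
}

# Atmosphere level -> shooting parameters (illustrative).
ATMOSPHERE_TO_PARAMS = {
    1: {"exposure_time": 1 / 60, "iso": 400},
    2: {"exposure_time": 1 / 30, "iso": 200},
    3: {"exposure_time": 1 / 15, "iso": 100},
}

def params_for_element_level(level):
    """Map a detected weather-element level to shooting parameters,
    so the user does not have to pick an effect option at all."""
    atmosphere = LEVEL_TO_ATMOSPHERE[level]
    return ATMOSPHERE_TO_PARAMS[atmosphere]
```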
In a third aspect, the present application provides an electronic device, comprising: a camera, a processor, and a memory;
the camera is used for capturing a video stream;
one or more computer programs stored in the memory, the one or more computer programs comprising instructions; the instructions, when executed by the processor, cause the electronic device to perform the method of any of the first aspects.
In a fourth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method according to any of the first aspects.
In a fifth aspect, the present application provides a computer program product comprising instructions; the instructions, when executed by an electronic device, cause the electronic device to perform the method of any of the first aspects.
According to the technical scheme, the method has the following beneficial effects:
the application provides a shooting method and electronic equipment, wherein the method can be applied to the electronic equipment comprising a camera, and comprises the steps of determining the weather scene type of an image to be shot in a preview interface, presenting a plurality of shooting effect options under the weather scene type, and determining preset shooting parameters corresponding to target shooting effect options according to the selection operation of the target shooting effect options in the plurality of shooting effect options; the method comprises the steps that triggering operation of a shooting control is responded, shooting is carried out by utilizing shooting parameters, the shooting parameters corresponding to each shooting effect option in a plurality of shooting effect options can be preset, and the electronic equipment can determine the preset shooting parameters corresponding to the target shooting effect options according to selection operation of a user on the target shooting effect options in the plurality of shooting effect options, so that the electronic equipment can call the corresponding shooting parameters according to the selection operation of the user, automatic regulation of the shooting parameters is realized, manual regulation of exposure of the user in a professional mode is not needed, complex operation of the user in the shooting process can be reduced, and a threshold of shooting creation is reduced.
It should be appreciated that the description of technical features, solutions, benefits, or similar language in this application does not imply that all of the features and advantages may be realized in any single embodiment. Rather, it should be appreciated that any discussion of a feature or advantage is meant to encompass a particular feature, aspect, or advantage in at least one embodiment. Therefore, the descriptions of technical features, technical solutions or advantages in the present specification do not necessarily refer to the same embodiment. Furthermore, the technical features, technical solutions and advantages described in the present embodiments may also be combined in any suitable manner. One skilled in the relevant art will recognize that an embodiment may be practiced without one or more of the specific features, aspects, or advantages of a particular embodiment. In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments.
Drawings
Fig. 1 is a diagram illustrating an example of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a diagram illustrating a software structure of an electronic device according to an embodiment of the present disclosure;
fig. 3A is a schematic diagram illustrating a camera application opened by a user according to an embodiment of the present application;
fig. 3B is a schematic view of a camera interface preview provided in the embodiment of the present application;
fig. 3C is a schematic diagram of a countdown of the camera application in the photographing process according to an embodiment of the present application;
fig. 3D is a schematic diagram illustrating a photographing mode setting according to an embodiment of the present application;
fig. 3E is a schematic diagram of switching a photographing mode by a camera application according to an embodiment of the present application;
fig. 4A is a schematic view of a camera interface preview scene provided in an embodiment of the present application;
fig. 4B is a schematic diagram of detecting scene types in a photographing process according to an embodiment of the present application;
fig. 4C is a schematic diagram of an interaction interface displayed by a camera application according to an embodiment of the present application;
fig. 5 is a flowchart of a shooting method provided in an embodiment of the present application;
fig. 6 is a flowchart of another shooting method provided in the embodiment of the present application.
Detailed Description
The terms "first", "second" and "third", etc. in the description and claims of this application and the description of the drawings are used for distinguishing between different objects and not for limiting a particular order.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
For clarity and conciseness of the following descriptions of the various embodiments, a brief introduction to the related art is first given:
the mobile phone photographing mode can be divided into various modes, such as a portrait mode, a night view mode, a professional mode, and the like, wherein the professional mode refers to that a user needs to manually control and adjust parameters, and needs to switch the photographing mode of the mobile phone to the professional mode in a writing scene, but the professional mode requires the user to manually adjust some photographing parameters such as exposure time and sensitivity, and further requires the user to have a certain photographing experience.
In view of this, the shooting method and electronic device provided by the present application can determine the weather scene category of the image to be captured in the preview interface, present a plurality of shooting effect options under that category, and determine the preset shooting parameters corresponding to a target shooting effect option according to the selection operation on that option among the plurality of options. In response to a trigger operation on the shooting control, shooting is performed with those parameters. The electronic device thus determines the preset shooting parameters from the user's selection of the target shooting effect option, realizing automatic adjustment of the shooting parameters without the user manually adjusting exposure in a professional mode, which reduces the complex operations during shooting and lowers the threshold for photographic creation.
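The overall flow can be condensed into a few lines; every name and value here is an illustrative assumption for this editorial sketch, not the patent's implementation:

```python
def shooting_flow(scene_category, option_chosen, presets, shutter_pressed):
    """Determine scene -> present options -> resolve parameters -> shoot."""
    options = list(presets[scene_category])          # options for this scene
    if option_chosen not in options:
        raise ValueError("option not offered for this scene")
    params = presets[scene_category][option_chosen]  # preset parameters
    if shutter_pressed:                              # trigger on the control
        return {"captured": True, "params": params}
    return {"captured": False, "params": params}
```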
In some embodiments, the electronic device may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a wearable electronic device, a smart watch, or the like, and the specific form of the electronic device is not particularly limited in this application. In this embodiment, the structure of the electronic device may be as shown in fig. 1, where fig. 1 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
As shown in fig. 1, the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the electronic device. In other embodiments, an electronic device may include more or fewer components than illustrated, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be independent devices or may be integrated in one or more processors.
The controller can be a neural center and a command center of the electronic device. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device, and may also be used to transmit data between the electronic device and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the interface connection relationship between the modules illustrated in this embodiment is only an exemplary illustration, and does not constitute a limitation on the structure of the electronic device. In other embodiments of the present application, the electronic device may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in an electronic device may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the electronic device. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to electronic devices, including Wireless Local Area Networks (WLANs) (such as wireless fidelity (Wi-Fi) networks), bluetooth (BT), global Navigation Satellite Systems (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the electronic device is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device implements the display function through the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, N being a positive integer greater than 1.
A series of graphical user interfaces (GUIs) may be displayed on the display screen 194 of the electronic device; these GUIs constitute the home screen of the electronic device. Generally, the size of the display screen 194 is fixed, and only a limited number of controls can be displayed on it. A control is a GUI element: a software component embedded in an application program that governs all the data the application processes and the interactive operations on that data. A user can interact with a control through direct manipulation to read or edit the information of the application. Generally, controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
The electronic device may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, and the photosensitive element converts the optical signal into an electrical signal and transmits it to the ISP for processing. The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device selects a frequency point, the digital signal processor is used for performing fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device may support one or more video codecs. Thus, the electronic device can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, and can also learn continuously by itself. Applications such as intelligent cognition of the electronic device, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area can store data (such as audio data, phone book and the like) created in the using process of the electronic device. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device answers a call or voice information, it can answer the voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The electronic device may be provided with at least one microphone 170C. In other embodiments, the electronic device may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device detects the intensity of the touch operation via the pressure sensor 180A. The electronic device may also calculate the position of the touch from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction to view the SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction to create a new SMS message is executed.
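The intensity-dependent dispatch just described can be sketched as follows. This is a minimal illustration only: the threshold value and the instruction names are hypothetical, not taken from the embodiment.

```python
# Sketch of the pressure sensor 180A behavior: the same touch position
# yields different instructions depending on touch intensity.
# FIRST_PRESSURE_THRESHOLD and the returned instruction names are
# illustrative assumptions.

FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure units (hypothetical)

def dispatch_touch_on_sms_icon(pressure: float) -> str:
    """Return the instruction executed for a touch on the SMS application icon."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"   # lighter press: view the message
    return "new_sms"        # firmer press: create a new message

print(dispatch_touch_on_sms_icon(0.3))  # view_sms
print(dispatch_touch_on_sms_icon(0.8))  # new_sms
```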
The gyroscope sensor 180B may be used to determine the motion posture of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyroscope sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the electronic device, calculates the distance the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation and somatosensory gaming scenarios.
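The shake-angle-to-compensation-distance calculation mentioned above can be illustrated with a simple pinhole-camera model. This model is an assumption for illustration; the patent does not specify the actual optical model used.

```python
import math

def lens_compensation_mm(shake_angle_deg: float, focal_length_mm: float) -> float:
    """Distance the lens module must move (in the opposite direction) to
    cancel an angular shake, using the pinhole relation: image shift ~= f * tan(theta)."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))

# A 0.5 degree shake with a 26 mm (equivalent) focal length:
print(round(lens_compensation_mm(0.5, 26.0), 3))  # 0.227
```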
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
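The pressure-to-altitude conversion can be sketched with the international barometric formula, a common approximation for this purpose; the patent does not specify which formula the device uses.

```python
def altitude_m(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    """Estimate altitude from barometric pressure using the international
    barometric formula (an assumed, commonly used approximation)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(round(altitude_m(1013.25)))  # 0 (standard sea-level pressure)
print(round(altitude_m(899.0)))    # roughly 1 km up
```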
The magnetic sensor 180D includes a Hall sensor. The electronic device may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device is a flip phone, it may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking on flip-open according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E can detect the magnitude of the electronic device's acceleration in various directions (typically along three axes). When the electronic device is at rest, the magnitude and direction of gravity can be detected. It can also be used to recognize the posture of the electronic device, and is applied in landscape/portrait switching, pedometers, and similar applications.
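The landscape/portrait decision enabled by the acceleration sensor can be sketched by comparing the gravity components along the screen's two axes. The axis conventions here are assumptions for illustration.

```python
def screen_orientation(ax: float, ay: float) -> str:
    """Classify portrait vs. landscape from the gravity components measured
    along the screen's short (ax) and long (ay) axes, in m/s^2."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(screen_orientation(0.1, 9.7))  # portrait  (gravity along the long axis)
print(screen_orientation(9.7, 0.2))  # landscape (device turned on its side)
```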
The distance sensor 180F is used to measure distance. The electronic device may measure distance by infrared or laser. In some embodiments, in a photographing scene, the electronic device may use the distance sensor 180F to measure distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device; when insufficient reflected light is detected, the electronic device may determine that there is no object nearby. Using the proximity light sensor 180G, the electronic device can detect that it is being held by a user close to the ear for a call, and automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The electronic device may adaptively adjust the brightness of the display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device is in a pocket to prevent accidental touches.
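The brightness adaptation driven by the ambient light sensor can be sketched as a mapping from illuminance to a backlight level. The logarithmic curve and the constants below are hypothetical; the patent only states that the display brightness adapts to the perceived ambient light level.

```python
import math

def display_brightness(ambient_lux: float, min_level: int = 10,
                       max_level: int = 255, max_lux: float = 10000.0) -> int:
    """Map ambient illuminance to a backlight level on a logarithmic scale,
    roughly matching the eye's response. Curve and constants are assumed."""
    if ambient_lux <= 1.0:
        return min_level
    frac = min(math.log10(ambient_lux) / math.log10(max_lux), 1.0)
    return int(min_level + frac * (max_level - min_level))

print(display_brightness(1))      # 10  (dark room -> dimmest)
print(display_brightness(10000))  # 255 (bright daylight -> brightest)
```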
The fingerprint sensor 180H is used to collect a fingerprint. The electronic equipment can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device implements a temperature-handling strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In other embodiments, when the temperature is below yet another threshold, the electronic device boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
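The threshold-based temperature strategy above can be sketched as a simple policy function. The threshold values below are hypothetical placeholders; the patent names the thresholds but not their values.

```python
def thermal_action(temp_c: float, high: float = 45.0,
                   low: float = 0.0, very_low: float = -10.0) -> str:
    """Select a temperature-handling strategy like the one described for
    sensor 180J. Threshold values are illustrative assumptions."""
    if temp_c > high:
        return "throttle_cpu"    # reduce nearby processor performance
    if temp_c < very_low:
        return "boost_battery"   # boost the battery 142 output voltage
    if temp_c < low:
        return "heat_battery"    # heat the battery 142
    return "normal"

print(thermal_action(50))   # throttle_cpu
print(thermal_action(-5))   # heat_battery
print(thermal_action(-20))  # boost_battery
```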
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device at a different position than the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human voice vibrating a bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the electronic device by being inserted into or pulled out of the SIM card interface 195. The electronic device can support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and with external memory cards. The electronic device implements functions such as calls and data communication through the interaction between the SIM card and the network. In some embodiments, the electronic device employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device and cannot be separated from it.
In addition, an operating system runs on the above components, such as the iOS operating system developed by Apple, the Android open-source operating system developed by Google, or the Windows operating system developed by Microsoft. Applications can be installed and run on the operating system.
The operating system of the electronic device may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of an electronic device.
Fig. 2 is a diagram illustrating a software structure of an electronic device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, a framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The framework layer includes some predefined functions. As shown in FIG. 2, the framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, and so on. The notification manager may also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light flashes.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the framework layer as binary files. The virtual machine is used to perform functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
Although the Android system is taken as an example in the embodiment of the present application for description, the basic principle is also applicable to electronic devices based on operating systems such as iOS and Windows.
The electronic device provided in the embodiment of the application takes photos through the camera. For example, the electronic device may take photos through a single camera or through multiple cameras, which is not limited in this application. Taking photos through a single camera means that the electronic device shoots through one camera 193; taking photos through multiple cameras means that the electronic device shoots through a plurality of cameras 193 (for example, two cameras, which may be a main camera and a portrait camera, or a wide-angle camera and a telephoto camera). The embodiment of the application does not specifically limit whether shooting is performed by a single camera or by multiple cameras; a skilled person can configure the electronic device to select single-camera or multi-camera shooting according to the actual scene.
When a user performs a touch operation on the touch sensor 180K, the touch sensor 180K may acquire the touch operation of the user and report the touch operation to the processor 110, and after receiving the touch operation sent by the touch sensor 180K, the processor 110 may start an application corresponding to the touch operation in response to the touch operation.
As shown in fig. 3A, fig. 3A illustrates a schematic diagram of a user opening the camera application. For example, when the user opens the camera by a touch operation, the touch sensor 180K may receive the user's touch operation on the camera icon 301 and report it to the processor 110. After receiving the touch operation on the camera icon 301, the processor 110 may, in response, start the application corresponding to the camera icon 301 (referred to as the camera application for short) and display the camera's shooting preview interface on the display screen 194.
In addition, in the embodiment of the present application, the electronic device may start the camera application in other manners, and display a shooting preview interface of the camera on the display screen 194. For example, a user issues a camera opening voice instruction, the electronic device may report the received camera opening voice instruction issued by the user to the processor 110 through the microphone 170C, and after receiving the camera opening voice instruction issued by the user, the processor 110 may start the camera application in response to the user voice instruction and display a shooting preview interface of the camera application on the display screen 194. Still alternatively, the user may store a shortcut instruction to open the camera in the electronic device in advance, for example, the shortcut instruction to open the camera may be set as an operation of sliding upward at a lower end of the screen of the electronic device. When a user triggers a shortcut instruction for opening a camera, the touch sensor 180K may receive the shortcut operation instruction for opening the camera of the user and report the shortcut operation instruction to the processor 110, and after receiving the shortcut operation instruction, the processor 110 may respond to the shortcut operation instruction to start an application corresponding to the camera icon 301 and display a shooting preview interface of the camera on the display screen 194.
In some embodiments, the camera application may automatically enter the photographing mode after the processor 110 starts it. As shown in fig. 3B, the shooting preview interface of the camera application may include various functional controls such as a mode selection control 302 and a camera setting control 303. The touch sensor 180K may receive the user's touch operation on a functional control and report it to the processor 110, so that the processor 110 controls the camera application to enter the corresponding interface in response to the touch operation. In some examples, the user may trigger the camera application to enter different shooting modes through the mode selection control 302, such as a night mode and a telephoto mode, as well as a rainy mode, a snowy mode, and the like, and may enable functions such as filters, watermarks, and panorama through the camera setting control 303. For example, the user may set the initial mode of the camera application to the photographing mode through the camera setting control 303, so that when the user opens the camera application again, it automatically enters the photographing mode. The initial mode may also be set to the video recording mode through the camera setting control 303, so that when the user opens the camera application again, it automatically enters the video recording mode.
In some examples, the camera application automatically enters the photographing mode after the user opens it. The user can trigger a photo-taking operation by touching the shooting key 304 in the shooting preview interface of the camera application. When the user triggers the photo-taking operation, the touch sensor 180K can acquire it and report it to the processor 110, and the processor 110 can control the camera application to take a photo and can acquire the photographing time and photographing place of the operation. The photographing time may be the time at which the user touches the shooting key 304 to trigger the operation; the photographing place may be the geographical position of the electronic device at that moment. Further, the user may set a delayed photographing time for the camera application through the camera setting control 303; for example, the delayed photographing time may be set to 3 seconds, 5 seconds, 10 seconds, and so on, where delayed photographing means that the photo is taken a certain time after the photo-taking operation is triggered.
For example, take a delay of 3 seconds. First, the user sets the delayed photographing time of the electronic device to 3 seconds through the camera setting control 303; the electronic device acquires this setting operation on the screen through the touch sensor 180K and transmits it to the processor 110, which stores the 3-second delay set by the user. When the user triggers a photo-taking operation, the touch sensor 180K acquires it and reports it to the processor 110, and the processor 110 controls the camera application's countdown. For example, fig. 3C shows a countdown schematic diagram of the camera application during photographing: the processor 110 can display the camera application's countdown numbers on the screen of the electronic device, so that the user can visually follow the countdown. When the processor 110 determines that the 3-second countdown has finished, that is, the countdown has reached the delay time set by the user, the processor 110 can control the camera application to take the photo. By setting a delay time in this way, the problem that there is no other person available to take the photo in some scenarios can be solved, where the other person refers to someone who does not need to appear in the photo.
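The delayed-photographing flow described above can be sketched as a countdown that reports each remaining second (to drive the on-screen numbers of fig. 3C) and then takes the photo. The function and parameter names are illustrative, not from the patent; `tick` and `sleep` are injectable so the sketch can be driven without real delays.

```python
import time

def delayed_capture(delay_s: int, capture, tick=None, sleep=time.sleep):
    """Count down delay_s seconds, reporting each remaining second via
    tick, then invoke capture() once the configured delay is reached."""
    for remaining in range(delay_s, 0, -1):
        if tick:
            tick(remaining)  # e.g. draw "3", "2", "1" on the screen
        sleep(1)
    capture()                # delay reached: take the photo

events = []
delayed_capture(3, capture=lambda: events.append("shot"),
                tick=events.append, sleep=lambda _: None)
print(events)  # [3, 2, 1, 'shot']
```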
In some embodiments, in addition to triggering the photographing operation by touching the shooting key 304, the camera application may trigger it by gesture photographing, voice photographing, and the like. Gesture photographing means that the camera application triggers the photographing operation by recognizing a photographing gesture set by the user, and voice photographing means that the camera application triggers the photographing operation by recognizing a voice set by the user. Specifically, as shown in fig. 3D, after the user clicks the camera setting control 303, the camera application may display a control 305; when the user clicks the photographing-mode control in the control 305, the electronic device may display a control 306 including a gesture photographing control and a voice photographing control. The user may then set, as required, the photographing gesture and/or the user voice for triggering the photographing operation. For example, taking a wave ("bye-bye") gesture set by the user as an example, the user may set the wave gesture in the camera application as the trigger for the photographing operation. When the user needs to photograph, the user makes the wave gesture; the front camera or rear camera of the camera 193 in the electronic device transmits the captured light to the camera's photosensitive element through the lens, the photosensitive element converts the optical signal into an electrical signal and transmits it to the ISP for processing, and the ISP transmits the processed data to the processor 110. The processor 110 may then determine that the user has triggered the photographing operation and control the camera application to take the photo.
For voice photographing, for example, the user sets the recognition voice to "photograph". When the user needs to photograph, the user speaks the word "photograph"; the microphone 170C acquires the user's voice, and the voice is transmitted to the processor 110 after signal processing, so that the processor 110 can determine that the user has triggered the photographing operation and control the camera to take the photo.
While the camera application is taking photos, the user may preview the scene image displayed by the camera application through the display screen 194. Fig. 3E shows a schematic diagram of switching the photographing mode in the camera application. The user can switch the shooting mode during shooting by triggering the mode selection control 302; for example, the shooting mode can be switched to a normal shooting mode, a portrait mode, a sport mode, a video recording mode, and the like. Of course, the user can also switch among the above shooting modes through the mode selection control 302, for example from the rear camera to the front camera, or from the normal shooting mode to a night mode.
Further, in order to prevent a function key on the screen from blocking the scene image displayed by the camera application during photographing and degrading the user experience, an operation triggered by the user can be obtained through the touch sensor 180K and reported to the processor 110, and in some examples the processor 110 can hide the mode selection control 302 of the camera application during photographing according to the user's operation. As shown in fig. 4A, while the user previews the scene image displayed by the camera, the user's operation may be touching the screen: the touch sensor 180K acquires the touch operation and reports it to the processor 110, and the processor 110 hides the mode selection control 302 of the camera application, so that the image is displayed on the screen of the electronic device without being blocked, which improves the user experience.
In some embodiments, some users want to shoot specific scenes. For example, on rainy days, some users want to express a mood in the rainy atmosphere by shooting rain silk or raindrops, or want to create photographic works by shooting rain silk or raindrops.
However, meteorological elements such as raindrops fall quickly, so a user who wants to take a satisfactory photo has certain requirements on the shooting parameters of the electronic device. At present, the shooting parameters corresponding to the normal shooting mode of the electronic device often cannot meet these requirements, so the state of the meteorological elements cannot be captured. Therefore, after previewing the scene image displayed by the camera application on the display screen 194, the user can switch the shooting mode of the camera application to a professional mode by clicking the mode selection control 302, and manually adjust shooting parameters such as the exposure time in the professional mode to capture the state of the meteorological elements, for example rain silk or raindrops, or snowflakes or snow silk. However, in this case the user needs to manually adjust the parameters, the operation is complicated, the user needs a certain amount of shooting experience, and the threshold of photographic creation is high.
In order to solve the problems that the user needs to manually adjust parameters, the operation is complicated, the user needs a certain amount of shooting experience, and the threshold of photographic creation is high, the present application provides a shooting method that can reduce the complicated operations of the user during shooting and lower the threshold of photographic creation.
In order to make the technical solution of the present application clearer and easier to understand, the following describes the shooting method provided by the embodiment of the present application with reference to the above embodiments and the corresponding drawings. The method may be implemented on an electronic device having the structure shown in fig. 1, where the electronic device includes a camera, a microphone, and a display screen. Fig. 5 shows a flow chart of the shooting method. As shown in fig. 5, the shooting method provided in the embodiment of the present application may include:
S501: the camera application is started.
In some examples, when a user needs to use a camera application of the electronic device (e.g., take a picture or record a video, etc.), the user may click on the camera icon 301 shown in fig. 3A, and the electronic device starts the camera application after detecting that the user clicks on the camera icon 301. In other examples, the user may cause the electronic device to launch the camera application in other ways. For example, a voice command or other preset gestures are used, which are not limited in the embodiments of the present application.
S502: and acquiring image information in the preview interface.
The preview interface is used for displaying a preview video stream. After the electronic device starts the camera application, the electronic device can acquire the video stream through the camera, that is, acquire the image information in the preview interface, and the preview scene image can be displayed on the shooting preview interface. In some embodiments, the electronic device may have a single camera and default to a single-channel photographing mode, that is, the electronic device obtains a single video stream through the single camera; in other embodiments, the electronic device may have multiple cameras, default to a multi-channel photographing mode, and acquire multiple video streams through the multiple cameras. For example, the electronic device may obtain a first video stream through a first camera and a second video stream through a second camera, and of course may also obtain a third video stream through a third camera. Taking a two-camera configuration as an example, the first camera may be the main camera that captures the scene, and the second camera may be a wide-angle camera or a camera with another function, which is not limited in the embodiment of the present application.
In some embodiments, the electronic device may further receive a switching operation triggered by the user, where the switching operation instructs the electronic device to switch the camera shooting mode. For example, when the camera shooting mode is the single-channel shooting mode, the user may click the camera mode in the control 305 in fig. 3E to make the electronic device display a control 307, click the multi-channel shooting mode control in the control 307 to make the electronic device display a control 308, and then touch the control 308 to trigger the switching operation; the electronic device switches the single-channel shooting mode of the camera application to the multi-channel shooting mode in response. For another example, when the camera shooting mode is the multi-channel shooting mode, the user may trigger the switching operation by clicking the camera mode in the control 305 in fig. 3E, and the electronic device switches the multi-channel shooting mode of the camera application to the single-channel shooting mode in response. The multi-channel shooting mode refers to taking pictures through the cooperation of multiple cameras. The multi-channel photographing mode may include a main-camera-plus-telephoto mode, a main-camera-plus-wide-angle mode, a main-camera-plus-portrait mode, and the like, and the electronic device can switch freely among these modes in response to switching operations triggered by the user so as to meet the user's needs.
S503: and determining the weather scene category of the image to be shot in the preview interface.
The scene category represents the scene of the image to be shot in the shooting preview interface; for example, the scene category may include rainy day, snowy day, cloudy day, sunny day, food, and the like. In some examples, when the electronic device enters the shooting preview interface and is in a shooting mode, the electronic device may automatically detect the scene category of the image from the scene image displayed in the preview interface of the camera application. For example, if the image in the shooting preview interface is food, the electronic device may automatically detect that the scene category of the image is food; for another example, if the image displayed in the shooting preview interface is a rainy day, the electronic device may automatically detect that the scene category of the image is rainy day, and so on.
Fig. 4B shows a schematic diagram of detecting the scene category during photographing, where the scene category of the image is a rainy scene. The camera 193 transmits the light reflected by the captured scene to the photosensitive element; the photosensitive element converts the light into an electrical signal and transmits it to the ISP, which processes it into a digital image and transmits the processed image to the processor 110. Through a scene detection algorithm, the processor 110 may input the image of the scene to be detected into a CNN network model, which outputs the scene category of the image, here a rainy scene. The CNN network model may be obtained by training on image data labeled with scene classification labels, for example data sets such as ImageNet and Places.
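A minimal sketch of the last step of this pipeline, mapping the CNN's per-class scores to a weather scene category, is shown below. The label set, the confidence threshold, and the function name are illustrative assumptions; the trained model itself is stood in for by the `scores` argument.

```python
WEATHER_CLASSES = ["sunny", "cloudy", "rainy", "snowy"]  # hypothetical label set

def classify_scene(scores, classes=WEATHER_CLASSES, threshold=0.5):
    # scores stands in for the CNN's softmax output over the label set.
    best = max(range(len(scores)), key=scores.__getitem__)
    if scores[best] < threshold:
        return None  # low confidence: no weather scene category is reported
    return classes[best]
```

A thresholded argmax like this lets the device fall back to the normal mode when no weather scene is detected with confidence.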
In some other embodiments, while the camera application displays the scene image, the electronic device may further obtain the corresponding time information and the user's geographic location, and determine the weather information at the user's geographic location in that time period from the two. For example, when the camera application displays the scene image, the corresponding time information is obtained as ten a.m., the user is determined to be in Beijing from the obtained geographic location, and the weather in Beijing at ten a.m. can be determined accordingly; if that weather is rain, the category of the image scene displayed in the camera application in that time period is rainy day. In some embodiments, this result may further be combined with the result of the scene detection algorithm for a comprehensive judgment, so that the accuracy of scene judgment can be improved by using the time information and the user's geographic location to assist the detection of the scene category in the image.
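One possible way to combine the two signals can be sketched as follows, under the assumption that both the detector and the time/location weather lookup return a scene label or `None`; the priority rule here is an illustrative choice, not one specified by the text.

```python
def fuse_scene(detected, weather_lookup):
    # Agreement between the CNN detector and the time/location weather
    # lookup gives the highest-confidence answer.
    if detected == weather_lookup:
        return detected
    # If one source abstained, fall back to the other.
    if detected is None:
        return weather_lookup
    if weather_lookup is None:
        return detected
    # On conflict, prefer the detector: it sees the actual framed scene
    # (the user may, for instance, be indoors while it rains outside).
    return detected
```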
S504: and determining the atmosphere grade of the meteorological elements under the weather scene category according to the weather scene category.
The meteorological elements may include rain or snow. The meteorological element atmosphere level refers to the state of the meteorological element that the user wants to photograph, for example raindrops, rain silk, snowflakes, and the like. The meteorological element atmosphere levels may be classified into level 0, level 1, level 2, and level 3; other levels are also possible, which is not limited here.
Further, after determining the weather scene category of the image to be shot in the preview interface, the processor 110 may control the electronic device to display an interactive interface on the display screen 194 and present a meteorological element atmosphere level selection control 401 for the weather scene.
The electronic device can switch to the corresponding meteorological element atmosphere level according to the user's selection operation on the meteorological element atmosphere level selection control 401 and display the switched level. Taking a rainy day as an example, the user touches the number 1 in the rainy day atmosphere level selection control 401 on the display screen 194, that is, the user issues an instruction to switch to rainy day atmosphere level 1; the touch sensor 180K acquires the user's touch instruction and sends it to the processor 110, and the processor 110 controls the electronic device to switch the rainy day atmosphere level to level 1.
In some embodiments, in order to let the user know the shooting effect corresponding to each meteorological element atmosphere level more intuitively, the meteorological element atmosphere level selection control 401 may include a plurality of shooting effect options under the weather scene category and display them in the control 401. The correspondence between shooting effect options and meteorological element atmosphere levels may be preset; it can be understood that different shooting effect options correspond to different meteorological element atmosphere levels. The user may select a target shooting effect option from the plurality of shooting effect options as needed, and the meteorological element atmosphere level corresponding to the target shooting effect option can then be determined according to the correspondence.
Taking a rainy day as an example, as shown in fig. 4C, in some embodiments it is determined that the weather scene category of the image to be shot in the preview interface is a rainy scene, so the meteorological element atmosphere level selection control for the weather scene may be a rainy day atmosphere level selection control. The processor 110 may control the electronic device to display an interactive interface on the display screen 194 and present the rainy day atmosphere level selection control 401. The control 401 may include a plurality of shooting effect options under the rainy scene category, that is, an option for shooting raindrops and options for shooting rain silk, and the user may select a target shooting effect option as needed, thereby determining the corresponding rainy day atmosphere level. Different shooting effect options in the rainy day atmosphere selection control 401 may correspond to different rainy day atmosphere levels; for example, the shoot-raindrops option may correspond to rainy day atmosphere level 1, the shoot-rain-silk option to level 2, and the shoot-long-rain-silk option to level 3.
The foregoing takes a rainy day as an example; taking a snowy day as an example instead, the weather scene category of the image to be shot in the preview interface is determined to be a snowy scene, the meteorological element atmosphere level selection control for the weather scene may be a snowy day atmosphere level selection control 401, and the processor 110 may control the electronic device to display an interactive interface on the display screen 194 and present the snowy day atmosphere level selection control 401. The snowy day atmosphere level selection control may include a plurality of shooting effect options under the snowy scene category, that is, an option for shooting snowflakes and options for shooting snow silk, where different shooting effect options in the control 401 may correspond to different snowy day atmosphere levels; for example, the shoot-snowflakes option may correspond to snowy day atmosphere level 1, the shoot-snow-silk option to level 2, and the shoot-long-snow-silk option to level 3.
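The option-to-level correspondences for the rainy and snowy scenes described above amount to a preset lookup table; a sketch is given below, where the levels follow the text and the dictionary key names are assumed for illustration.

```python
# Preset correspondence between shooting effect options and atmosphere levels
# (levels taken from the text; key names are illustrative assumptions).
ATMOSPHERE_LEVEL = {
    "rainy": {"raindrops": 1, "rain_silk": 2, "long_rain_silk": 3},
    "snowy": {"snowflakes": 1, "snow_silk": 2, "long_snow_silk": 3},
}

def level_for_option(scene, option):
    # Look up the atmosphere level for the user's target shooting effect option.
    return ATMOSPHERE_LEVEL[scene][option]
```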
In some embodiments, in order to further reduce the complicated operations of the user during photographing, besides letting the user select the meteorological element atmosphere level autonomously as above, the electronic device may detect the level of the meteorological element in the weather scene and then automatically switch to the corresponding meteorological element atmosphere level according to a preset correspondence between the detected meteorological element level in the weather scene and the meteorological element atmosphere level. This can reduce user operations.
Further, after determining the scene category of the image, the electronic device may detect the level of the meteorological element in the weather scene through an intelligent algorithm; for example, the meteorological element level in a rainy scene may be detected as light rain, moderate rain, or heavy rain, and the meteorological element level in a snowy scene as light snow, moderate snow, or heavy snow.
In other embodiments, while the image to be shot is determined, the weather information of the geographic location of the electronic device at the corresponding time, which may also be referred to as first weather information, may be determined by obtaining the time information and the user's geographic location, and the meteorological element level at the user's geographic location at that time may be determined from it. For example, when the camera application displays the image to be shot, the corresponding time information is obtained as ten a.m., the user is determined to be in Beijing from the obtained geographic location, and the weather in Beijing at ten a.m. can be determined accordingly. If the weather in Beijing at ten a.m. is heavy rain, it can be determined that the weather at the user's geographic location in that time period is heavy rain, that is, the category of the image scene displayed in the camera application in that time period is heavy rain. In this way, the magnitude of the meteorological element level in the weather scene can be measured.
In some embodiments, in order to make the detection result more accurate, the result of detecting the scene category in the image with the scene detection algorithm may be combined with the first weather information for a comprehensive judgment, so that the magnitude of the meteorological element level in the weather scene can be detected more accurately and the electronic device can automatically switch to the corresponding meteorological element atmosphere level.
Further, the correspondence between the meteorological element level in the weather scene and the meteorological element atmosphere level may be preset. For example, the correspondence between the rain level in a rainy scene and the rainy day atmosphere level may be preset so that the heavier the detected rain, the higher the rainy day atmosphere level the electronic device automatically switches to: if the detected rain level in the rainy scene is light rain, the electronic device automatically switches to rainy day atmosphere level 1; if it is moderate rain, to level 2; and if it is heavy rain, to level 3. In this way, the rainy day atmosphere level can be determined according to the preset correspondence between the rain level in the rainy scene and the rainy day atmosphere level.
Of course, the above embodiment is illustrated with rain; it may also be other weather such as snow, and the correspondence between the snow level in a snowy scene and the snowy day atmosphere level may be preset. For example, if the detected snow level in the snowy scene is light snow, the electronic device automatically switches to snowy day atmosphere level 1; if it is moderate snow, to level 2; and if it is heavy snow, to level 3. In this way, the snowy day atmosphere level can be determined according to the preset correspondence between the snow level in the snowy scene and the snowy day atmosphere level.
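The automatic switching described in this and the preceding paragraph reduces to a second preset table, from the detected intensity of the meteorological element to the atmosphere level; the names below are illustrative assumptions.

```python
# Preset correspondence between the detected meteorological element level
# and the atmosphere level the device switches to automatically.
AUTO_LEVEL = {
    "rainy": {"light": 1, "moderate": 2, "heavy": 3},
    "snowy": {"light": 1, "moderate": 2, "heavy": 3},
}

def auto_atmosphere_level(scene, intensity):
    # The heavier the detected rain or snow, the higher the level chosen.
    return AUTO_LEVEL[scene][intensity]
```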
S505: and determining shooting parameters according to the atmosphere grade of the meteorological elements.
In some embodiments, the shooting parameters may include an exposure time, where the exposure time is the time for which the shutter is opened in order to project light onto the photosensitive surface of the photographic photosensitive material. The exposure time affects the amount of light entering the picture and can be adjusted by setting the shutter speed: the slower the shutter speed, the longer the exposure time, the more light enters the lens, and the brighter the corresponding picture.
Further, the electronic device may set a default exposure time in the normal mode and preset the correspondence between different meteorological element atmosphere levels and the exposure time. For example, in a rainy scene, in response to the user selecting the shoot-raindrops option among the plurality of shooting effect options, the corresponding rainy day atmosphere level may be determined to be level 1, and the electronic device may adjust the shutter speed to increase the exposure time by 30% on the basis of the current default exposure time; in response to the user selecting the shoot-rain-silk option, the rainy day atmosphere level may be determined to be level 2 and the exposure time increased by 60%; and in response to the user selecting the shoot-long-rain-silk option, the rainy day atmosphere level may be determined to be level 3 and the exposure time increased by 90%. For another example, in a snowy scene, in response to the user selecting the shoot-snowflakes option, the electronic device may determine the corresponding snowy day atmosphere level to be level 1 and adjust the shutter speed to increase the exposure time by 30% on the basis of the current default exposure time; in response to the user selecting the shoot-snow-silk option, the snowy day atmosphere level may be determined to be level 2 and the exposure time increased by 60%; and in response to the user selecting the shoot-long-snow-silk option, the snowy day atmosphere level may be determined to be level 3 and the exposure time increased by 90%.
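The level-to-exposure adjustment described above (30%/60%/90% increases over the default) can be sketched as follows. The function and parameter names are illustrative assumptions; in practice the increase would be realized by slowing the shutter speed.

```python
# Fractional exposure-time increase per atmosphere level, from the text.
EXPOSURE_INCREASE = {1: 0.30, 2: 0.60, 3: 0.90}

def adjusted_exposure(default_exposure_s, level):
    # Level 0 (or an unknown level) leaves the default exposure unchanged.
    return default_exposure_s * (1.0 + EXPOSURE_INCREASE.get(level, 0.0))
```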
In some embodiments, to meet the user's needs, the presented shooting effect options for the meteorological elements may be further refined so that more specific states of the meteorological elements can be photographed according to the user's selection.
For example, taking a rainy scene as an example, the user may perform a rain category selection operation as needed and autonomously select the rain category as raindrops or rain silk; the electronic device may present different shooting effect options corresponding to the selected rain category, determine the rainy day atmosphere level according to the selection operation on a shooting effect option under that rain category, and adjust the exposure time.
For example, the longer the exposure time, the longer and denser the captured rain silk. If the user selects shooting rain silk, the electronic device may present different shooting effect options under the rain silk option, for example a shoot-short-rain-silk option, a shoot-long-rain-silk option, and the like, and may preset the relationship between the different shooting effect options under the rain silk option and the rainy day atmosphere level, as well as the adjustment relationship between the rainy day atmosphere level and the exposure time in the rain silk scene.
Specifically, in response to the user selecting the shoot-short-rain-silk option, the rainy day atmosphere level can be determined to be level 1, and the electronic device can adjust the shutter speed to increase the exposure time by 60% on the basis of the current default exposure time; in response to the user selecting the shoot-rain-silk option, the rainy day atmosphere level can be determined to be level 2 and the exposure time increased by 75%; and in response to the user selecting the shoot-long-rain-silk option, the rainy day atmosphere level can be determined to be level 3 and the exposure time increased by 90%. It should be noted that the user may also manually switch the shooting effect option to switch the rainy day atmosphere level. For example, if the rainy day atmosphere level is determined to be level 3, the exposure time is increased by 90% on the basis of the current default exposure time; if the user then clicks the shoot-rain-silk option to switch the rainy day atmosphere level to level 2, the electronic device adjusts the increase to 75% of the current default exposure time. Here the current default exposure time refers to the default exposure time set by the electronic device in the normal mode.
The above is an example in which the user photographs rain silk. If the user instead selects photographing raindrops, the shorter the exposure time, the sparser the captured raindrops; compared with photographing rain silk, therefore, the up-adjustment amplitude of the exposure time needs to be reduced appropriately.
Specifically, if the user selects shooting raindrops, the electronic device may present different shooting effect options under the raindrop option, for example a shoot-short-raindrop option, a shoot-long-raindrop option, and the like, and may preset the relationship between the different shooting effect options under the raindrop option and the rainy day atmosphere level, as well as the adjustment relationship between the rainy day atmosphere level and the exposure time in the raindrop scene. Further, in response to the user selecting the shoot-short-raindrop option, the electronic device may determine the rainy day atmosphere level to be level 1; in response to the user selecting the shoot-raindrop option, it may determine the rainy day atmosphere level to be level 2 and adjust the shutter speed to increase the exposure time by 20% on the basis of the current default exposure time; and in response to the user selecting the shoot-long-raindrop option, it may determine the rainy day atmosphere level to be level 3 and increase the exposure time by 30%.
The above describes the rain categories in a rainy scene as an example, which is only one possible implementation provided by the present application; a person skilled in the art may process other scenes, such as snowflakes or snow silk in a snowy scene, in the same way based on the same principle according to actual needs.
It should be noted that the longer the exposure time, the more light enters the lens and the brighter the corresponding picture. In some embodiments, in order to avoid an overly long exposure time making the picture too bright, the user could manually lower the sensitivity so that the overall exposure brightness of the image is suitable; however, manual adjustment of the sensitivity increases the user's operations and makes the shooting process more complicated. Therefore, in another embodiment provided by the present application, the shooting parameters may include both the exposure time and the sensitivity, and the electronic device may automatically lower the sensitivity while or after adjusting the exposure time. For example, the adjustment relationship between the exposure time and the sensitivity may be set so that the down-adjustment amplitude of the sensitivity is the same as the up-adjustment amplitude of the exposure time, making the overall exposure brightness of the image suitable: if the exposure time is increased by 30% on the basis of the current default exposure time, the sensitivity is decreased by 30% on the basis of the current sensitivity. In this way, the electronic device automatically lowers the sensitivity according to the increased exposure time, so the user does not need to adjust the sensitivity manually, and user operations are reduced.
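The automatic sensitivity compensation described above, where the sensitivity is lowered by the same fraction the exposure time was raised, can be sketched as follows; `default_iso` and the function name are assumed for illustration.

```python
def compensated_iso(default_iso, exposure_increase):
    # Lower the sensitivity by the same fractional amplitude as the
    # exposure-time increase (per the text): +30% exposure -> -30% ISO.
    return default_iso * (1.0 - exposure_increase)
```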
Based on the above description, the present application provides a shooting method which, after entering the preview interface, may determine the weather scene category of the image to be shot in the preview interface, present a plurality of shooting effect options under the determined weather scene category, and determine, according to a selection operation on a target shooting effect option among the plurality of shooting effect options, the preset shooting parameters corresponding to the target shooting effect option; in response to a trigger operation on the shooting control, shooting is performed with those shooting parameters. The shooting parameters corresponding to each of the plurality of shooting effect options can be preset, and the electronic device can determine the preset shooting parameters corresponding to the target shooting effect option according to the user's selection operation on it, thereby realizing automatic adjustment of the shooting parameters.
In some embodiments, a user may choose a dark background when shooting weather elements in a specific weather scene; for example, when shooting rain streaks or raindrops, the reflections of the raindrops in the air set off the details of the rain against the dark background. However, actively selecting a background in this way places certain requirements on the shooting environment. For example, a user who wants to shoot rain may be unable to find a dark background due to factors such as the environment and the weather. Therefore, on the basis of the above-mentioned embodiment, as shown in fig. 6, the shooting method provided in the embodiment of the present application may further include:
s601: and responding to the triggering operation of the shooting control, and shooting by using the shooting parameters to obtain a first picture.
After the exposure time is adjusted, the user may click the control 307 in the shooting preview interface to trigger photo taking, and after the electronic device detects this operation it may start taking the photo. In some embodiments, the user may also trigger photo taking by gesture or by voice: when the electronic device detects the photo-taking gesture or the sound set by the user, the photo-taking operation is triggered.
Further, in response to the triggering operation on the shooting control, the electronic device shoots using the shooting parameters to obtain the first photo.
In some embodiments, different filters may be preset for different shooting effect options. A preset filter corresponding to the target shooting effect option may be called according to the user's selection of that option among the plurality of shooting effect options, and in response to the triggering operation on the shooting control, the electronic device may shoot using both the shooting parameters and the filter corresponding to the target shooting effect option. In this way the user does not need to select a filter manually, which reduces user operations and helps the user create better photographic works.
S602: and carrying out segmentation processing on the first photo to obtain a second photo.
After obtaining the first photo, the electronic device may apply a segmentation algorithm to separate the background image and the weather elements of the first photo, and may then adjust the brightness of the background image to obtain the second photo.
In some embodiments, a deraining algorithm within the segmentation step uses a deep neural network to learn the mapping from a rainy image to a rain-free image, producing a version of the first photo that retains only the background image and no rain streaks or raindrops. Subtracting this derained version from the original first photo then yields an image that retains only the rain streaks or raindrops. The brightness of the background image can be lowered and/or the brightness of the rain raised to further highlight the rain streaks or raindrops; thus, after the first photo has been processed by the segmentation algorithm, the brightness of the background image and/or the rain can be adjusted to obtain the second photo. In this way, the electronic device can separate the rain from the background in the photo and adjust the brightness of the background image, and the user does not need to find a dark background on their own.
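The subtraction-and-recombination steps just described can be sketched with NumPy. The deraining network itself is out of scope here: `derained` is assumed to be the background-only output of such a model, and the gain values are illustrative, not from the patent.

```python
import numpy as np

def emphasize_rain(original: np.ndarray, derained: np.ndarray,
                   bg_gain: float = 0.7, rain_gain: float = 1.5) -> np.ndarray:
    """Recombine a photo so the rain stands out, following the text:
    the rain layer is the difference between the original photo and the
    derained (background-only) photo; the background is darkened and the
    rain brightened before the two layers are merged again.

    `original` and `derained` are float arrays with values in [0, 1];
    `derained` is assumed to come from a deep deraining model (not shown).
    """
    rain_layer = np.clip(original - derained, 0.0, 1.0)
    background = np.clip(derained * bg_gain, 0.0, 1.0)  # lower background brightness
    rain = np.clip(rain_layer * rain_gain, 0.0, 1.0)    # raise rain brightness
    return np.clip(background + rain, 0.0, 1.0)
```

With the defaults, a pixel that differs between the two inputs (i.e. a rain pixel) is boosted while pure-background pixels are uniformly dimmed.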
It should be noted that the foregoing description uses a rainy scene as an example and is only one possible implementation provided by the present application; a person skilled in the art may apply the same processing to other scenes, such as a snowy scene, based on the same principle and according to actual needs.
In some embodiments, the electronic device may enhance image contrast with a contrast algorithm based on histogram equalization. First, the gray-level probability density distribution of the image is computed and accumulated into a cumulative probability density function; the cumulative function is then normalized to the gray-value range of the image and rounded. Finally, the original image is remapped through the normalized, rounded cumulative function. In this way, after the segmentation processing, the contrast between the weather elements and the background in the first photo can be improved automatically.
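The equalization steps above can be written in a few lines. This is a minimal grayscale sketch; the patent does not specify an implementation, so the 8-bit range and the NumPy routines are assumptions.

```python
import numpy as np

def equalize_histogram(gray: np.ndarray) -> np.ndarray:
    """Histogram equalization of an 8-bit grayscale image, following the
    steps in the text: estimate the gray-level probability density,
    accumulate it into a cumulative distribution, normalize the result
    to the 0-255 gray range, round it, and remap the original pixels
    through the resulting lookup table."""
    hist = np.bincount(gray.ravel(), minlength=256)
    pdf = hist / gray.size                      # gray-level probability density
    cdf = np.cumsum(pdf)                        # cumulative probability density
    lut = np.rint(cdf * 255).astype(np.uint8)   # normalize to [0, 255] and round
    return lut[gray]                            # remap the original image
```

Pixels in crowded gray levels are spread apart by the steep parts of the cumulative curve, which is what raises the contrast between weather elements and the background.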
In other embodiments, a user who wants to photograph scenery, buildings, and the like may wish to prevent weather elements such as raindrops, rain streaks, snowflakes, or snow from blocking the subject. The user may select a remove-weather-element shooting effect option; according to this option, the weather element atmosphere level is determined to be 0, and the electronic device may invoke an intelligent algorithm to eliminate the weather elements in the image according to the atmosphere level of 0 selected by the user.
In other embodiments, a user may want to shoot a portrait amid the weather elements to create a better photographic work, but the photo taken after the exposure time is adjusted may be too dark or too bright. If the brightness of the portrait could only be raised by raising the brightness of the whole photo, the photo might become too bright overall, spoiling its appearance. Therefore, an embodiment of the present application may also use a segmentation algorithm based on a deep learning network to separate the person and the background from a single input image. The brightness of the person alone can then be raised or lowered, with no need to adjust the brightness of the whole photo, so the overall look of the photo is not affected.
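The person-only brightness adjustment can be sketched as a masked scaling. The segmentation model that produces the mask is assumed and not shown; the function name and default gain are illustrative.

```python
import numpy as np

def adjust_person_brightness(photo: np.ndarray, person_mask: np.ndarray,
                             gain: float = 1.3) -> np.ndarray:
    """Brighten (gain > 1) or darken (gain < 1) only the person in a
    photo, as the text describes. `person_mask` is a boolean map from a
    segmentation model (assumed) marking person pixels; brightness is
    scaled there and left untouched elsewhere, so the overall look of
    the photo is preserved. `photo` holds float values in [0, 1]."""
    out = photo.copy()
    out[person_mask] = np.clip(out[person_mask] * gain, 0.0, 1.0)
    return out
```

Only the masked region changes; a background pixel passes through unmodified.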
S603: and determining a filter corresponding to the second picture, and replacing the filter for the second picture.
To give the user's photographic work a stronger atmosphere, the electronic device may also replace the filter of the second photo. In some embodiments, the electronic device may offer a plurality of filters for the user to choose from; when the user issues a filter selection instruction, the electronic device receives it, determines the corresponding filter according to the instruction, and then replaces the filter of the second photo.
In other embodiments, the electronic device may detect the brightness of the second photo and automatically select a filter based on it. For example, if the brightness of the second photo is detected to be below a brightness threshold, a brighter and/or deeper-toned filter may be chosen for the second photo from the plurality of filters; correspondingly, if the brightness is detected to be above the threshold, a darker and/or lighter-toned filter may be chosen.
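The brightness-threshold rule above reduces to a simple selection function. The filter names and the threshold value are illustrative assumptions, not values from the patent.

```python
def pick_filter(mean_brightness: float, threshold: float = 0.5) -> str:
    """Choose between two hypothetical filters based on the measured mean
    brightness of the second photo (0 = black, 1 = white), mirroring the
    rule in the text: a brighter, deeper-toned filter for dark photos and
    a darker, lighter-toned filter for bright ones."""
    if mean_brightness < threshold:
        return "bright_deep_tone"   # lift a photo that is too dark
    return "dark_light_tone"        # tame a photo that is too bright
```

In a real pipeline `mean_brightness` might be the mean of the luminance channel of the second photo.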
The technical solution of this embodiment, in essence or in the part contributing to the prior art, may be embodied wholly or partly in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the method described in the embodiments. The aforementioned storage medium includes media that can store program code, such as flash memory, a removable hard drive, read-only memory, random access memory, or a magnetic or optical disk.
The above description covers only embodiments of the present application, but the protection scope of the present application is not limited thereto; any change or substitution within the technical scope of the present disclosure shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A shooting method, applied to an electronic device, characterized by comprising the following steps:
determining the weather scene category of an image to be shot in a preview interface;
presenting a plurality of shooting effect options under the weather scene category, wherein each shooting effect option corresponds to a specific shooting effect of the weather elements under the weather scene category;
according to the selection operation of a target shooting effect option in the plurality of shooting effect options, determining preset shooting parameters corresponding to the target shooting effect option;
and responding to the triggering operation of the shooting control, and shooting by using the shooting parameters.
2. The method of claim 1, wherein the shooting parameters comprise: the exposure time.
3. The method of claim 2, wherein the shooting parameters comprise: exposure time and sensitivity;
the determining, according to a selection operation of a target shooting effect option of the plurality of shooting effect options, a preset shooting parameter corresponding to the target shooting effect option includes:
and determining exposure time according to the target shooting effect option, and correspondingly adjusting the sensitivity based on the exposure time.
4. The method according to claim 1, wherein after the photographing using the photographing parameters, the method further comprises:
performing segmentation processing on a first picture obtained by shooting by using the shooting parameters to segment a background image and meteorological elements of the first picture;
and adjusting the brightness of the background image of the first photo to obtain a second photo.
5. The method of claim 1, further comprising:
according to the selection operation of a target shooting effect option in the plurality of shooting effect options, determining a preset filter corresponding to the target shooting effect option;
the shooting by using the shooting parameters comprises the following steps:
and responding to the triggering operation of the shooting control, and shooting by using the shooting parameters and the filter.
6. The method according to claim 1, wherein after the photographing using the photographing parameters, the method further comprises:
and in response to a selection operation on a remove-weather-element shooting effect option, removing the weather elements in a first photo, wherein the first photo is obtained by shooting with the shooting parameters.
7. The method of claim 1, wherein the determining the weather scene category in which the image to be captured is located in the preview interface comprises:
determining the weather scene category of the image to be shot in the preview interface based on acquired first weather information, wherein the first weather information is the weather information at the geographical position of the electronic device at the moment the image to be shot is determined.
8. The method according to any one of claims 1 to 7, wherein the weather scene category is a rainy weather scene, the weather element under the weather scene category is rain, and the plurality of shooting effect options are: an option to shoot raindrops and an option to shoot rain streaks.
9. The method according to any one of claims 1 to 7, wherein the weather scene category is a rainy weather scene, the weather element under the weather scene category is rain, and the plurality of shooting effect options are: different shooting effect options for shooting raindrops under the raindrop option, or different shooting effect options for shooting rain streaks under the rain streak option.
10. A shooting method applied to electronic equipment is characterized by comprising the following steps:
determining the weather scene category of an image to be shot in a preview interface;
detecting weather element atmosphere levels in the weather scene categories;
determining shooting parameters corresponding to the weather element atmosphere level according to the weather element atmosphere level and a preset correspondence between weather element atmosphere levels and shooting parameters;
and in response to the triggering operation of the shooting control, shooting by using the shooting parameters.
11. An electronic device, comprising: a camera, a processor, and a memory;
the camera is used for collecting a video stream;
wherein one or more computer programs are stored in the memory, the one or more computer programs comprising instructions; the instructions, when executed by the processor, cause the electronic device to perform the method of any of claims 1-10.
12. A computer storage medium comprising computer instructions which, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-10.
13. A computer program product, characterized in that, when the computer program product is run on a computer, the computer performs the method according to any of claims 1-10.
CN202211386048.1A 2022-11-07 2022-11-07 Shooting method and electronic equipment Pending CN115623323A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211386048.1A CN115623323A (en) 2022-11-07 2022-11-07 Shooting method and electronic equipment


Publications (1)

Publication Number Publication Date
CN115623323A true CN115623323A (en) 2023-01-17

Family

ID=84879023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211386048.1A Pending CN115623323A (en) 2022-11-07 2022-11-07 Shooting method and electronic equipment

Country Status (1)

Country Link
CN (1) CN115623323A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104243822A (en) * 2014-09-12 2014-12-24 广州三星通信技术研究有限公司 Method and device for shooting images
CN104486558A (en) * 2014-12-31 2015-04-01 厦门美图之家科技有限公司 Video processing method and device for simulating shooting scene
CN106303250A (en) * 2016-08-26 2017-01-04 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN108965699A (en) * 2018-07-02 2018-12-07 珠海市魅族科技有限公司 Parameter regulation means and device, terminal, the readable storage medium storing program for executing of reference object
CN111050081A (en) * 2019-12-27 2020-04-21 维沃移动通信有限公司 Shooting method and electronic equipment
CN111654635A (en) * 2020-06-30 2020-09-11 维沃移动通信有限公司 Shooting parameter adjusting method and device and electronic equipment
CN114257730A (en) * 2020-09-22 2022-03-29 阿里巴巴集团控股有限公司 Image data processing method and device, storage medium and computer equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination