CN115802145A - Shooting method and electronic equipment - Google Patents

Shooting method and electronic equipment

Info

Publication number: CN115802145A
Application number: CN202111049688.9A
Authority: CN (China)
Prior art keywords: zoom magnification, preview image, frame, view frame, camera
Legal status: Pending (the listed status is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 林嵩晧, 林于超
Current and original assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority application: CN202111049688.9A
Related PCT application: PCT/CN2022/112456 (WO2023035868A1)

Classifications

    • H — Electricity
        • H04 — Electric communication technique
            • H04N — Pictorial communication, e.g. television
                • H04N23/00 — Cameras or camera modules comprising electronic image sensors; control thereof
                    • H04N23/60 — Control of cameras or camera modules
                        • H04N23/63 — Control by using electronic viewfinders
                        • H04N23/67 — Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a shooting method and an electronic device, relating to the field of electronic technologies, which can improve the user's shooting experience at high zoom magnifications. The method comprises the following steps: the electronic device receives and, in response to a first operation, starts a camera at a first zoom magnification and displays a shooting interface comprising a first view frame and a first preview image; receives and, in response to a second operation, adjusts the camera to a second zoom magnification and enlarges the first preview image displayed in the first view frame into a second preview image; and receives and, in response to a third operation, adjusts the camera to a third zoom magnification, displays a third preview image in the first view frame, and displays a second view frame (containing a fourth preview image) within the first view frame. The first preview image is the framed picture at the first zoom magnification, the second preview image is the framed picture at the second zoom magnification, the third preview image is the framed picture at a preset zoom magnification, and the fourth preview image is the framed picture at the third zoom magnification.

Description

Shooting method and electronic equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a shooting method and an electronic device.
Background
With the development of technology, a shooting function has been popularized in electronic devices. When a user shoots a distant object using the electronic device, the shot image needs to be enlarged by zooming so as to make the shot object clearer.
However, in a shooting scene at high magnification, the magnified picture covers a smaller area, so when the camera of the electronic device shakes or the shooting object moves, the user easily loses the shooting object, and the shooting experience is poor.
Disclosure of Invention
The application provides a shooting method and an electronic device, which can improve the user's shooting experience at high zoom magnification. To this end, the following technical solutions are provided:
In a first aspect, the present application provides a shooting method applied to an electronic device comprising a camera. The method includes: the electronic device receives and, in response to a first operation, starts the camera at a first zoom magnification and displays a shooting interface comprising a first view frame and a first preview image; the electronic device receives and, in response to a second operation, adjusts the camera to a second zoom magnification and enlarges the first preview image displayed in the first view frame into a second preview image; the electronic device receives and, in response to a third operation, adjusts the camera to a third zoom magnification, displays a third preview image in the first view frame, and displays a second view frame within the first view frame. The first preview image is the framed picture at the first zoom magnification, the second preview image is the framed picture at the second zoom magnification, and the third preview image is the framed picture at a preset zoom magnification; the fourth preview image, displayed in the second view frame, is the framed picture at the third zoom magnification. The second zoom magnification is greater than the first zoom magnification and less than or equal to the preset zoom magnification; the third zoom magnification is greater than the preset zoom magnification; and the second view frame covers a partial area of the first view frame.
It can be seen that, in the shooting method provided in the embodiments of the present application, the preview image in the first view frame is not enlarged indefinitely as the user increases the zoom magnification; instead, once the zoom magnification exceeds the preset zoom magnification, the further-magnified preview is displayed in the second view frame. Therefore, when the lens shakes or the shooting object moves, the user can easily relocate the shooting object within the first view frame, which improves the user's shooting experience at high magnification.
Optionally, the preset zoom magnification may be a maximum optical zoom magnification of the camera. For example, if the maximum optical zoom magnification of the camera is 10 times, the preset zoom magnification is 10 times.
Optionally, the second preview image may be a portion of the first preview image.
Optionally, if the second zoom magnification is less than the preset zoom magnification, the third preview image is an enlarged portion of the second preview image; if the second zoom magnification equals the preset zoom magnification, the third preview image is identical to the second preview image.
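To make the zoom stages above concrete, here is a minimal sketch, in Kotlin, of how the claimed preview behaviour could be modelled. It is an illustration only: the preset value of 10 and all names (PRESET_ZOOM, PreviewState, previewFor) are assumptions, not identifiers from the patent.

```kotlin
const val PRESET_ZOOM = 10.0   // assumed maximum optical zoom magnification

data class PreviewState(
    val firstFrameZoom: Double,     // magnification of the image shown in the first view frame
    val showSecondFrame: Boolean,   // whether the second view frame is displayed
    val secondFrameZoom: Double?    // magnification of the image in the second view frame
)

fun previewFor(zoom: Double): PreviewState =
    if (zoom <= PRESET_ZOOM) {
        // Optical zoom stage: the first view frame itself is enlarged (first/second preview image).
        PreviewState(firstFrameZoom = zoom, showSecondFrame = false, secondFrameZoom = null)
    } else {
        // Digital zoom stage: the first view frame is held at the preset magnification
        // (third preview image) and the second view frame shows the fourth preview image.
        PreviewState(firstFrameZoom = PRESET_ZOOM, showSecondFrame = true, secondFrameZoom = zoom)
    }

fun main() {
    listOf(1.0, 5.0, 10.0, 20.0, 50.0).forEach { z ->
        println("zoom=$z -> ${previewFor(z)}")
    }
}
```

Running main shows the first view frame's magnification growing until the preset value and then being held, while the second view frame appears and carries the additional magnification.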
In one possible implementation, after the second view frame is displayed in the first view frame, the method may further include: the electronic device receives a fourth operation, where the fourth operation is a shooting operation; and in response to the fourth operation, the electronic device shoots to obtain an image whose content is the same as that of the fourth preview image.
It can be seen that, when the first view frame and the second view frame coexist in the shooting interface, the final image obtained by shooting has the same content as the preview image in the second view frame.
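As a hedged sketch of this claim, the capture path can be reduced to a choice of source region: at or below the preset magnification the final image corresponds to the whole first view frame, and above it to the second view frame's region. Region and captureRegion are illustrative names, not identifiers from the patent.

```kotlin
// Illustrative only: which region of the preset-magnification frame the final image is taken from.
data class Region(val x: Int, val y: Int, val w: Int, val h: Int)

fun captureRegion(zoom: Double, preset: Double, fullFrame: Region, secondFrame: Region): Region =
    if (zoom <= preset) fullFrame   // final image matches the first view frame's preview
    else secondFrame                // final image matches the fourth preview image in the second view frame
```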
In one possible implementation, after the second view frame is displayed in the first view frame, the method may further include: receiving a fifth operation, input by the user, to increase the zoom magnification of the camera; and in response to the fifth operation, increasing the zoom magnification of the camera, shrinking the second view frame, and continuing to display the third preview image in the first view frame.
In one possible implementation, after the second view frame is displayed in the first view frame, the method further includes: receiving a sixth operation, input by the user, to decrease the zoom magnification of the camera; and in response to the sixth operation, decreasing the zoom magnification of the camera, enlarging the second view frame, and continuing to display the third preview image in the first view frame.
In one possible implementation, after the second view frame is displayed in the first view frame, the method further includes: receiving a seventh operation, input by the user, of dragging the second view frame; and in response to the seventh operation, moving the second view frame within the first view frame according to the seventh operation.
It can be seen that, in the shooting method provided by the embodiments of the present application, the user can change the picture framed by the second view frame by moving it. Compared with existing shooting methods, which can only use the center of the view frame as the shooting point, this design gives more flexibility in composing the image and further improves the user's shooting experience. For example, when the user fixes the mobile phone with a tripod or the like, the shooting angle is fixed; if the shooting target moves, the user would otherwise have to move the phone to track it, which is time-consuming and laborious. In the embodiments of the present application, as long as the shooting target is within the first view frame, the user can easily track it simply by moving the second view frame.
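A minimal sketch of the drag behaviour described in this implementation, assuming the second view frame must stay fully inside the first view frame while it is moved; Frame and dragSecondFrame are illustrative names, and the clamping rule is an assumption rather than something the patent specifies.

```kotlin
data class Frame(val x: Double, val y: Double, val w: Double, val h: Double)

// Move the second view frame by the drag delta (dx, dy), clamped so it never
// leaves the first view frame.
fun dragSecondFrame(first: Frame, second: Frame, dx: Double, dy: Double): Frame {
    val nx = (second.x + dx).coerceIn(first.x, first.x + first.w - second.w)
    val ny = (second.y + dy).coerceIn(first.y, first.y + first.h - second.h)
    return second.copy(x = nx, y = ny)
}
```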
In one possible implementation, after the second view frame is displayed in the first view frame, the method further includes: receiving an eighth operation, input by the user, to start object tracking; and in response to the eighth operation, moving the second view frame to follow the object within it.
It can be seen that the user can instruct the electronic device to track the shooting object: when the shooting object moves, the second view frame moves with it, so that the object always remains inside the second view frame. This prevents the shooting object from being lost due to its movement and further improves the user's shooting experience.
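Tracking can reuse the same clamping idea: whenever a detector (out of scope here) reports the tracked object's new centre, the second view frame is re-centred on it without leaving the first view frame. This is a sketch under those assumptions, using the Frame type from the previous sketch; the patent does not prescribe a particular tracking algorithm.

```kotlin
// Re-centre the second view frame on the tracked object's centre (objCx, objCy),
// keeping it inside the first view frame.
fun followObject(first: Frame, second: Frame, objCx: Double, objCy: Double): Frame {
    val nx = (objCx - second.w / 2).coerceIn(first.x, first.x + first.w - second.w)
    val ny = (objCy - second.h / 2).coerceIn(first.y, first.y + first.h - second.h)
    return second.copy(x = nx, y = ny)
}
```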
In one possible implementation, after the second view frame is displayed in the first view frame, the method further includes: reducing the brightness of a target area of the first view frame, where the target area is the area of the first view frame that does not overlap the second view frame.
In another possible implementation, after the second view frame is displayed in the first view frame, the method further includes: blurring the target area of the first view frame.
In yet another possible implementation, after the second view frame is displayed in the first view frame, the method further includes: overlaying a layer on the target area of the first view frame.
When the first view frame and the second view frame coexist in the shooting interface, applying such image processing to the target area of the first view frame helps the user distinguish the target area from the second view frame, further improving the shooting experience.
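As an illustration of the first of these treatments, the sketch below dims every pixel of the first view frame that lies outside the second view frame, operating on a plain ARGB pixel array; the 0.4 dimming factor and all names are assumptions, and blurring or overlaying a layer would simply replace the per-pixel operation.

```kotlin
// Dim the target area: every pixel of the first view frame outside the second view frame.
// `pixels` holds ARGB values row by row; `second` is the second view frame's rectangle
// in the same pixel coordinates (Frame as in the earlier sketches).
fun dimTargetArea(pixels: IntArray, width: Int, second: Frame, factor: Double = 0.4) {
    val height = pixels.size / width
    for (y in 0 until height) {
        for (x in 0 until width) {
            val inside = x >= second.x && x < second.x + second.w &&
                         y >= second.y && y < second.y + second.h
            if (inside) continue            // leave the second view frame untouched
            val i = y * width + x
            val p = pixels[i]
            val a = (p ushr 24) and 0xFF    // keep alpha, scale each colour channel
            val r = (((p ushr 16) and 0xFF) * factor).toInt()
            val g = (((p ushr 8) and 0xFF) * factor).toInt()
            val b = ((p and 0xFF) * factor).toInt()
            pixels[i] = (a shl 24) or (r shl 16) or (g shl 8) or b
        }
    }
}
```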
In a second aspect, the present application provides an electronic device, comprising: at least one processor, a memory, and at least one camera, where the memory and the camera are coupled to the processor, and the memory stores computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to perform the following operations: receiving a first operation; in response to the first operation, starting the camera and displaying a shooting interface, where the zoom magnification of the camera is a first zoom magnification, the shooting interface comprises a first view frame, and the first view frame contains a first preview image, which is the framed picture at the first zoom magnification; receiving a second operation; in response to the second operation, adjusting the camera to a second zoom magnification and enlarging the first preview image displayed in the first view frame into a second preview image, where the second zoom magnification is greater than the first zoom magnification and less than or equal to a preset zoom magnification, and the second preview image is the framed picture at the second zoom magnification; receiving a third operation; in response to the third operation, adjusting the camera to a third zoom magnification, displaying a third preview image in the first view frame, and displaying a second view frame within the first view frame, where the third zoom magnification is greater than the preset zoom magnification, the second view frame covers a partial area of the first view frame, a fourth preview image is displayed in the second view frame, the third preview image is the framed picture at the preset zoom magnification, and the fourth preview image is the framed picture at the third zoom magnification.
Optionally, the preset zoom magnification may be a maximum optical zoom magnification of the camera.
Optionally, the second preview image may be a portion of the first preview image.
Optionally, if the second zoom magnification is less than the preset zoom magnification, the third preview image is an enlarged portion of the second preview image; if the second zoom magnification equals the preset zoom magnification, the third preview image is identical to the second preview image.
In one possible implementation, the processor is further configured to cause the electronic device to perform the following operations: receiving a fourth operation, where the fourth operation is a shooting operation; and in response to the fourth operation, shooting to obtain an image whose content is the same as that of the fourth preview image.
In one possible implementation, the processor is further configured to cause the electronic device to perform the following operations: receiving a fifth operation, input by the user, to increase the zoom magnification of the camera; and in response to the fifth operation, increasing the zoom magnification of the camera, shrinking the second view frame, and continuing to display the third preview image in the first view frame.
In one possible implementation, the processor is further configured to cause the electronic device to perform the following operations: receiving a sixth operation, input by the user, to decrease the zoom magnification of the camera; and in response to the sixth operation, decreasing the zoom magnification of the camera, enlarging the second view frame, and continuing to display the third preview image in the first view frame.
In one possible implementation, the processor is further configured to cause the electronic device to perform the following operations: receiving a seventh operation, input by the user, of dragging the second view frame; and in response to the seventh operation, moving the second view frame within the first view frame according to the seventh operation.
In one possible implementation, the processor is further configured to cause the electronic device to perform the following operations: receiving an eighth operation, input by the user, to start object tracking; and in response to the eighth operation, moving the second view frame to follow the object within it.
In one possible implementation, the processor is further configured to cause the electronic device to perform the following operation: reducing the brightness of a target area of the first view frame, where the target area of the first view frame is the area of the first view frame that does not overlap the second view frame.
In another possible implementation, the processor is further configured to cause the electronic device to perform the following operation: blurring the target area of the first view frame.
In yet another possible implementation, the processor is further configured to cause the electronic device to perform the following operation: overlaying a layer on the target area of the first view frame.
In a third aspect, an embodiment of the present application further provides an electronic device, comprising at least one processor configured to implement the method of the first aspect or any possible implementation thereof when the processor executes program code or instructions.
Optionally, the electronic device may further comprise at least one memory for storing the program code or instructions.
In a fourth aspect, an embodiment of the present application further provides a chip, including an input interface, an output interface, and at least one processor. Optionally, the chip further includes a memory. The at least one processor is configured to execute code in the memory; when the at least one processor executes the code, the chip implements the method described in the first aspect or any possible implementation thereof.
Optionally, the chip may also be an integrated circuit.
In a fifth aspect, an embodiment of the present application further provides a terminal, where the terminal includes the electronic device or the chip.
In a sixth aspect, the present application further provides a computer-readable storage medium for storing a computer program, where the computer program includes instructions for implementing the method described in the first aspect or any possible implementation manner thereof.
In a seventh aspect, embodiments of the present application further provide a computer program product containing instructions, which when executed on a computer, enable the computer to implement the method described in the above first aspect or any possible implementation manner thereof.
The electronic device, computer storage medium, computer program product, and chip provided in these embodiments are all used to execute the shooting method provided above; for their beneficial effects, refer to the beneficial effects of the shooting method, which are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic diagram of the software structure of an electronic device according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a user interface of an electronic device according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a user interface of another electronic device according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a user interface of another electronic device according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a user interface of another electronic device according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a user interface of another electronic device according to an embodiment of the present application;
Fig. 8 is a schematic flowchart of the digital zoom shooting process of the shooting method according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a user interface of another electronic device according to an embodiment of the present application;
Fig. 10 is a schematic flowchart of a shooting method according to an embodiment of the present application.
Detailed Description
The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone.
The terms "first" and "second" and the like in the description and drawings of the present application are used for distinguishing different objects or for distinguishing different processes for the same object, and are not used for describing a specific order of the objects.
Furthermore, the terms "including" and "having", and any variations thereof, as used in the description of the present application, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may include other steps or elements not listed, or those inherent to such a process, method, article, or apparatus.
It should be noted that in the description of the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or illustrations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the description of the present application, the meaning of "a plurality" means two or more unless otherwise specified.
Before describing the embodiments of the present application, some terms referred to in the embodiments of the present application will be explained.
Optical zooming: optical zooming magnifies the image at the physical level through optical refraction in the camera's own design. Because it relies on optical refraction, image quality is not degraded. And because optical zooming is a physical technique, what is seen in this part of the image is what is captured.
Digital zooming: digital zooming magnifies an already-imaged picture at the pixel level through software, enlarging it and filling in pixels. Because the missing pixels are computed by an algorithm, image quality is degraded.
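The quality loss is easy to see in a toy implementation. The sketch below performs digital zoom on a grayscale image held as a 2D array: it crops the central 1/z of the frame and scales it back to full size by nearest-neighbour pixel filling. This is only an illustration of the principle; real pipelines use far more sophisticated interpolation, and the patent itself does not specify one.

```kotlin
// Toy digital zoom: crop the central 1/z of the frame, then enlarge back to the
// original size by copying the nearest source pixel ("point filling").
fun digitalZoom(src: Array<IntArray>, z: Double): Array<IntArray> {
    val h = src.size
    val w = src[0].size
    val ch = (h / z).toInt().coerceAtLeast(1)   // cropped height
    val cw = (w / z).toInt().coerceAtLeast(1)   // cropped width
    val y0 = (h - ch) / 2                       // centre the crop
    val x0 = (w - cw) / 2
    return Array(h) { y ->
        IntArray(w) { x ->
            // No new detail is created: output pixels are copies of existing ones,
            // which is why digital zoom degrades image quality.
            src[y0 + y * ch / h][x0 + x * cw / w]
        }
    }
}
```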
The shooting method provided by the embodiments of the present application can be applied to electronic devices that implement a shooting function, such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPCs), netbooks, and personal digital assistants (PDAs); the embodiments of the present application do not limit the specific type of the electronic device.
Note that the electronic device is equipped with a camera, such as one with a telephoto lens.
Fig. 1 is a schematic structural diagram of an example of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus, and the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface, thereby implementing the touch function of the electronic device 100. MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the touch sensor, the video codec, the GPU, the display screen 194, the application processor, and the like. Such as the shooting process described in the embodiments of the present application.
The ISP is used for processing data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is passed to the ISP to be converted into a digital image signal. The ISP then outputs the digital image signal to the DSP for processing, and the DSP converts it into an image signal in a standard format such as RGB or YUV. It should be understood that the description of the embodiments of the present application takes an RGB image as an example; the embodiments do not limit the image format. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of the flip holster using the magnetic sensor 180D. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on. The temperature sensor 180J is used to detect temperature. The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The bone conduction sensor 180M may acquire a vibration signal. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc. The SIM card interface 195 is used to connect a SIM card.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages. As shown in fig. 2, the application packages may include camera, photo album, music, settings, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, explorer, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which may disappear automatically after a short stay, for example, a message alert notifying the user that a download is complete. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system status bar at the top of the screen, such as notifications from applications running in the background, or notifications in the form of a dialog window on the screen. For example, text information may be prompted in the status bar, or the notification manager may issue a prompt such as vibrating the electronic device or flashing an indicator light.
The Android runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system. The core library comprises two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer may include a hardware driver module, such as a display driver, a camera driver, a sensor driver, and the like, and the application framework layer may call the hardware driver module of the kernel layer.
In the shooting process introduced in the embodiments of the present application, the user opens the camera application, which starts at the application layer in Fig. 2 and sends an instruction to the kernel layer to invoke the camera driver, sensor driver, and display driver, so that the electronic device can start the camera or lens to acquire images. While the camera acquires images, light is transmitted through the lens to the image sensor, which performs photoelectric conversion to turn the light signals into images visible to the user. The output image data is transmitted as a data stream to the system library in Fig. 2, where the three-dimensional graphics processing library and the image processing library perform drawing, image rendering, synthesis, layer processing, and the like to generate a display layer; the surface manager performs fusion processing on the display layer and transmits the result to the content provider, window manager, and view system of the application framework layer, which control the display of the interface. Finally, the preview image is displayed in the image preview area of the camera application on the display screen of the electronic device.
For convenience of understanding, the following embodiments of the present application will describe a photographing process of an electronic device by taking the electronic device having the structure shown in fig. 1 and 2 as an example.
The technical solutions in the following embodiments can be implemented in the electronic device 100 having the above hardware architecture and software architecture.
In the embodiments of the present application, the electronic device 100 is taken to be a mobile phone as an example, and the technical solutions provided are described in detail with reference to the drawings.
Fig. 3 is a schematic diagram of a graphical user interface (GUI) provided in an embodiment of the present application. Fig. 3 (a) illustrates that, with the mobile phone unlocked, the screen display system displays the currently output interface content 301, which is the main interface of the mobile phone. The interface content 301 shows a plurality of applications (apps), such as camera, contacts, phone, messages, and clock. It should be noted that the interface content 301 may also include other applications, which is not limited in the embodiments of the present application.
The user can instruct the mobile phone to start the camera application by touching a specific control on the phone screen, pressing a specific physical key or key combination, inputting voice, making an air gesture, and so on. After receiving the user's instruction to open the camera, the phone starts the camera and displays the shooting interface.
Illustratively, as shown in fig. 3 (a), the user may instruct the cell phone to open the camera application by clicking a "camera" application icon on the main interface, and the cell phone displays a shooting interface as shown in fig. 3 (b).
Further exemplarily, when the mobile phone is in the screen-locked state, the user may also instruct the mobile phone to start the camera application through a gesture of sliding rightward on the screen of the mobile phone, and the mobile phone may also display the shooting interface as shown in fig. 3 (b). Or, when the mobile phone is in the screen lock state, the user may instruct the mobile phone to start the camera application by clicking the shortcut icon of the "camera" application on the screen lock interface, and the mobile phone may also display the shooting interface shown in fig. 3 (b).
Further exemplarily, when the mobile phone runs other applications, the user may also enable the mobile phone to open the camera application for taking a picture by clicking the corresponding control. For example, when the user is using an instant messaging application, the user may also instruct the mobile phone to start the camera application by selecting a control of the camera function, and the mobile phone displays a shooting interface as shown in fig. 3 (b).
As shown in Fig. 3 (b), the shooting interface generally includes the first view frame 302, a shooting control, and other functional controls ("aperture", "night scene", "portrait", "photograph", "video", "professional", etc.). The first view frame 302 is used to preview the image (or picture) captured by the camera; the first view frame 302 in Fig. 3 (b) previews the person, kite, building, and clouds captured by the camera. The user can decide when to instruct the mobile phone to shoot based on the image (or picture) in the first view frame 302, for example by tapping the shooting control, pressing a volume key, or controlling the phone to shoot by voice instruction.
In some embodiments, a zoom magnification indication 303 may also be included in the shooting interface. In general, the default zoom magnification of the mobile phone is the basic magnification, "1×", which may also be referred to as the first zoom magnification; that is, no zooming is performed. The user can change the zoom magnification by touching a specific control on the phone screen, pressing a specific physical key or key combination, inputting voice, making an air gesture, and so on.
Illustratively, as shown in Fig. 3 (b), the user can adjust the zoom magnification used by the phone by operating the zoom magnification indication 303 in the shooting interface. For example, when the zoom magnification is "1×", the user can change it to another value (e.g., "5×", "10×", "20×", "50×") by tapping the zoom magnification indication 303 one or more times. Here, "5×" means 5 times the magnification used when no zooming is performed, "10×" means 10 times, "20×" means 20 times, and "50×" means 50 times.
Further illustratively, as shown in fig. 3 (c), the user may increase the zoom magnification used by the mobile phone by a gesture of sliding two fingers (or three fingers) outward (in the opposite direction to the pinch) in the photographing interface, or decrease the zoom magnification used by the mobile phone by a gesture of pinching two fingers (or three fingers).
Further exemplarily, as shown in fig. 3 (d), the user may also change the zoom magnification used by the mobile phone by dragging the zoom scale 304 in the shooting interface.
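A hedged sketch of how the pinch gesture in Fig. 3 (c) might map to a zoom magnification: a common approach, assumed here rather than taken from the patent, scales the magnification by the ratio of the current finger spacing to the spacing when the gesture began, clamped to the supported range. The 1×–50× range matches the magnifications mentioned in this description but is otherwise an assumption.

```kotlin
const val MIN_ZOOM = 1.0    // assumed supported range
const val MAX_ZOOM = 50.0

// startZoom: magnification when the pinch began; spans: distance between the two fingers.
fun zoomFromPinch(startZoom: Double, startSpan: Double, currentSpan: Double): Double =
    (startZoom * currentSpan / startSpan).coerceIn(MIN_ZOOM, MAX_ZOOM)
```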
As shown in Fig. 4 (a), when the zoom magnification used by the mobile phone is "1×", that is, when no zooming is performed, the user can see the entire view of the building, the clouds, the person, and the kite in the preview image of the first view frame 302 in the shooting interface.
As shown in Fig. 4 (b), when the user increases the zoom magnification from "1×" to "5×", the preview image of the first view frame 302 is enlarged with its center as the origin: the objects in the first view frame are enlarged accordingly, and the viewing range of the first view frame shrinks accordingly. At "5×", the user can see only part of the enlarged building in the preview image of the first view frame 302; the clouds, person, and kite are larger than in the preview image in Fig. 4 (a), but because of the reduced viewing range the clouds are also closer to the edge of the first view frame.
As shown in Fig. 4 (c), when the zoom magnification is "10×", the objects in the first view frame are enlarged further and the viewing range shrinks further. Because of the reduced viewing range, the user can no longer see the building in the preview image of the first view frame 302, and the number of visible clouds drops from 5 to 2.
As can be seen from Fig. 4 (a), (b), and (c), increasing the zoom magnification used by the phone enlarges the preview image in the first view frame with its center as the origin, but also shrinks the viewing range of the view frame. It can be understood that, in a shooting scene at high magnification, because the magnified picture covers a smaller area (viewing range), when the lens shakes or the shooting object moves, the user easily loses the shooting object, and the shooting experience is poor.
Following Fig. 4 (c), as shown in Fig. 4 (d), in the embodiments of the present application, when the user increases the zoom magnification beyond the preset zoom magnification (for example, "10×"), the preview image of the first view frame 302 is held and no longer enlarged, and the second view frame 305 is displayed within the preview image of the first view frame.
The preset zoom magnification may be the maximum magnification of the phone's optical zoom, also called the maximum optical zoom magnification. A picture shot at a zoom magnification greater than the preset zoom magnification is obtained by processing the picture shot at the preset zoom magnification. For example, if the maximum optical zoom magnification of the phone is "10×", the preset zoom magnification may be "10×"; a picture shot at a magnification greater than "10×" is then obtained by cropping and post-processing (pixel filling, image encoding, and similar operations) the picture shot at "10×".
For another example, if the maximum optical zoom magnification of the phone is "5×", the preset zoom magnification may be "5×"; a picture shot at a magnification greater than "5×" is then obtained by cropping and post-processing the picture shot at "5×".
It should be noted that, if the preset zoom magnification is the maximum optical zoom magnification, the shooting process at a zoom magnification less than or equal to the preset zoom magnification may be called the optical zoom shooting process, and the shooting process at a zoom magnification greater than the preset zoom magnification may be called the digital zoom shooting process.
It can be seen that, in the shooting method provided in the embodiments of the present application, the shot preview image (the first view frame) is not enlarged indefinitely as the user increases the zoom magnification; instead, once the preview has been enlarged to a certain magnification, the further-magnified preview is displayed in the second view frame. Therefore, when the lens shakes or the shooting object moves, the user can easily relocate the shooting object within the first view frame, which improves the user's shooting experience at high magnification.
It should be noted that the size of the second view frame may decrease as the zoom magnification increases. As shown in Fig. 4 (d) and (e), when the user increases the zoom magnification from "20×" to "50×", the size of the second view frame 305 decreases accordingly.
Conversely, the size of the second view frame 305 may increase as the zoom magnification decreases. As shown in Fig. 4 (e) and (f), when the user decreases the zoom magnification from "50×" to "20×", the size of the second view frame 305 increases accordingly.
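The figures suggest a simple geometric reading of this behaviour: in the digital zoom stage the second view frame shows a preset/zoom fraction of the first view frame in each dimension, so it shrinks from "20×" to "50×" and grows back when the magnification falls. The formula below is an inference from the figures, not a relation stated in the patent, and all names are illustrative.

```kotlin
// Width and height of the second view frame as a fraction of the first view frame.
fun secondFrameSize(firstW: Double, firstH: Double, zoom: Double, preset: Double = 10.0): Pair<Double, Double> {
    require(zoom > preset) { "the second view frame only exists above the preset magnification" }
    val fraction = preset / zoom        // e.g. 10/20 = 0.5, 10/50 = 0.2
    return firstW * fraction to firstH * fraction
}
```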
It should be noted that, in the embodiments of the present application, when the zoom magnification used by the phone is greater than the preset zoom magnification (i.e., the phone is in the digital zoom shooting stage), the captured image no longer corresponds to the whole of what is seen: the captured image corresponds to the second view frame 305 rather than the first view frame 302. In other words, when the zoom magnification is greater than the preset zoom magnification, the user should decide when to instruct the phone to shoot based on the image (or picture) in the second view frame 305 rather than the first view frame 302. As shown in Fig. 5 (a), when the zoom magnification ("10×") is not greater than the preset zoom magnification (i.e., the phone is in the optical zoom shooting stage), the captured image (final image) is the same as the picture in the first view frame 302. As shown in Fig. 5 (b) and (c), when the zoom magnification ("20×" or "50×") is greater than the preset zoom magnification, the captured image is no longer the same as the picture in the first view frame 302 but the same as the picture in the second view frame 305.
In some embodiments, the user may change the position of the second view frame within the first view frame to adjust the picture it frames. For example, the user may change the position of the second view frame by touching a specific control on the phone screen, pressing a specific physical key or key combination, inputting voice, or making an air gesture.
For example, as shown in Fig. 6 (a), the user may change the picture framed by the second view frame 305 from the "person" to the "kite" by dragging the second view frame 305 within the first view frame 302.
It can be seen that, in the shooting method provided by the embodiments of the present application, the user can change the picture framed by the second view frame in several ways. Compared with existing shooting methods, which can only use the center of the view frame as the shooting point, this design gives more flexibility in composing the image and further improves the user's shooting experience. For example, when the user fixes the mobile phone with a tripod or the like, the shooting angle is fixed; if the shooting target moves, the user would otherwise have to move the phone to track it, which is time-consuming and laborious. In the embodiments of the present application, as long as the shooting target is within the first view frame, the user can easily track it simply by moving the second view frame.
In some embodiments, when the first view frame 302 and the second view frame 305 coexist in the shooting interface, the target area of the first view frame may be image-processed so that the user can easily distinguish it from the second view frame. The target area of the first view frame is the area of the first view frame that does not overlap the second view frame.
For example, as shown in fig. 6 (b), when the first view frame 302 and the second view frame 305 coexist in the shooting interface, the brightness of the target area of the first view frame 302 may be reduced so that the user can distinguish the target area from the second view frame 305 by brightness.
As another example, as shown in fig. 6 (c), when the first view frame 302 and the second view frame 305 coexist in the shooting interface, the target area of the first view frame may be blurred so that the user can distinguish the target area of the first view frame 302 from the second view frame 305 by sharpness.
As yet another example, when the first view frame 302 and the second view frame 305 coexist in the shooting interface, the target area of the first view frame 302 may be processed at the layer level, for example by overlaying a layer on the target area, so that the user can distinguish the target area of the first view frame 302 from the second view frame 305.
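As a concrete illustration of the first option, the following Kotlin sketch dims the target area on an Android-style Canvas; the use of Android graphics APIs and all function names are assumptions for illustration, not part of the embodiment.

```kotlin
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Rect

// Hypothetical sketch: darken the target area of the first view frame,
// i.e. everything inside it except the second view frame.
fun dimTargetArea(canvas: Canvas, firstFrame: Rect, secondFrame: Rect) {
    canvas.save()
    canvas.clipRect(firstFrame)      // restrict drawing to the first view frame
    canvas.clipOutRect(secondFrame)  // carve out the second view frame (API 26+)
    canvas.drawColor(Color.argb(128, 0, 0, 0))  // 50% black scrim over the target area
    canvas.restore()
}
```

The blur and layer-overlay variants would follow the same clipping pattern, replacing the scrim with a blur effect or an overlaid layer.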
In some embodiments, the mobile phone may track and shoot the shooting object in the second view frame 305. For example, the user may instruct the mobile phone to track the shooting object in the second view frame 305 by touching a specific control on the mobile phone screen, pressing a specific physical key or key combination, inputting a voice command, or making an air gesture. When the shooting object moves, the second view frame 305 moves along with it; that is, the shooting object is always kept within the second view frame 305.
For example, as shown in fig. 7 (a), the "kite" is located in the second view frame 305, and a follow control 306 is displayed beside the second view frame 305. The follow control 306 displaying "off" indicates that tracking shooting is not currently on, and displaying "on" indicates that it is. As shown in fig. 7 (a) and (b), when the follow control 306 displays "off", the user clicks it to instruct the mobile phone to track the "kite" in the second view frame; as the "kite" flies from the right side to the left side, the mobile phone keeps the "kite" in the second view frame 305, so the second view frame 305 also moves from the right side to the left side.
As another example, the second view frame 305 may include a follow control 306 whose label indicates the action it will perform rather than the current state: when tracking shooting is on, the follow control 306 displays "off", and the user can click it to instruct the mobile phone to stop tracking the object in the second view frame; when tracking shooting is not on, the follow control 306 displays "on", and the user can click it to instruct the mobile phone to start tracking the object in the second view frame.
As yet another example, the mobile phone may automatically turn on tracking shooting when a preset condition is met. For example, when an object is present in the second view frame 305 and the second view frame 305 has not moved for a certain period of time, the mobile phone may start tracking shooting and follow the object in the second view frame 305.
The specific method for tracking the shooting object in the second view frame may be any method conceivable by those skilled in the art, and is not particularly limited in the embodiment of the present application.
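For instance, once any detector or tracker has produced a bounding box for the subject, re-centering the second view frame can be as simple as the following Kotlin sketch; the clamping logic and all names are assumptions for illustration only.

```kotlin
import android.graphics.Rect

// Hypothetical sketch: re-center the second view frame on the tracked
// subject's bounding box, clamped so the frame never leaves the first
// view frame. Assumes the second frame is smaller than the first.
fun followSubject(second: Rect, subject: Rect, first: Rect): Rect {
    val w = second.width()
    val h = second.height()
    val left = (subject.centerX() - w / 2).coerceIn(first.left, first.right - w)
    val top = (subject.centerY() - h / 2).coerceIn(first.top, first.bottom - h)
    return Rect(left, top, left + w, top + h)
}
```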
In some embodiments, the mobile phone may stop tracking the shooting object in the second view frame. For example, the user may instruct the mobile phone to stop tracking the shooting object in the second view frame by touching a specific control on the mobile phone screen, pressing a specific physical key or key combination, inputting a voice command, or making an air gesture. For example, as shown in fig. 7 (b), while the follow control 306 beside the second view frame 305 displays "on", the user may click it to instruct the mobile phone to stop tracking the "kite" in the second view frame; after the user clicks the follow control 306 and tracking stops, the follow control 306 may display "off" again, as shown in fig. 7 (a).
The digital zoom photographing process of the shooting method provided by the embodiment of the present application, that is, the photographing process in which the first view frame 302 and the second view frame 305 coexist in the shooting interface, is described below with reference to fig. 8. The process includes:
S801, the mobile phone records the four-corner coordinates of the second view frame 305.
The four-corner coordinates of the second view frame 305 are its coordinates in the target coordinate system. Optionally, the target coordinate system may take any point of the first view frame 302 (for example, a vertex or the center point) as the coordinate origin.
For example, as shown in fig. 9 (a), the target coordinate system takes the lower-left vertex of the first view frame 302 as the coordinate origin. The four-corner coordinates of the second view frame 305 are then (X1, Y1), (X2, Y2), (X3, Y3), and (X4, Y4), respectively.
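A Kotlin sketch of S801 under the fig. 9 (a) convention follows; the y-axis pointing up (unlike Android's screen coordinates) and the assignment of (X1, Y1) through (X4, Y4) to particular corners are assumptions, since the embodiment does not fix them.

```kotlin
data class Point(val x: Float, val y: Float)

// Hypothetical sketch: compute the four-corner coordinates of the second
// view frame from its center and size, with the origin at the lower-left
// vertex of the first view frame and the y-axis pointing up.
fun fourCorners(centerX: Float, centerY: Float, width: Float, height: Float): List<Point> {
    val hw = width / 2
    val hh = height / 2
    return listOf(
        Point(centerX - hw, centerY - hh),  // (X1, Y1): lower left (assumed)
        Point(centerX + hw, centerY - hh),  // (X2, Y2): lower right (assumed)
        Point(centerX + hw, centerY + hh),  // (X3, Y3): upper right (assumed)
        Point(centerX - hw, centerY + hh)   // (X4, Y4): upper left (assumed)
    )
}
```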
S802, the mobile phone responds to an operation.
If the operation is a zoom operation, the mobile phone executes S803.
If the operation is an operation of dragging the second view frame, the mobile phone executes S804.
If the operation is a tracking shooting operation, the mobile phone executes S805.
If the operation is a photographing operation, the mobile phone executes S806.
S803, if the zoom operation increases the focal length, the mobile phone shrinks the second view frame about its center and then executes S801; if the zoom operation decreases the focal length, the mobile phone enlarges the second view frame about its center and then executes S801. It should be noted that, if the zoom operation decreases the focal length, then when the zoom magnification drops to the preset zoom magnification, the second view frame 305 becomes the same size as the first view frame 302 and no longer needs to be shown; that is, when the zoom magnification drops to the preset zoom magnification, the mobile phone stops displaying the second view frame 305 in the first view frame 302.
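A Kotlin sketch of S803 under the same assumptions as above: the frame is resized about its own center, and a null result stands for the "stop displaying" case. The names and the RectF representation are hypothetical.

```kotlin
import android.graphics.RectF

// Hypothetical sketch of S803: resize the second view frame about its own
// center when the zoom magnification changes; null means the frame would
// fill the first view frame and should no longer be displayed.
fun resizeAboutCenter(second: RectF, first: RectF, presetZoom: Float, currentZoom: Float): RectF? {
    if (currentZoom <= presetZoom) return null  // back at or below the preset zoom: hide
    val scale = presetZoom / currentZoom
    val w = first.width() * scale
    val h = first.height() * scale
    val cx = second.centerX()
    val cy = second.centerY()
    return RectF(cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)
}
```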
S804, the mobile phone moves the second view frame according to the drag operation and then executes S801.
S805, the mobile phone moves the second view frame to follow the object within it and then executes S801.
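Taken together, S802 through S805 form a small dispatch loop; the Kotlin sketch below makes that structure explicit, with hypothetical handler stubs standing in for the steps (none of these names come from the embodiment).

```kotlin
// Hypothetical handler stubs for S801 and S803-S806:
fun recordCorners() { /* S801: re-record the four-corner coordinates */ }
fun resizeSecondFrame(newMagnification: Float) { /* S803: resize about the center */ }
fun moveSecondFrame(dx: Float, dy: Float) { /* S804: translate within the first frame */ }
fun followSubjectInSecondFrame() { /* S805: move with the tracked subject */ }
fun cropAndPostProcess() { /* S806: crop to the four corners, post-process, encode */ }

sealed interface Operation {
    data class Zoom(val newMagnification: Float) : Operation
    data class Drag(val dx: Float, val dy: Float) : Operation
    object StartTracking : Operation
    object Shoot : Operation
}

// Hypothetical sketch of S802: dispatch an operation, then re-record the
// corners (S801) after any step that changes the second view frame.
fun dispatch(op: Operation) {
    when (op) {
        is Operation.Zoom -> { resizeSecondFrame(op.newMagnification); recordCorners() } // S803
        is Operation.Drag -> { moveSecondFrame(op.dx, op.dy); recordCorners() }          // S804
        Operation.StartTracking -> { followSubjectInSecondFrame(); recordCorners() }     // S805
        Operation.Shoot -> cropAndPostProcess()                                          // S806
    }
}
```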
S806, the mobile phone performs image cropping and post-processing according to the current four-corner coordinates of the second view frame to obtain the final image.
The specific post-processing method for obtaining the final image may be any method conceivable by those skilled in the art, and is not particularly limited in the embodiment of the present application. For example, the mobile phone may perform completion optimization on the image obtained by cropping and then encode the optimized image to obtain the final image.
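As a minimal illustration, the Kotlin sketch below crops a frame captured at the preset zoom magnification to the second view frame's rectangle and rescales it; the Android Bitmap APIs, the conversion of the four-corner coordinates into a top-left-origin Rect, and the plain bilinear rescale (standing in for real completion optimization) are all assumptions.

```kotlin
import android.graphics.Bitmap
import android.graphics.Rect

// Hypothetical sketch of S806: crop the source frame to the second view
// frame's rectangle (already converted to Android's top-left pixel
// convention), then upscale to the output resolution before encoding.
fun cropToSecondFrame(source: Bitmap, frame: Rect, outWidth: Int, outHeight: Int): Bitmap {
    val cropped = Bitmap.createBitmap(source, frame.left, frame.top, frame.width(), frame.height())
    return Bitmap.createScaledBitmap(cropped, outWidth, outHeight, true)  // bilinear filtering
}
```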
For example, as shown in fig. 9 (a) and (b), in response to the user's photographing operation, the mobile phone performs image cropping and post-processing according to the four-corner coordinates of the second view frame shown in fig. 9 (a) to obtain the first image 307 shown in fig. 9 (b).
The shooting method provided by the embodiment of the present application is described below with reference to fig. 10. As shown in fig. 10, the method includes:
S1001, the electronic device receives a first operation, and in response to the first operation, starts the camera and displays a shooting interface.
The zoom magnification of the camera is a first zoom magnification, the shooting interface includes a first view frame, the first view frame includes a first preview image, and the first preview image is the view-finding picture at the first zoom magnification. That the first preview image is the view-finding picture at the first zoom magnification means: when a photograph is taken at the first zoom magnification, the image captured by the camera is the same as the view-finding picture, that is, the captured image is the same as the first preview image.
The first zoom magnification is usually the magnification at which the camera is not zoomed, and may be denoted "1×".
For example, as shown in fig. 3 (a), the electronic device receives and responds to the user's operation of clicking the camera application icon on the main interface of the electronic device, opens the camera application, and displays the shooting interface shown in fig. 3 (b). As shown in fig. 3 (b), the shooting interface includes the first view frame 302, the first view frame 302 includes a first preview image, and the first preview image is the view-finding picture at a camera zoom magnification of "1×".
S1002, the electronic device receives a second operation, and in response to the second operation, adjusts the camera to a second zoom magnification and enlarges the first preview image displayed in the first view frame into a second preview image.
The second zoom magnification is greater than the first zoom magnification and less than or equal to the preset zoom magnification; the second preview image is the view-finding picture at the second zoom magnification and may be a part of the first preview image. That the second preview image is the view-finding picture at the second zoom magnification means: when a photograph is taken at the second zoom magnification, the image captured by the camera is the same as the view-finding picture, that is, the captured image is the same as the second preview image.
Optionally, the preset zoom magnification may be the maximum optical zoom magnification of the camera. For example, if the maximum optical zoom magnification of the camera is 10×, the preset zoom magnification is 10×.
For example, as shown in fig. 4 (a) and (b), the electronic device receives and responds to a zoom operation in which the user adjusts the zoom magnification from "1×" to "5×" in the shooting interface, adjusts the camera zoom magnification from "1×" to "5×", and enlarges the first preview image displayed in the first view frame 302 into the second preview image shown in fig. 4 (b); it can be seen from fig. 4 (a) and (b) that the second preview image is a part of the first preview image.
S1003, the electronic device receives a third operation, and in response to the third operation, adjusts the camera to a third zoom magnification, displays a third preview image in the first view frame, and displays a second view frame within the first view frame.
The third zoom magnification is greater than the preset zoom magnification, the second view frame covers a partial area of the first view frame, and a fourth preview image is displayed in the second view frame. The third preview image is the view-finding picture at the preset zoom magnification, and the fourth preview image is the view-finding picture at the third zoom magnification. That the third preview image is the view-finding picture at the preset zoom magnification means: when a photograph is taken at the preset zoom magnification, the image captured by the camera is the same as the third preview image. That the fourth preview image is the view-finding picture at the third zoom magnification means: when a photograph is taken at the third zoom magnification, the image captured by the camera is the same as the fourth preview image.
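The relationship between the zoom magnification and what each view frame shows can be summarized in a few lines of Kotlin; this is a sketch under the assumption that the preset zoom magnification equals the maximum optical zoom, and the names are hypothetical.

```kotlin
// Hypothetical sketch: which preview each view frame shows at zoom z.
// A null secondFrameZoom means the second view frame is not displayed.
data class PreviewState(val firstFrameZoom: Float, val secondFrameZoom: Float?)

fun previewStateFor(z: Float, preset: Float): PreviewState =
    if (z <= preset) PreviewState(firstFrameZoom = z, secondFrameZoom = null)  // S1001/S1002
    else PreviewState(firstFrameZoom = preset, secondFrameZoom = z)            // S1003
```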
For example, as shown in fig. 4 (b), (c), and (d), the electronic device receives and responds to a zoom operation in which the user adjusts the zoom magnification from "5×" to "20×" in the shooting interface; as shown in fig. 4 (d), it displays in the first view frame 302 the third preview image at a camera zoom magnification of "10×", displays the second view frame 305 within the first view frame 302, and displays in the second view frame 305 the view-finding picture at a camera zoom magnification of "20×".
It can be understood that, if the second zoom magnification is smaller than the preset zoom magnification, the third preview image is an enlarged part of the second preview image. For example, as shown in fig. 4 (b) and (d), the second zoom magnification "5×" corresponding to fig. 4 (b) is smaller than the preset zoom magnification "10×", and the third preview image shown in fig. 4 (d) is an enlarged part of the second preview image shown in fig. 4 (b).
If the second zoom magnification is equal to the preset zoom magnification, the third preview image is the same as the second preview image. For example, as shown in fig. 4 (c) and (d), the second zoom magnification "10×" corresponding to fig. 4 (c) is equal to the preset zoom magnification "10×", and the third preview image shown in fig. 4 (d) is the same as the second preview image shown in fig. 4 (c).
In one possible implementation, after the second view frame is displayed within the first view frame, the method may further include: reducing the brightness of the target area of the first view frame, as shown in fig. 6 (b). The target area of the first view frame is the area of the first view frame that does not overlap the second view frame.
In another possible implementation, after the second view frame is displayed within the first view frame, the method may further include: blurring the target area of the first view frame, as shown in fig. 6 (c).
In yet another possible implementation, after the second view frame is displayed within the first view frame, the method may further include: overlaying a layer on the target area of the first view frame.
As shown in fig. 10, optionally, after S1003, the embodiment of the present application may further include one or more of the following steps: S1004, S1005, S1006, S1007, or S1008. These steps may be independent of one another, or may be performed in a certain order depending on the actual situation. For example, after S1004, any of S1005, S1006, or S1007 may be performed; after S1005, any of S1004, S1006, or S1007 may be performed; after S1006, any of S1004, S1005, S1007, or S1008 may be performed; and after S1007, any of S1004, S1005, S1006, or S1008 may be performed.
S1004, the electronic device receives a fifth operation of increasing the zoom magnification of the camera, and in response to the fifth operation, increases the zoom magnification of the camera, shrinks the second view frame, and continues to display the third preview image in the first view frame.
For example, as shown in fig. 4 (d) and (e), the electronic device receives and responds to a zoom operation in which the user adjusts the zoom magnification from "20×" to "50×" in the shooting interface, shrinks the second view frame 305 as shown in fig. 4 (e), and continues to display in the first view frame the third preview image at a camera zoom magnification of "10×".
S1005, the electronic device receives a sixth operation of decreasing the zoom magnification of the camera, and in response to the sixth operation, decreases the zoom magnification of the camera, enlarges the second view frame, and continues to display the third preview image in the first view frame.
For example, as shown in fig. 4 (e) and (f), the electronic device receives and responds to a zoom operation in which the user adjusts the zoom magnification from "50×" to "20×" in the shooting interface, enlarges the second view frame 305 as shown in fig. 4 (f), and continues to display in the first view frame the third preview image at a camera zoom magnification of "10×".
S1006, the electronic device receives a seventh operation of dragging the second view frame, and in response to the seventh operation, moves the second view frame within the first view frame according to the seventh operation.
For example, as shown in fig. 6 (a), the electronic device receives and responds to the user's operation of dragging the second view frame toward the upper right, and moves the second view frame 305 accordingly.
S1007, the electronic device receives an eighth operation of starting object tracking, and in response to the eighth operation, moves the second view frame to follow the object within it.
For example, as shown in fig. 7 (a) and (b), the second view frame 305 includes a follow control 306; the follow control 306 displaying "off" indicates that tracking shooting is not currently on, and displaying "on" indicates that it is. The electronic device receives and responds to the user's operation of clicking the follow control 306 while it displays "off": the follow control 306 switches to display "on", tracking shooting starts, and the "kite" in the second view frame is tracked. As shown in fig. 7 (b), as the "kite" flies from the right side to the left side of the shooting interface, the electronic device keeps the "kite" in the second view frame 305, so the second view frame 305 also moves from the right side to the left side of the shooting interface.
S1008, the electronic device receives a fourth operation, and in response to the fourth operation, takes a photograph and obtains an image with the same content as the fourth preview image.
The fourth operation is a photographing operation.
For example, as shown in fig. 5 (b), the electronic device receives and responds to a shooting operation input on the shooting interface on the left of fig. 5 (b), and takes a photograph to obtain the image shown on the right of fig. 5 (b); it can be seen that this image is the same as the fourth preview image displayed in the second view frame 305 on the left of fig. 5 (b).
As another example, as shown in fig. 5 (c), the electronic device receives and responds to a shooting operation input on the shooting interface on the left of fig. 5 (c), and takes a photograph to obtain the image shown on the right of fig. 5 (c); it can be seen that this image is the same as the fourth preview image displayed in the second view frame 305 on the left of fig. 5 (c).
An electronic apparatus for performing the above-described photographing method is described below.
It can be appreciated that, to implement the above functions, the electronic device includes corresponding hardware and/or software modules for performing each function. In combination with the example algorithm steps described in the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application in combination with the embodiments, but such implementations should not be considered beyond the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional modules according to the method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
When each functional module is divided corresponding to each function, the electronic device may include a transceiver unit and a processing unit, which may implement the methods performed by the electronic device in the above method embodiments and/or other processes of the techniques described herein.
It should be noted that, for all relevant details of the steps in the above method embodiments, reference may be made to the functional description of the corresponding functional module; details are not repeated here.
In the case of an integrated unit, the electronic device may include a processing unit, a storage unit, and a communication unit. The processing unit may be configured to control and manage the actions of the electronic device, for example, to support the electronic device in performing the steps performed by the above units. The storage unit may be configured to support the electronic device in storing program code, data, and the like. The communication unit may be configured to support communication between the electronic device and other devices.
The processing unit may be a processor or a controller, and may implement or execute the various example logical blocks, modules, and circuits described in connection with the disclosure of the present application. A processor may also be a combination of devices implementing computing functions, for example, a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage unit may be a memory. The communication unit may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In one possible implementation, an electronic device according to an embodiment of the present application includes a processor and a transceiver. The relevant functions implemented by the transceiving unit and the processing unit can be implemented by a processor.
Optionally, the electronic device may further comprise a memory, the processor and the memory being in communication with each other through an internal connection path. The related functions implemented by the storage unit in the above can be implemented by a memory.
The embodiment of the present application further provides a computer storage medium, in which computer instructions are stored; when the computer instructions run on an electronic device, the electronic device is caused to execute the above relevant method steps to implement the shooting method in the above embodiments.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the shooting method in the above embodiment.
The embodiment of the present application further provides an electronic device, which may specifically be a chip, an integrated circuit, a component, or a module. Specifically, the device may include a processor and a memory coupled to the processor for storing instructions, or the device may include at least one processor configured to obtain instructions from an external memory. When the device runs, the processor may execute the instructions so that the chip performs the shooting method in the above method embodiments.
Illustratively, the electronic device may be a chip that includes one or more processors and interface circuitry. Optionally, the chip may further include a bus.
The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above-described photographing method may be implemented by an integrated logic circuit of hardware in a processor or instructions in the form of software.
Alternatively, the processor may be a general-purpose processor, a digital signal processing (DSP) device, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, which may implement or perform the methods and steps disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The interface circuit may be used to send or receive data, instructions, or information; the processor may process the data, instructions, or other information received through the interface circuit and send the processing-completion information out through the interface circuit.
Optionally, the chip further includes a memory, which may include a read-only memory and a random access memory and provides operating instructions and data to the processor. A portion of the memory may also include non-volatile random access memory (NVRAM).
Optionally, the memory stores executable software modules or data structures, and the processor may perform corresponding operations by calling the operation instructions stored in the memory (the operation instructions may be stored in an operating system).
Alternatively, the chip may be used in the electronic device according to the embodiments of the present application. Optionally, the interface circuit may be configured to output the execution result of the processor. For the shooting method provided in one or more embodiments of the present application, reference may be made to the foregoing embodiments; details are not described here again.
It should be noted that the functions corresponding to the processor and the interface circuit may be implemented by hardware design, software design, or a combination of hardware and software, which is not limited herein.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, and therefore, the beneficial effects that can be achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the above-described division of units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The above-described functions, if implemented in the form of software functional units and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the above-described method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A shooting method, applied to an electronic device comprising a camera, the method comprising:
the electronic device receives a first operation;
in response to the first operation, the electronic device starts the camera and displays a shooting interface, wherein the zoom magnification of the camera is a first zoom magnification, the shooting interface comprises a first view frame, the first view frame comprises a first preview image, and the first preview image is a view-finding picture at the first zoom magnification;
the electronic device receives a second operation;
in response to the second operation, the electronic device adjusts the camera to a second zoom magnification, and the first preview image displayed in the first view frame is enlarged into a second preview image, wherein the second zoom magnification is greater than the first zoom magnification, the second zoom magnification is less than or equal to a preset zoom magnification, and the second preview image is a view-finding picture at the second zoom magnification;
the electronic device receives a third operation;
in response to the third operation, the electronic device adjusts the camera to a third zoom magnification, displays a third preview image in the first view frame, and displays a second view frame within the first view frame, wherein the third zoom magnification is greater than the preset zoom magnification, the second view frame covers a partial area of the first view frame, and a fourth preview image is displayed in the second view frame; the third preview image is a view-finding picture at the preset zoom magnification, and the fourth preview image is a view-finding picture at the third zoom magnification.
2. The method of claim 1, wherein after the second view frame is displayed within the first view frame, the method further comprises: the electronic device receives a fourth operation, wherein the fourth operation is a photographing operation;
and in response to the fourth operation, the electronic device takes a photograph and obtains an image with the same content as the fourth preview image.
3. The method of claim 1, wherein after the second view frame is displayed within the first view frame, the method further comprises:
receiving a fifth operation of increasing the zoom magnification of the camera;
and in response to the fifth operation, increasing the zoom magnification of the camera, shrinking the second view frame, and continuing to display the third preview image in the first view frame.
4. The method of claim 2 or 3, wherein after the second view frame is displayed within the first view frame, the method further comprises:
receiving a sixth operation of decreasing the zoom magnification of the camera;
and in response to the sixth operation, decreasing the zoom magnification of the camera, enlarging the second view frame, and continuing to display the third preview image in the first view frame.
5. The method of any one of claims 2 to 4, wherein after the second view frame is displayed within the first view frame, the method further comprises:
receiving a seventh operation of dragging the second view frame;
in response to the seventh operation, moving the second view frame within the first view frame according to the seventh operation.
6. The method of any one of claims 2 to 5, wherein after the second view frame is displayed within the first view frame, the method further comprises:
receiving an eighth operation of starting object tracking;
in response to the eighth operation, moving the second view frame following the object within the second view frame.
7. The method of any one of claims 1 to 6, wherein after the second view frame is displayed within the first view frame, the method further comprises:
reducing the brightness of a target area of the first view frame; or
blurring the target area of the first view frame; or
overlaying a layer on the target area of the first view frame;
wherein the target area of the first view frame is an area of the first view frame that does not overlap the second view frame.
8. The method of any one of claims 1 to 7, wherein the preset zoom magnification is a maximum optical zoom magnification of the camera.
9. The method of any of claims 1-8, wherein the second preview image is a portion of the first preview image.
10. The method according to any one of claims 1 to 9, characterized in that:
if the second zoom magnification is smaller than the preset zoom magnification, the third preview image is a part of the second preview image after being magnified and displayed;
and if the second zoom magnification is equal to the preset zoom magnification, the third preview image is the same as the second preview image.
11. An electronic device comprising at least one processor and memory, the at least one processor executing program instructions stored in the memory to cause the electronic device to implement the method of any of claims 1-10.
12. A computer-readable storage medium for storing a computer program, characterized in that the computer program comprises instructions for implementing the method of any of the preceding claims 1 to 10.
13. A computer program product comprising instructions which, when run on a computer or processor, cause the computer to carry out the method of any one of claims 1 to 10.