WO2023035868A1 - Photographing method and electronic device - Google Patents

Photographing method and electronic device

Info

Publication number
WO2023035868A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
viewfinder
preview image
zoom ratio
electronic device
Prior art date
Application number
PCT/CN2022/112456
Other languages
English (en)
Chinese (zh)
Inventor
林嵩晧
林于超
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2023035868A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals

Definitions

  • the present application relates to the field of electronic technology, in particular to a photographing method and electronic equipment.
  • the shooting function has been popularized on electronic devices.
  • when the user uses an electronic device to photograph a distant object, the user needs to zoom in to enlarge the captured picture so that the photographed object appears clearer.
  • the present application provides a shooting method and electronic equipment, which can improve the user's shooting experience when zooming in at a high magnification.
  • the application adopts the following technical solutions:
  • in a first aspect, the present application provides a shooting method applied to an electronic device including a camera. The method includes: the electronic device receives a first operation and, in response, starts the camera at a first zoom ratio and displays a shooting interface that includes a first viewfinder frame, where the first viewfinder frame contains a first preview image;
  • the electronic device receives a second operation and, in response, adjusts the camera to a second zoom ratio, and the first preview image displayed in the first viewfinder frame is enlarged and displayed as a second preview image;
  • the electronic device receives a third operation and, in response, adjusts the camera to a third zoom ratio, displays a third preview image in the first viewfinder frame, and displays a second viewfinder frame within the first viewfinder frame.
  • the first preview image is the viewfinder picture at the first zoom ratio
  • the second preview image is the viewfinder picture at the second zoom ratio
  • the third preview image is the viewfinder picture at the preset zoom ratio
  • the fourth preview image is the viewfinder picture at the third zoom ratio
  • the second zoom ratio is greater than the first zoom ratio and less than or equal to the preset zoom ratio
  • the third zoom ratio is larger than the preset zoom ratio
  • the second viewfinder frame covers a part of the first viewfinder frame, and the fourth preview image is displayed in the second viewfinder frame.
  • in this way, the picture in the first viewfinder frame does not keep enlarging as the user increases the zoom ratio of the electronic device; once the zoom ratio exceeds the preset ratio, the zoomed-in preview is displayed in the second viewfinder frame instead. When the lens shakes or the subject moves, the user can easily relocate the subject in the first viewfinder frame, which improves the user's shooting experience when zooming at high magnification.
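The zoom behavior described above can be sketched as follows. This is an illustrative reading of the patent text, not an implementation taken from it; `PRESET_RATIO` (here the assumed 10x maximum optical zoom) and the returned dictionary layout are choices made for this sketch.

```python
# Illustrative sketch of the two-frame preview logic described above.
# PRESET_RATIO is assumed to be the camera's maximum optical zoom ratio.
PRESET_RATIO = 10.0

def preview_layout(zoom_ratio: float) -> dict:
    """Return which zoom ratio each viewfinder frame displays at `zoom_ratio`.

    At or below the preset ratio, only the first viewfinder frame is shown
    and it tracks the zoom. Above the preset ratio, the first frame stays
    frozen at the preset ratio while a second, smaller frame shows the
    further-zoomed picture.
    """
    if zoom_ratio <= PRESET_RATIO:
        return {
            "first_frame": zoom_ratio,   # first frame follows the zoom
            "second_frame": None,        # second frame not displayed
        }
    return {
        "first_frame": PRESET_RATIO,     # first frame frozen at preset ratio
        "second_frame": zoom_ratio,      # second frame shows high-zoom view
    }
```

Keeping the first frame frozen at the preset ratio is what lets the user relocate a lost subject: the wider view never disappears, however far the zoom goes.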
  • the preset zoom ratio may be the maximum optical zoom ratio of the camera. For example, if the maximum optical zoom ratio of the camera is 10 times, the preset zoom ratio is 10 times.
  • the second preview image may be a part of the first preview image.
  • if the second zoom ratio is smaller than the preset zoom ratio, the third preview image is a part of the second preview image displayed enlarged; if the second zoom ratio is equal to the preset zoom ratio, the third preview image is the same as the second preview image.
  • in a possible implementation, the method may further include: the electronic device receives a fourth operation, the fourth operation being a shooting operation; in response to the fourth operation, the electronic device takes a picture, obtaining an image with the same content as the fourth preview image.
  • that is, when both the first viewfinder frame and the second viewfinder frame are present on the shooting interface, the final image obtained by shooting has the same content as the preview image in the second viewfinder frame.
  • in a possible implementation, the method may further include: receiving a fifth operation, input by the user, of increasing the zoom ratio of the camera; in response to the fifth operation, increasing the zoom ratio of the camera, reducing the second viewfinder frame, and continuing to display the third preview image in the first viewfinder frame.
  • in a possible implementation, the method further includes: receiving a sixth operation, input by the user, of reducing the zoom ratio of the camera; in response to the sixth operation, reducing the zoom ratio of the camera, enlarging the second viewfinder frame, and continuing to display the third preview image in the first viewfinder frame.
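One natural way to realize the resize behavior above (second frame shrinks as zoom increases, grows as zoom decreases) is to make the frame cover exactly the fraction of the first frame's field of view that the higher zoom captures. This is a hypothetical model, not a formula stated in the patent; `PRESET_RATIO` is again the assumed maximum optical zoom.

```python
# Hypothetical sketch: size of the second viewfinder frame relative to the
# first frame, as a function of the zoom ratio above the preset ratio.
PRESET_RATIO = 10.0  # assumed maximum optical zoom ratio

def second_frame_scale(zoom_ratio: float) -> float:
    """Fraction of the first frame's width/height covered by the second frame."""
    if zoom_ratio <= PRESET_RATIO:
        raise ValueError("second viewfinder frame only exists above the preset ratio")
    # At the preset ratio the captured field of view equals the whole first
    # frame; doubling the zoom halves the covered field of view per axis,
    # so increasing zoom shrinks the frame and decreasing zoom enlarges it.
    return PRESET_RATIO / zoom_ratio
```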
  • in a possible implementation, the method further includes: receiving a seventh operation, input by the user, of dragging the second viewfinder frame; in response to the seventh operation, moving the second viewfinder frame within the first viewfinder frame according to the seventh operation.
  • in this way, the user can change the picture captured in the second viewfinder frame simply by moving the second viewfinder frame, which gives greater freedom over the final image and further improves the user's shooting experience.
  • in the conventional approach, once the shooting angle of the mobile phone is fixed, if the shooting target moves, the user needs to move the phone to track it, which is time-consuming and laborious.
  • in the embodiment of the present application, as long as the shooting target is located in the first viewfinder frame, the user can easily track the shooting target simply by moving the second viewfinder frame.
  • in a possible implementation, the method further includes: receiving an eighth operation, input by the user, of starting object tracking; in response to the eighth operation, moving the second viewfinder frame following the object in the second viewfinder frame.
  • in this way, the user can instruct the electronic device to track the subject: when the subject moves, the second viewfinder frame moves with it, so the subject always remains in the second viewfinder frame. This prevents the subject from being lost due to movement and further improves the user's shooting experience.
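The frame-following step of the object tracking above can be sketched as: each frame, re-center the second viewfinder frame on the tracked object, clamped so it never leaves the first frame. The object detector itself is out of scope and simply assumed to supply the object's center coordinates; all parameter names here are illustrative.

```python
# Illustrative sketch of the tracking move: center the second viewfinder
# frame on the object's position (obj_cx, obj_cy), clamped to stay fully
# inside the first viewfinder frame of size first_w x first_h.
def follow_object(obj_cx, obj_cy, frame_w, frame_h, first_w, first_h):
    """Return the top-left corner of the second viewfinder frame."""
    x = obj_cx - frame_w / 2
    y = obj_cy - frame_h / 2
    x = max(0, min(x, first_w - frame_w))  # clamp horizontally
    y = max(0, min(y, first_h - frame_h))  # clamp vertically
    return x, y
```

The clamping is what keeps the patent's invariant that the second frame always covers a part of the first frame, even when the object nears the edge.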
  • in a possible implementation, the method further includes: reducing the brightness of the target area of the first viewfinder frame, where the target area of the first viewfinder frame is the area of the first viewfinder frame that does not overlap with the second viewfinder frame.
  • the method further includes: performing blurring processing on a target area of the first viewing frame.
  • the method further includes: superimposing a layer on a target area of the first view frame.
  • when the first viewfinder frame and the second viewfinder frame are both present on the shooting interface, performing image processing on the target area of the first viewfinder frame makes it easy for the user to distinguish the target area of the first viewfinder frame from the second viewfinder frame, further improving the user's shooting experience.
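A minimal NumPy sketch of the first of the three options above (brightness reduction of the non-overlapping target area); the 0.4 dimming factor is an arbitrary choice for illustration, not a value from the patent, and the rectangle `(x, y, w, h)` stands in for the second viewfinder frame's position.

```python
import numpy as np

def dim_target_area(image: np.ndarray, x: int, y: int, w: int, h: int,
                    factor: float = 0.4) -> np.ndarray:
    """Dim every pixel of `image` outside the rectangle (x, y, w, h).

    The rectangle is the assumed position of the second viewfinder frame;
    everything else is the "target area" whose brightness is reduced.
    """
    out = image.astype(np.float32) * factor      # dim the whole frame
    out[y:y + h, x:x + w] = image[y:y + h, x:x + w]  # restore second frame
    return out.astype(image.dtype)
```

The blurring and layer-overlay options work the same way structurally: apply the effect everywhere, then restore (or leave untouched) the second frame's rectangle.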
  • in a second aspect, the present application provides an electronic device, including: at least one processor, a memory, and at least one camera, where the memory and the camera are coupled to the processor; the memory is used to store computer program code, and the computer program code includes computer instructions.
  • when the processor reads the computer instructions from the memory, the electronic device performs the following operations: receiving a first operation; in response to the first operation, starting the camera and displaying a shooting interface, where the zoom ratio of the camera is the first zoom ratio, the shooting interface includes a first viewfinder frame, and the first viewfinder frame includes a first preview image, the first preview image being the viewfinder picture at the first zoom ratio; receiving a second operation; in response to the second operation, adjusting the camera to a second zoom ratio, so that the first preview image displayed in the first viewfinder frame is enlarged and displayed as a second preview image, where the second zoom ratio is greater than the first zoom ratio and less than or equal to the preset zoom ratio.
  • the preset zoom ratio may be the maximum optical zoom ratio of the camera.
  • the second preview image may be a part of the first preview image.
  • if the second zoom ratio is smaller than the preset zoom ratio, the third preview image is a part of the second preview image displayed enlarged; if the second zoom ratio is equal to the preset zoom ratio, the third preview image is the same as the second preview image.
  • the processor is further configured to cause the electronic device to perform the following operations: receive a fourth operation, where the fourth operation is a photographing operation; in response to the fourth operation, perform shooting to obtain an image with the same content as the fourth preview image.
  • the processor is further configured to cause the electronic device to perform the following operations: receiving a fifth operation, input by the user, of increasing the zoom ratio of the camera; in response to the fifth operation, increasing the zoom ratio of the camera, reducing the second viewfinder frame, and continuing to display the third preview image in the first viewfinder frame.
  • the processor is further configured to cause the electronic device to perform the following operations: receiving a sixth operation, input by the user, of reducing the zoom ratio of the camera; in response to the sixth operation, reducing the zoom ratio of the camera, enlarging the second viewfinder frame, and continuing to display the third preview image in the first viewfinder frame.
  • the processor is further configured to cause the electronic device to perform the following operations: receiving a seventh operation, input by the user, of dragging the second viewfinder frame; in response to the seventh operation, moving the second viewfinder frame within the first viewfinder frame according to the seventh operation.
  • the processor is further configured to cause the electronic device to perform the following operations: receiving an eighth operation, input by the user, of starting object tracking; in response to the eighth operation, moving the second viewfinder frame following the object in the second viewfinder frame.
  • the processor is further configured to cause the electronic device to perform the following operation: reduce the brightness of the target area of the first viewfinder frame, where the target area of the first viewfinder frame is the An area in the first viewing frame that does not overlap with the second viewing frame.
  • the processor is further configured to cause the electronic device to perform the following operation: perform blurring processing on the target area of the first viewfinder frame.
  • the processor is further configured to cause the electronic device to perform the following operation: superimpose a layer on the target area of the first viewfinder frame.
  • in a third aspect, an embodiment of the present application further provides an electronic device, the device including at least one processor; when the at least one processor executes program code or instructions, the method described in the above first aspect or any possible implementation thereof is realized.
  • the electronic device may further include at least one memory, and the at least one memory is used to store the program code or instruction.
  • the embodiment of the present application further provides a chip, including: an input interface, an output interface, and at least one processor.
  • the chip also includes a memory.
  • the at least one processor is used to execute the code in the memory, and when the at least one processor executes the code, the chip implements the method described in the above first aspect or any possible implementation thereof.
  • the aforementioned chip may also be an integrated circuit.
  • the embodiment of the present application further provides a terminal, where the terminal includes the foregoing electronic device or the foregoing chip.
  • in another aspect, the present application further provides a computer-readable storage medium for storing a computer program, where the computer program includes instructions for realizing the method of the above first aspect or any possible implementation thereof.
  • the embodiments of the present application further provide a computer program product including instructions, which, when run on a computer, enable the computer to implement the method described in the above-mentioned first aspect or any possible implementation thereof.
  • the electronic device, computer storage medium, computer program product, and chip provided in these embodiments are all used to execute the shooting method provided above; for the beneficial effects they can achieve, refer to the beneficial effects of the shooting method provided above, which will not be repeated here.
  • FIG. 1 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a software structure of an electronic device provided in an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a user interface of an electronic device provided in an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a user interface of another electronic device provided in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a user interface of another electronic device provided in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a user interface of another electronic device provided in an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a user interface of another electronic device provided in an embodiment of the present application.
  • FIG. 8 is a schematic flowchart of the digital zoom shooting process of the shooting method provided by the embodiment of the present application.
  • FIG. 9 is a schematic diagram of a user interface of another electronic device provided in an embodiment of the present application.
  • FIG. 10 is a schematic flowchart of a photographing method provided by an embodiment of the present application.
  • the terms "first" and "second" in the specification and drawings of this application are used to distinguish different objects, or to distinguish different processes on the same object, rather than to describe a specific order of objects.
  • optical zoom enlarges the image at the physical level, through optical refraction in the camera's own design. Because the enlargement is optical, the image quality is lossless; and because optical zoom is a physical process, this part of the imaging is what-you-see-is-what-you-get.
  • digital zoom uses software to enlarge the already-captured image at the pixel level and interpolate additional pixels. Since the added pixels are computed by algorithms, digital zoom degrades image quality.
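The digital-zoom idea above reduces to crop-then-upscale, which is why it loses quality. Below is a hedged sketch using nearest-neighbor upscaling for clarity; real devices use far better interpolation, but the principle that no new detail is created is the same.

```python
import numpy as np

def digital_zoom(image: np.ndarray, factor: float) -> np.ndarray:
    """Crop the central 1/factor region and upscale it back to full size."""
    h, w = image.shape[:2]
    ch, cw = int(h / factor), int(w / factor)    # size of the central crop
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = image[top:top + ch, left:left + cw]
    # Nearest-neighbor upscale back to (h, w): source pixels are simply
    # duplicated, so the result carries no more detail than the crop --
    # this duplication/interpolation step is the quality loss.
    rows = np.arange(h) * ch // h
    cols = np.arange(w) * cw // w
    return crop[rows][:, cols]
```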
  • the shooting method provided in the embodiments of the present application can be applied to mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), and other electronic devices capable of shooting; the embodiments of the present application impose no restriction on the specific type of electronic device.
  • the electronic device is equipped with a camera, such as a telephoto lens.
  • FIG. 1 is a schematic structural diagram of an electronic device 100 provided in an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus
  • the processor 110 can couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100 .
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger. Wherein, the charger may be a wireless charger or a wired charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active matrix organic light emitting diode or an active matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), flexible light-emitting diode (flex light-emitting diode, FLED), Miniled, MicroLed, Micro-oLed, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the touch sensor, the video codec, the GPU, the display screen 194 and the application processor. For example, the shooting process introduced in the embodiment of this application.
  • the ISP is used for processing the data fed back by the camera 193 .
  • when shooting, light passes through the lens to the camera's photosensitive element, which converts the optical signal into an electrical signal; the photosensitive element then passes the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. It should be understood that the description of the embodiments of this application uses images in RGB format as an example; the embodiments of this application do not limit the image format.
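As an illustration of the format conversion the DSP performs, here is a per-pixel YUV-to-RGB conversion using the standard BT.601 full-range coefficients. These coefficients come from the BT.601 standard, not from the patent; real ISP/DSP pipelines do this on whole image planes in hardware.

```python
def yuv_to_rgb(y: float, u: float, v: float) -> tuple:
    """Convert one full-range BT.601 YUV pixel (U/V centered at 128) to RGB."""
    u -= 128.0  # re-center chroma around zero
    v -= 128.0
    r = y + 1.402 * v
    g = y - 0.344136 * u - 0.714136 * v
    b = y + 1.772 * u
    clamp = lambda c: max(0, min(255, round(c)))  # keep values in 8-bit range
    return clamp(r), clamp(g), clamp(b)
```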
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes).
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect temperature.
  • the touch sensor 180K is also known as a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key, or it may be a touch key.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback. For example, touch operations applied to different applications (such as taking pictures, playing audio, etc.) may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
  • FIG. 2 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are application layer, application framework layer, Android runtime and system library, and kernel layer from top to bottom.
  • the application layer can consist of a series of application packages. As shown in FIG. 2, the application package may include application programs such as camera, photo album, music, and settings.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions. As shown in Figure 2, the application framework layer can include window managers, content providers, view systems, resource managers, notification managers, etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages.
  • the notification information displayed in the status bar can disappear automatically after a short stay, such as a message reminder to inform the user that the download is complete.
  • the notification manager can also present a notification in the system's top status bar in the form of a chart or scrolling text, such as a notification for an application running in the background, or a notification that appears on the screen as a dialog window. For example, text information may be prompted in the status bar, or the notification manager may emit a prompt, such as a vibration of the electronic device or a flashing indicator light.
  • the Android runtime includes core library and virtual machine.
  • the Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (media libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer may include hardware driver modules, such as display drivers, camera drivers, sensor drivers, etc.
  • the application framework layer may call the hardware driver modules of the kernel layer.
  • when the user opens the camera application, the camera application at the application layer in Figure 2 starts and sends instructions to the kernel layer to invoke the camera driver, sensor driver and display driver, so that the electronic device can start the camera or lens to collect images.
  • the light is transmitted to the image sensor through the camera, and the image sensor performs photoelectric conversion on the light signal and converts it into an image visible to the naked eye of the user.
  • the output image data is transmitted to the system library in Figure 2 in the form of data flow, and the 3D graphics processing library and image processing library realize drawing, image rendering, synthesis and layer processing, etc., and generate display layers;
  • the display layer performs fusion processing, etc., and passes it to the content provider, window manager and view system of the application framework layer to control the display of the display interface.
  • the preview image is displayed on the image preview area of the camera application or the display screen of the electronic device.
  • Fig. 3 is a schematic diagram of a graphical user interface (graphical user interface, GUI) provided by the embodiment of the present application.
  • the (a) diagram in Fig. 3 shows that, with the mobile phone unlocked, the screen display system of the mobile phone displays the currently output interface content 301, where the interface content 301 is the main interface of the mobile phone.
  • the interface content 301 displays a variety of application programs (applications, Apps), such as camera, address book, phone, information, clock and other application programs. It is worth mentioning that the interface content 301 may also include other application programs, which is not limited in this embodiment of the present application.
  • the user can instruct the mobile phone to open the camera application by touching a specific control on the mobile phone screen, pressing a specific physical button or key combination, inputting voice, and gestures in the air.
  • the mobile phone starts the camera and displays a shooting interface.
  • the user can instruct the mobile phone to start the camera application by clicking the “camera” application icon on the main interface, and the mobile phone displays the shooting interface as shown in (b) in FIG. 3 .
  • the user when the mobile phone is in the locked screen state, the user can also instruct the mobile phone to open the camera application by sliding right on the mobile phone screen, and the mobile phone can also display the shooting interface as shown in (b) in FIG. 3 .
  • the mobile phone when the mobile phone is in the lock screen state, the user can click the shortcut icon of the "Camera" application on the lock screen interface to instruct the mobile phone to open the camera application, and the mobile phone can also display the shooting interface as shown in Figure 3 (b).
  • the user can also click the corresponding control to make the mobile phone start the camera application to take pictures.
  • the user can also instruct the mobile phone to open the camera application by selecting the control of the camera function, and the mobile phone displays the shooting interface as shown in (b) in Figure 3.
  • the shooting interface of the camera generally includes a first viewfinder frame 302, shooting controls and other functional controls ("aperture”, “night scene”, “portrait”, “photographing”, “video recording”, “professional”, etc.).
  • the first viewing frame 302 can be used to preview the images (or pictures) collected by the camera. For example, the first viewing frame 302 shown in (b) in Figure 3 can be used to preview the people, kites, buildings and white clouds collected by the camera.
  • the user can decide when to instruct the mobile phone to perform a shooting operation.
  • the user instructing the mobile phone to perform the shooting operation may be, for example, an operation in which the user clicks on a shooting control, or an operation in which the user presses a volume key, or controls the mobile phone to perform shooting through a voice command.
  • the shooting interface may also include a zoom factor indicator 303 .
  • the default zoom magnification of the mobile phone is the basic magnification, which may also be referred to as the first zoom magnification and is "1×", that is, no zooming is performed. Users can change the zoom ratio by touching specific controls on the mobile phone screen, pressing specific physical buttons or button combinations, inputting voice, or making gestures in the air.
  • the user can adjust the zoom ratio used by the mobile phone by operating the zoom ratio indicator 303 in the shooting interface.
  • for example, when the zoom magnification currently used by the mobile phone is "1×", the user can click the zoom magnification indicator 303 one or more times to change the zoom magnification used by the mobile phone to another zoom magnification (such as "5×", "10×", "20×", "50×", etc.).
  • "5×" means that the zoom ratio is 5 times, that is, 5 times the ratio when no zooming is performed;
  • "10×" means that the zoom ratio is 10 times the ratio when no zooming is performed;
  • "20×" means that the zoom ratio is 20 times the ratio when no zooming is performed; and
  • "50×" means that the zoom ratio is 50 times the ratio when no zooming is performed.
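The meaning of these magnifications can be illustrated with a short sketch: at a zoom ratio of k, the viewfinder shows the central 1/k × 1/k portion of the "1×" picture. The following Python helper is illustrative only; the function name and frame dimensions are assumptions for the example, not part of the disclosure:

```python
def crop_window(width, height, zoom):
    """Return (left, top, right, bottom) of the region of the "1x"
    frame that a zoom ratio of `zoom` magnifies to fill the viewfinder.

    At "5x" this is the central 1/5 x 1/5 portion of the "1x" picture,
    at "10x" the central 1/10 x 1/10 portion, and so on.
    """
    if zoom < 1:
        raise ValueError("zoom ratio must be at least 1x")
    crop_w = width / zoom
    crop_h = height / zoom
    left = (width - crop_w) / 2
    top = (height - crop_h) / 2
    return (left, top, left + crop_w, top + crop_h)
```

For a 1000 × 800 preview, for instance, "5×" corresponds to the central 200 × 160 region of the "1×" picture.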
  • the user can increase the zoom ratio used by the mobile phone by sliding two (or three) fingers outward in the shooting interface (the opposite of pinching), or reduce the zoom ratio used by the mobile phone with a two-finger (or three-finger) pinch gesture.
  • the user can also change the zoom ratio used by the mobile phone by dragging the zoom scale 304 in the shooting interface.
  • when the user increases the zoom magnification used by the mobile phone so that it is greater than the preset zoom magnification (for example, "10×"), the preview image of the first viewfinder frame 302 in the shooting interface will remain at the preset zoom ratio and will no longer be enlarged, and a second viewfinder frame 305 will be displayed in the preview image of the first viewfinder frame.
  • the preset zoom ratio mentioned above may be the maximum ratio of the optical zoom of the mobile phone, and may also be referred to as the maximum optical zoom ratio.
  • the photos taken when the zoom ratio is greater than the preset zoom ratio are obtained by processing the photos taken at the preset zoom ratio. For example, if the maximum magnification of the optical zoom of the mobile phone is "10×", the above-mentioned preset zoom magnification may be "10×". Then the photos taken by the mobile phone at a magnification greater than "10×" are obtained by cropping and post-processing the photos taken at the "10×" magnification (such as point-filling, image encoding and other processing operations).
  • alternatively, if the maximum magnification of the optical zoom of the mobile phone is "5×", the above-mentioned preset zoom magnification may be "5×"; then the photos taken by the mobile phone at a magnification greater than "5×" are obtained by cropping and post-processing the photos taken at the "5×" magnification.
  • the shooting process in which the zoom ratio used by the mobile phone is less than or equal to the above-mentioned preset zoom ratio can be called an optical zoom shooting process;
  • the shooting process in which the zoom ratio is greater than the above-mentioned preset zoom ratio may be referred to as a digital zoom shooting process.
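The distinction between the two shooting processes can be sketched as follows (an illustrative Python fragment; `PRESET_ZOOM`, the function names, and the "10×" value are assumptions for the example):

```python
PRESET_ZOOM = 10.0  # assumed maximum optical zoom ratio ("10x")

def shooting_mode(zoom):
    """Classify the shooting process for a requested zoom ratio."""
    return "optical" if zoom <= PRESET_ZOOM else "digital"

def preview_zoom(zoom):
    """Zoom ratio actually applied to the first viewfinder frame's
    preview image: it stops enlarging at the preset zoom ratio."""
    return min(zoom, PRESET_ZOOM)
```

With these definitions, requesting "50×" still leaves the first viewfinder frame's preview at the "10×" picture, and only the second viewfinder frame reflects the higher magnification.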
  • in the embodiment of the present application, the shooting preview picture (the first viewfinder frame) will not continue to enlarge as the user increases the zoom magnification used by the electronic device; instead, after zooming in to a certain magnification, the preview image is displayed through the second viewfinder frame. In this way, when the lens shakes or the subject moves, the user can easily find the subject from the first viewfinder frame, thereby improving the user's shooting experience when zooming at a high magnification.
  • the size of the second viewfinder frame may decrease as the zoom ratio increases. As shown in (d) and (e) of FIG. 4, the user increases the zoom ratio of the mobile phone from "20×" to "50×", and the size of the second viewing frame 305 decreases accordingly.
  • the size of the second viewfinder frame 305 may increase as the zoom ratio decreases. As shown in (e) and (f) of FIG. 4, the user reduces the zoom ratio of the mobile phone from "50×" to "20×", and the size of the second viewing frame 305 increases accordingly.
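One sizing rule consistent with the behavior shown in (d)-(f) of FIG. 4 is that the second viewfinder frame scales with the ratio of the preset zoom to the current zoom. The sketch below is an assumption about the scaling, not a rule stated in the disclosure:

```python
def second_frame_size(first_w, first_h, zoom, preset_zoom=10.0):
    """Size of the second viewfinder frame inside the first frame.

    The first frame's preview is held at the preset zoom ratio, so the
    region the requested zoom actually captures shrinks as preset/zoom
    in each linear dimension, and grows back as the zoom decreases.
    """
    scale = preset_zoom / max(zoom, preset_zoom)
    return (first_w * scale, first_h * scale)
```

Under this rule, with a "10×" preset, raising the zoom from "20×" to "50×" shrinks the second frame from one half to one fifth of the first frame's linear size.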
  • when the zoom ratio used by the mobile phone is greater than the preset zoom ratio (that is, the mobile phone is in the digital zoom shooting stage), the image captured by the mobile phone is no longer what you see is what you get: the captured image corresponds not to the first viewing frame 302 but to the second viewing frame 305.
  • therefore, when the zoom magnification used by the mobile phone is greater than the preset zoom magnification, the user no longer decides the timing of the shooting operation based on the first viewfinder frame 302, but based on the image (or picture) of the second viewfinder frame 305.
  • the user can change the position of the second viewfinder frame in the first viewfinder frame, so as to adjust the shooting picture of the second viewfinder frame.
  • the user can change the position of the second viewfinder in the first viewfinder by touching a specific control on the screen of the mobile phone, pressing a specific physical key or combination of keys, inputting voice, and gestures in the air.
  • for example, the user can drag the second viewfinder frame 305 to move it in the first viewfinder frame 302, so as to adjust the shooting picture in the second viewfinder frame 305 from "person" to "kite".
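The drag behavior can be sketched as a clamped translation of the second viewfinder frame within the first. This illustrative Python uses a top-left origin for simplicity (the disclosure's FIG. 9 example uses the lower-left vertex); the function name and coordinates are assumptions:

```python
def move_second_frame(frame, dx, dy, first_w, first_h):
    """Translate the second viewfinder frame by a drag of (dx, dy),
    clamped so it never leaves the first viewfinder frame.

    `frame` is (left, top, right, bottom) in the first frame's
    coordinate system, with the origin at the top-left corner.
    """
    left, top, right, bottom = frame
    w, h = right - left, bottom - top
    new_left = min(max(left + dx, 0), first_w - w)
    new_top = min(max(top + dy, 0), first_h - h)
    return (new_left, new_top, new_left + w, new_top + h)
```

The clamping step is what keeps the adjustable shooting picture always inside the area the camera actually captures.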
  • the user can change the shooting picture of the second viewfinder frame in various ways, which makes framing more flexible and further enhances the user's shooting experience.
  • if the user fixes the mobile phone with a device such as a tripod, the shooting angle of the mobile phone is fixed; if the shooting target then moves, the user needs to move the mobile phone to track the shooting target, which is time-consuming and laborious.
  • in the embodiment of the present application, as long as the shooting target is located in the first viewfinder frame, the user can easily track the shooting target simply by moving the second viewfinder frame.
  • image processing may be performed on the target area of the first viewfinder frame, so that the user can distinguish between the target area of the first viewfinder frame and the second viewfinder frame 305, wherein the target area of the first framing frame is an area in the first framing frame that does not overlap with the second framing frame.
  • the brightness of the target area of the first viewfinder frame 302 can be reduced, so that the user can distinguish the target area of the first viewfinder frame 302 from the second viewfinder frame 305 by brightness.
  • the target area of the first viewfinder frame can be blurred, so that the user distinguishes the target area of the first viewfinder frame 302 from the second viewfinder frame 305 by sharpness.
  • layer processing can be performed on the target area of the first viewfinder frame 302, for example, a layer is overlaid on the target area , which is convenient for the user to distinguish the target area of the first viewfinder frame 302 from the second viewfinder frame 305 .
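The brightness-reduction variant can be sketched on a plain grayscale grid (illustrative only; a real implementation would operate on the display layers described earlier, and the function name and dimming factor are assumptions):

```python
def dim_target_area(pixels, frame, factor=0.5):
    """Dim every pixel of the first viewfinder frame lying outside the
    second viewfinder frame, so the two areas differ in brightness.

    `pixels` is a row-major grid of grayscale values indexed as
    pixels[y][x]; `frame` is (left, top, right, bottom) in pixels.
    """
    left, top, right, bottom = frame
    out = []
    for y, row in enumerate(pixels):
        new_row = []
        for x, value in enumerate(row):
            inside = left <= x < right and top <= y < bottom
            new_row.append(value if inside else int(value * factor))
        out.append(new_row)
    return out
```

The blur and overlay variants differ only in the operation applied to the pixels outside the second frame.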
  • the mobile phone can track and shoot the object in the second viewfinder frame 305 .
  • the user can instruct the mobile phone to track and shoot the subject in the second viewfinder frame 305 by touching a specific control on the screen of the mobile phone, pressing a specific physical key or key combination, inputting a voice, making gestures in the air, etc.; that is, the second viewfinder frame 305 moves along with the subject, so that the subject is always kept in the second viewfinder frame 305.
  • the "kite" is located in the second viewfinder frame 305, and the second viewfinder frame 305 includes a follow control 306; the follow control 306 displaying "off" indicates that tracking shooting is currently not enabled, and the follow control 306 displaying "on" indicates that tracking shooting is currently enabled.
  • the user instructs the mobile phone to track and shoot the "kite" in the second viewfinder frame by clicking the follow control 306 while it displays "off"; when the "kite" flies from the right side toward the left, the mobile phone follows the "kite" with the second viewfinder frame 305, so that the second viewfinder frame 305 also moves from the right side to the left side.
  • a follow control 306 is included next to the second viewfinder frame 305.
  • the follow control 306 displays "on" to indicate that tracking shooting is currently enabled.
  • when the follow control 306 displays "on", the user can instruct the mobile phone to stop tracking shooting by clicking the follow control 306.
  • the follow control 306 displays "off" to indicate that tracking shooting is currently not enabled. By clicking the follow control 306 when the follow control 306 displays "off", the user can instruct the mobile phone to start tracking and shooting the object in the second viewfinder frame.
  • the mobile phone may automatically start tracking shooting when a preset condition is met. For example, when there is an object in the second viewfinder frame 305 and the second viewfinder frame 305 does not move within a certain period of time, the mobile phone may start tracking shooting to follow the object in the second viewfinder frame 305 .
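Once tracking is enabled, following the subject amounts to re-centering the second viewfinder frame on the detected subject position, clamped to the first viewfinder frame. A minimal sketch, assuming the subject's center is already known from detection (the function name and coordinate convention are illustrative):

```python
def follow_subject(frame, subject_center, first_w, first_h):
    """Re-center the second viewfinder frame on the tracked subject,
    keeping the frame inside the first viewfinder frame.

    `frame` is (left, top, right, bottom) with a top-left origin;
    `subject_center` is the (x, y) of the detected subject.
    """
    left, top, right, bottom = frame
    w, h = right - left, bottom - top
    cx, cy = subject_center
    new_left = min(max(cx - w / 2, 0), first_w - w)
    new_top = min(max(cy - h / 2, 0), first_h - h)
    return (new_left, new_top, new_left + w, new_top + h)
```

As the "kite" moves from the right of the shooting interface toward the left, repeated calls move the second frame leftward with it, as in (b) of FIG. 7.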
  • the mobile phone may stop tracking and shooting the object in the second viewfinder frame.
  • the user can instruct the mobile phone to stop tracking and shooting the subject in the second viewfinder frame by touching a specific control on the mobile phone screen, pressing a specific physical button or key combination, inputting voice, and gestures in the air.
  • for example, the user can click the follow control 306 next to the second view frame 305 when the follow control 306 displays "on", instructing the mobile phone to stop tracking and shooting the "kite" in the second viewfinder frame; after the user clicks the follow control 306 to stop tracking and shooting the object in the second viewfinder frame, as shown in (a) in FIG. 7, the follow control 306 may be displayed as "off".
  • the process includes:
  • the mobile phone records the coordinates of the four corners of the second viewfinder frame 305 .
  • the coordinates of the four corners of the second viewfinder frame 305 are the coordinates of the four corners of the second viewfinder frame 305 in the target coordinate system.
  • the target coordinate system may be a coordinate system with an arbitrary point (such as an apex, a central point, etc.) in the first viewing frame 302 as a coordinate origin.
  • the target coordinate system is a coordinate system with the lower left vertex of the first viewing frame 302 as the coordinate origin. It can be seen that the coordinates of the four corners of the second viewing frame 305 are (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4) respectively.
  • the mobile phone responds to the operation.
  • the mobile phone executes S803.
  • the mobile phone executes S804.
  • the mobile phone executes S805.
  • the mobile phone executes S806.
  • if the zoom operation is to increase the focal length, the mobile phone shrinks the second viewfinder frame with the center of the second viewfinder frame as the origin, and then executes S801; if the zoom operation is to reduce the focal length, the mobile phone enlarges the second viewfinder frame with the center of the second viewfinder frame as the origin, and then executes S801. It should be noted that, if the zoom operation is to reduce the focal length, when the zoom ratio is reduced to the preset zoom ratio, the size of the second viewfinder frame 305 is the same as that of the first viewfinder frame 302, and the second viewfinder frame 305 may no longer be displayed; that is, when the zoom ratio is reduced to the preset zoom ratio, the second viewfinder frame 305 stops being displayed in the first viewfinder frame 302.
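The zoom branch of this flow (scale the second viewfinder frame about its own center, and hide it when the zoom falls back to the preset ratio) can be sketched as follows; the scaling factor `old_zoom / new_zoom` is an assumption for the example, not a formula stated in the disclosure:

```python
def rescale_second_frame(frame, old_zoom, new_zoom, preset_zoom=10.0):
    """Scale the second viewfinder frame about its own center when the
    zoom ratio changes; return None when the zoom is back at (or
    below) the preset ratio, meaning the frame is no longer shown.
    """
    if new_zoom <= preset_zoom:
        return None  # frame coincides with the first frame and is hidden
    left, top, right, bottom = frame
    cx, cy = (left + right) / 2, (top + bottom) / 2
    factor = old_zoom / new_zoom  # < 1 shrinks, > 1 enlarges
    w, h = (right - left) * factor, (bottom - top) * factor
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)
```

Scaling about the center keeps the framed subject in place while only the extent of the frame changes.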
  • the mobile phone performs image cropping and post-processing according to the coordinates of the four corners of the current second viewfinder frame to obtain a final image.
  • the specific method for obtaining the final image through the above post-processing can be processed by any method conceivable by those skilled in the art, which is not specifically limited in this embodiment of the present application.
  • the mobile phone may first perform point-fill optimization on the image obtained by image cropping, and then perform image coding on the image after point-fill optimization to obtain the final image.
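The cropping step can be sketched as taking the axis-aligned box spanned by the four recorded corner coordinates (illustrative Python on a row-major pixel grid; point-filling and image encoding are left out, and the function name is an assumption):

```python
def crop_by_corners(pixels, corners):
    """Crop the preview image to the axis-aligned box spanned by the
    four corner coordinates of the second viewfinder frame.

    `corners` is [(x1, y1), (x2, y2), (x3, y3), (x4, y4)]; `pixels`
    is a row-major grid indexed as pixels[y][x].
    """
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    left, right = min(xs), max(xs)
    top, bottom = min(ys), max(ys)
    return [row[left:right] for row in pixels[top:bottom]]
```

The cropped region is then what the point-filling and encoding post-processing operates on to produce the final image.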
  • for example, the mobile phone responds to the user's shooting operation, and performs image cropping and post-processing according to the four-corner coordinates of the second viewfinder frame shown in (a) of FIG. 9 to obtain the first image 307 shown in (b) of FIG. 9.
  • the shooting method provided by the embodiment of the present application is introduced below in conjunction with FIG. 10. As shown in FIG. 10, the method includes:
  • the electronic device receives a first operation, and in response to the first operation, starts a camera and displays a shooting interface.
  • the zoom ratio of the camera is the first zoom ratio
  • the shooting interface includes a first viewfinder frame
  • the first viewfinder frame includes a first preview image
  • the first preview image is a viewfinder image under the first zoom ratio.
  • the first preview image being the viewfinder picture at the first zoom ratio means: when shooting at the first zoom ratio, the image captured by the camera is the same as the viewfinder picture, that is, the captured image is the same as the first preview image described above.
  • the first zoom ratio is usually the ratio when the camera is not zoomed, and the first zoom ratio may be represented by "1×".
  • for example, the electronic device receives and responds to the user's operation of clicking the "camera" application icon on the main interface of the electronic device, starts the camera application, and displays the shooting interface shown in (b) in Figure 3.
  • the shooting interface includes a first viewfinder frame 302.
  • the first viewfinder frame 302 includes a first preview image, which is a viewfinder picture at the camera zoom ratio of "1×".
  • the electronic device receives a second operation, and in response to the second operation, adjusts the camera to a second zoom ratio and enlarges and displays the first preview image displayed in the first viewing frame as a second preview image.
  • the second zoom magnification is greater than the first zoom magnification and less than or equal to the preset zoom magnification; the second preview image is a viewfinder picture at the second zoom magnification, and the second preview image may be a part of the first preview image.
  • the second preview image being the viewfinder picture at the second zoom ratio means: when shooting at the second zoom ratio, the image captured by the camera is the same as the second preview image.
  • the preset zoom ratio may be the maximum optical zoom ratio of the camera. For example, if the maximum optical zoom ratio of the camera is 10 times, the preset zoom ratio is 10 times.
  • for example, the electronic device receives and responds to the user's zoom operation of adjusting the zoom ratio from "1×" to "5×" in the shooting interface, adjusts the zoom ratio of the camera from "1×" to "5×", and enlarges the first preview image displayed in the first viewfinder frame 302 into the second preview image shown in (b) in Figure 4; from (a) and (b) in Figure 4 it can be seen that the second preview image is a part of the first preview image.
  • the electronic device receives a third operation, and in response to the third operation, adjusts the camera to a third zoom ratio, displays a third preview image in the first viewfinder frame, and displays a second viewfinder frame in the first viewfinder frame.
  • the third zoom ratio is greater than the preset zoom ratio
  • the second viewing frame covers a part of the first viewing frame
  • a fourth preview image is displayed in the second viewing frame.
  • the third preview image is a viewfinder image at a preset zoom ratio
  • the fourth preview image is a viewfinder image at a third zoom ratio.
  • the third preview image being a viewfinder image at a preset zoom ratio means that when shooting is performed at a preset zoom ratio, the image captured by the camera is the same as the third preview image.
  • the fact that the fourth preview image is a viewfinder image at the third zoom ratio means that: when shooting at the third zoom ratio, the image captured by the camera is the same as the fourth preview image.
  • for example, the electronic device receives and responds to the user's zoom operation of adjusting the zoom ratio from "5×" to "20×" in the shooting interface, and, as shown in (d) in FIG. 4, displays the third preview image in the first viewfinder frame and displays the second viewfinder frame 305 in the first viewfinder frame; the second viewfinder frame 305 displays a framing picture of the camera at the zoom ratio of "20×".
  • the third preview image is an enlarged and displayed part of the second preview image.
  • since the second zoom ratio "5×" corresponding to (b) in Figure 4 is smaller than the preset zoom ratio "10×", the third preview image shown in (d) in Figure 4 is an enlarged part of the second preview image shown in (b) in Figure 4.
  • the third preview image is the same as the second preview image.
  • since the second zoom ratio "10×" corresponding to (c) in Figure 4 is equal to the preset zoom ratio "10×", the third preview image shown in (d) in Figure 4 is the same as the second preview image shown in (c) of Figure 4.
  • the method may further include: reducing the brightness of the target area of the first viewfinder frame as shown in FIG. 6( b ).
  • the target area of the first framing frame is an area in the first framing frame that does not overlap with the second framing frame.
  • the method may further include: blurring the target area of the first viewfinder frame as shown in FIG. 6( c ).
  • the method may further include: superimposing a layer on the target area of the first viewfinder frame.
  • this embodiment of the present application may further include one or more of the following steps: S1004, S1005, S1006, S1007, or S1008. It should be noted that S1004, S1005, S1006, S1007, and S1008 may be performed in parallel, or in a certain order according to the actual situation. For example: after S1004, S1005, S1006 or S1007 may be executed; after S1005, S1004, S1006 or S1007 may be executed; after S1006, S1004, S1005, S1007 or S1008 may be executed; after S1007, S1004, S1005, S1006 or S1008 may be executed.
  • the electronic device receives a fifth operation of increasing the zoom ratio of the camera, and in response to the fifth operation, increases the zoom ratio of the camera, shrinks the second viewing frame, and continues to display the third preview image in the first viewing frame.
  • for example, the electronic device receives and responds to the user's zoom operation of adjusting the zoom ratio from "20×" to "50×" in the shooting interface; as shown in (e) in Figure 4, the second viewfinder frame 305 is reduced, and the third preview image at the camera zoom ratio of "10×" continues to be displayed in the first viewfinder frame.
  • the electronic device receives a sixth operation of reducing the zoom ratio of the camera, and in response to the sixth operation, reduces the zoom ratio of the camera, increases the second viewing frame, and continues to display the third preview image in the first viewing frame.
  • for example, the electronic device receives and responds to the user's zoom operation of adjusting the zoom ratio from "50×" to "20×" in the shooting interface; as shown in (f) in Figure 4, the second viewfinder frame 305 is enlarged, and the third preview image at the camera zoom ratio of "10×" continues to be displayed in the first viewfinder frame.
  • the electronic device receives a seventh operation of dragging the second viewfinder frame, and in response to the seventh operation, moves the second viewfinder frame within the first viewfinder frame according to the seventh operation.
  • the electronic device receives and responds to the user's operation of dragging the second viewing frame to the upper right, and moves the second viewing frame 305 according to the operation.
  • the electronic device receives an eighth operation of starting object tracking, and in response to the eighth operation, moves the second viewfinder following the object in the second viewfinder.
  • a follow control 306 is included next to the second viewfinder frame 305; the follow control 306 displaying "off" indicates that tracking shooting is not currently enabled, and the follow control 306 displaying "on" indicates that tracking shooting is currently enabled.
  • the electronic device receives and responds to the user's operation of clicking the follow control 306 (so that it displays "on") while the follow control 306 displays "off", and starts tracking and shooting the "kite" in the second viewfinder frame; as shown in (b) of Fig. 7, the "kite" flies from the right side of the shooting interface to the left side, and the electronic device follows the "kite" with the second viewfinder frame 305, so that the second viewfinder frame 305 also moves from the right side of the shooting interface to the left side.
  • the electronic device receives a fourth operation, and takes a picture in response to the fourth operation, obtaining an image with the same content as the fourth preview image.
  • the fourth operation is a shooting operation.
  • for example, the electronic device receives and responds to the shooting operation input by the user on the shooting interface on the left side of (b) in Figure 5, performs shooting, and obtains the image shown on the right side of (b) in Figure 5; it can be seen that the image shown on the right side of (b) in Figure 5 is the same as the second preview image displayed in the second viewfinder frame 305 on the left side of (b) in Figure 5.
  • for another example, the electronic device receives and responds to the shooting operation input by the user on the shooting interface on the left side of (c) in Figure 5, performs shooting, and obtains the image shown on the right side of (c) in Figure 5; it can be seen that the image shown on the right side of (c) in Figure 5 is the same as the second preview image displayed in the second viewfinder frame 305 on the left side of (c) in Figure 5.
  • the electronic device includes hardware and/or software modules corresponding to each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or computer software drives hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions in combination with the embodiments for each specific application, but such implementation should not be regarded as exceeding the scope of the present application.
  • the embodiment of the present application may divide the electronic device into functional modules according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above integrated modules may be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • the electronic device may include a transceiver unit and a processing unit, and the processing unit may implement the method performed by the electronic device in the above method embodiment and/or other processes of the technology described herein.
  • an electronic device may include a processing unit, a storage unit and a communication unit.
  • the processing unit may be used to control and manage actions of the electronic device, for example, may be used to support the electronic device to execute the steps performed by the above-mentioned units.
  • the storage unit can be used to store program code, data, and the like of the electronic device.
  • the communication unit can be used to support communication of the electronic device with other devices.
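A minimal sketch of the three-unit division described above (processing, storage, and communication units). The class and method names are hypothetical illustrations, not taken from the patent:

```python
class StorageUnit:
    """Stores program data for the electronic device."""
    def __init__(self):
        self._data = {}
    def store(self, key, value):
        self._data[key] = value
    def load(self, key):
        return self._data.get(key)

class CommunicationUnit:
    """Supports communication with other devices."""
    def send(self, message):
        # In a real device this would hand the message to a radio-frequency
        # circuit, Bluetooth chip, or Wi-Fi chip.
        return f"sent:{message}"

class ProcessingUnit:
    """Controls and manages the actions of the electronic device."""
    def __init__(self, storage, comm):
        self.storage = storage
        self.comm = comm
    def handle_shoot(self, preview):
        # Persist the captured image via the storage unit, then notify a
        # peer device through the communication unit.
        self.storage.store("last_image", preview)
        return self.comm.send("image_captured")
```

The point of the sketch is only the separation of responsibilities: the processing unit orchestrates, while storage and communication are delegated.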
  • the processing unit may be a processor or a controller. It can implement or execute the various illustrative logical blocks, modules and circuits described in connection with the present disclosure.
  • the processor can also be a combination of computing components, for example, a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor, and the like.
  • the storage unit may be a memory.
  • the communication unit may be a device that interacts with other electronic devices, such as a radio frequency circuit, a Bluetooth chip, and a Wi-Fi chip.
  • the electronic device involved in the embodiment of the present application includes a processor and a transceiver.
  • Related functions implemented by the above-mentioned transceiver unit and processing unit may be implemented by a processor.
  • the electronic device may also include a memory, and the processor and the memory communicate with each other through an internal connection path.
  • the relevant functions implemented by the above storage unit may be implemented by a memory.
  • the embodiment of the present application also provides a computer storage medium. The computer storage medium stores computer instructions, and when the computer instructions are run on the electronic device, the electronic device executes the above related method steps to implement the photographing method in the above embodiment.
  • An embodiment of the present application further provides a computer program product, which, when running on a computer, causes the computer to execute the above-mentioned related steps, so as to realize the photographing method in the above-mentioned embodiment.
  • the embodiment of the present application also provides an electronic device, which may specifically be a chip, an integrated circuit, a component, or a module.
  • the device may include a processor and a memory connected to the processor, where the memory is used to store instructions; or the device may include at least one processor, which is used to fetch instructions from an external memory.
  • the processor can execute instructions, so that the chip executes the photographing methods in the above method embodiments.
  • the electronic device may be a chip, and the chip includes one or more processors and an interface circuit.
  • the above chip may also include a bus.
  • a processor may be an integrated circuit chip with signal processing capabilities.
  • each step of the above shooting method can be completed by an integrated logic circuit of hardware in the processor or instructions in the form of software.
  • the above-mentioned processor may be a general-purpose processor, a digital signal processing (DSP) device, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the interface circuit can be used to send or receive data, instructions, or information; the processor can process the data, instructions, or other information received through the interface circuit, and can send the processed information out through the interface circuit.
  • the chip further includes a memory, which may include a read-only memory and a random access memory, and provides operation instructions and data to the processor.
  • a portion of the memory may also include non-volatile random access memory (non-volatile random access memory, NVRAM).
  • the memory stores executable software modules or data structures.
  • the processor can execute corresponding operations by calling operation instructions stored in the memory (the operation instructions can be stored in the operating system).
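As an illustration of a processor executing operations by calling operation instructions stored in memory, the following sketch uses an invented instruction table; none of the instruction names come from the patent itself:

```python
# "Memory" holding operation instructions: each entry maps an instruction
# name to a function that transforms the device state.
memory = {
    "zoom_in": lambda state: {**state, "zoom": state["zoom"] * 2},
    "shoot":   lambda state: {**state, "captured": True},
}

def processor(program, state):
    # Fetch each operation instruction from memory and execute it in order,
    # threading the updated state through the sequence.
    for op in program:
        state = memory[op](state)
    return state
```

For example, running the program `["zoom_in", "shoot"]` on an initial state doubles the zoom ratio and marks an image as captured.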
  • the chip may be used in the electronic device involved in the embodiments of the present application.
  • the interface circuit can be used to output the execution result of the processor.
  • the processor and the interface circuit can be implemented through hardware design, software design, or a combination of software and hardware, which is not limited here.
  • the electronic device, computer storage medium, computer program product, or chip provided in this embodiment is used to execute the corresponding method provided above; therefore, for the beneficial effects it can achieve, reference may be made to the beneficial effects of the corresponding method provided above, which will not be repeated here.
  • the sequence numbers of the above-mentioned processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • the disclosed systems, devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the above units is only a logical function division, and there may be other division methods in actual implementation.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.
  • the units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Part or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present application relates to the technical field of electronics and discloses a photographing method and an electronic device, capable of improving a user's shooting experience when zooming at high magnification. The method comprises the steps in which an electronic device: receives a first operation and, in response to the first operation, starts a camera whose zoom ratio is a first zoom ratio and displays a shooting interface comprising a first viewfinder frame that contains a first preview image; receives a second operation and, in response to the second operation, adjusts the zoom ratio of the camera to a second zoom ratio, enlarges the first preview image displayed in the first viewfinder frame, and displays it as a second preview image; and receives a third operation and, in response to the third operation, adjusts the zoom ratio of the camera to a third zoom ratio, displays a third preview image in the first viewfinder frame, and displays a second viewfinder frame within the first viewfinder frame. The first preview image is a viewfinder image at the first zoom ratio, the second preview image is a viewfinder image at the second zoom ratio, the third preview image is a viewfinder image at a preset zoom ratio, and a fourth preview image is a viewfinder image at the third zoom ratio.
PCT/CN2022/112456 2021-09-08 2022-08-15 Photographing method and electronic device WO2023035868A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111049688.9A CN115802145A (zh) Photographing method and electronic device
CN202111049688.9 2021-09-08

Publications (1)

Publication Number Publication Date
WO2023035868A1 WO2023035868A1 (fr)

Family

ID=85473430

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/112456 WO2023035868A1 (fr) 2021-09-08 2022-08-15 Procédé de photographie et dispositif électronique

Country Status (2)

Country Link
CN (1) CN115802145A (fr)
WO (1) WO2023035868A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014206592A (ja) * 2013-04-11 2014-10-30 キヤノン株式会社 撮像装置、その制御方法及びプログラム
CN111010506A (zh) * 2019-11-15 2020-04-14 华为技术有限公司 一种拍摄方法及电子设备
CN111031248A (zh) * 2019-12-25 2020-04-17 维沃移动通信(杭州)有限公司 一种拍摄方法及电子设备
CN112188097A (zh) * 2020-09-29 2021-01-05 Oppo广东移动通信有限公司 拍摄方法、拍摄装置、终端设备及计算机可读存储介质
CN113037995A (zh) * 2019-12-25 2021-06-25 华为技术有限公司 一种长焦场景下的拍摄方法及终端

Also Published As

Publication number Publication date
CN115802145A (zh) 2023-03-14

Similar Documents

Publication Publication Date Title
WO2021147482A1 (fr) Procédé de photographie au téléobjectif et dispositif électronique
WO2022267565A1 (fr) Procédé de photographie vidéo, et dispositif électronique et support de stockage lisible par ordinateur
US20230043815A1 (en) Image Processing Method and Electronic Device
EP4199499A1 (fr) Procédé de capture d'image, interface graphique utilisateur et dispositif électronique
CN113099146B (zh) Video generation method and apparatus, and related device
CN113709355B (zh) Slide-zoom shooting method and electronic device
CN115689963B (zh) Image processing method and electronic device
EP4436198A1 (fr) Procédé de capture d'images dans une vidéo, et dispositif électronique
WO2023093169A1 (fr) Procédé de photographie et dispositif électronique
WO2022262550A1 (fr) Procédé de photographie vidéo et dispositif électronique
WO2022057384A1 (fr) Procédé et dispositif de photographie
WO2024179101A1 (fr) Procédé de photographie
WO2023160230A1 (fr) Procédé photographique et dispositif associé
WO2023231697A1 (fr) Procédé pour photographier et dispositif associé
WO2023035868A1 (fr) Procédé de photographie et dispositif électronique
WO2023072113A1 (fr) Procédé d'affichage et dispositif électronique
WO2023160224A9 (fr) Procédé pour photographier et dispositif associé
CN116193243B (zh) Shooting method and electronic device
US20240205533A1 (en) Method for capturing image during video recording and electronic device
WO2023231696A1 (fr) Procédé pour photographier et dispositif associé
WO2024088074A1 (fr) Procédé de photographie de la lune et dispositif électronique
WO2023078133A1 (fr) Procédé et dispositif de lecture vidéo
CN117395496A (zh) Shooting method and related device

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22866342

Country of ref document: EP

Kind code of ref document: A1