CN113556461B - Image processing method, electronic equipment and computer readable storage medium - Google Patents

Image processing method, electronic equipment and computer readable storage medium

Info

Publication number
CN113556461B
Authority
CN
China
Prior art keywords
electronic device
image
video
interface
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011053345.5A
Other languages
Chinese (zh)
Other versions
CN113556461A (en)
Inventor
张超
刘宏马
张雅琪
贾志平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202011053345.5A
Priority to CN202210839558.3A
Priority to PCT/CN2021/116944
Publication of CN113556461A
Application granted
Publication of CN113556461B
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals

Abstract

The application discloses an image processing method applied to the field of video recording and photographing. During photographing or video recording, the electronic device can crop the images captured by its camera using a crop box with a preset size and a preset motion track selected by the user, and display the image within the crop box in the photographing preview interface or recording interface. In this way, the captured video or picture has an automatic zoom effect that highlights the detailed features of the photographed subject.

Description

Image processing method, electronic device and computer readable storage medium
Technical Field
The present application relates to the field of computer vision technologies, and in particular, to an image processing method and a related apparatus.
Background
Since the advent of smart terminals, photographing and video recording have become among their most important features. When shooting with a portable electronic device such as a mobile phone, a user needs to zoom or focus in order to clearly capture a distant scene or to capture a wider scene.
At present, when an electronic device such as a mobile phone records a video or displays a preview picture, adjusting the framing focal length requires manual zooming by the user, which is cumbersome. Moreover, once the user manually adjusts the focal length during recording, the device keeps recording at that focal length, so the recorded video can hardly show the details of the photographed subject, and the user's focusing operations remain tedious.
Disclosure of Invention
The application provides an image processing method and a related apparatus, which give a captured video or picture an automatic zoom effect and highlight the detailed features of the photographed subject.
In a first aspect, the present application provides an image processing method, including: the electronic device displays a first shooting preview interface, where the first shooting preview interface includes a first preview frame, and the first preview frame displays, in real time, the picture captured by a camera of the electronic device. After the electronic device detects a first operation on the first shooting preview interface, the electronic device displays a shooting option interface, where the shooting option interface includes a first shooting option and a second shooting option. After the electronic device detects a second operation on the first shooting option, the electronic device displays a second shooting preview interface, where the second shooting preview interface includes a second preview frame, and the second preview frame displays, in real time, the picture captured by the camera. The electronic device starts shooting first video content. At a first time after shooting of the first video content starts, the electronic device displays, in the second preview frame, a first part of the picture captured by the camera in real time. At a second time after shooting of the first video content starts, the electronic device displays, in the second preview frame, a second part of the picture captured by the camera in real time. The electronic device displays the shooting option interface. After the electronic device detects a third operation on the second shooting option, the electronic device displays a third shooting preview interface, where the third shooting preview interface includes a third preview frame, and the third preview frame displays, in real time, the picture captured by the camera. The electronic device starts shooting second video content. At a first time after shooting of the second video content starts, the electronic device displays, in the third preview frame, a third part of the picture captured by the camera in real time. At a second time after shooting of the second video content starts, the electronic device displays, in the third preview frame, a fourth part of the picture captured by the camera in real time, where the first part, the second part, the third part, and the fourth part are different.
The application provides an image processing method in which, during photographing or video recording, the electronic device crops the images captured by the camera using a crop box with a preset size and a preset motion track, and displays the image within the crop box in the preview interface or recording interface. In this way, the captured video or picture has an automatic zoom effect that highlights the detailed features of the photographed subject.
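As a concrete illustration of the cropping idea, the following is a minimal Python/OpenCV sketch, not the patented implementation: the webcam source, the centre zoom-in track, the zoom factor, and the output resolution are all assumptions chosen for the example.

```python
# Minimal sketch of the crop-box idea (illustrative assumptions: webcam
# source, a centre zoom-in track over 3 seconds, 1280x720 output). The
# patent's actual tracks and parameters are not specified here.
import cv2

OUT_W, OUT_H = 1280, 720

def crop_box_at(t, frame_w, frame_h):
    """Preset motion track: shrink the box from the full frame to a 2x
    zoom on the frame centre over the first 3 seconds, then hold."""
    zoom = 1.0 + min(t / 3.0, 1.0)              # 1x -> 2x
    w, h = int(frame_w / zoom), int(frame_h / zoom)
    x = (frame_w - w) // 2                      # a panning track would move
    y = (frame_h - h) // 2                      # x and y over time instead
    return x, y, w, h

cap = cv2.VideoCapture(0)                       # camera picture in real time
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    x, y, w, h = crop_box_at(frame_idx / fps, frame.shape[1], frame.shape[0])
    part = frame[y:y + h, x:x + w]              # "first part", "second part", ...
    preview = cv2.resize(part, (OUT_W, OUT_H))  # fill the preview frame
    cv2.imshow("preview", preview)
    if cv2.waitKey(1) == 27:                    # Esc quits
        break
    frame_idx += 1
cap.release()
cv2.destroyAllWindows()
```

Because the crop box shrinks over time while the output size stays fixed, the displayed picture appears to zoom in automatically without the user touching the zoom control.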
In one possible implementation, before the electronic device starts shooting the first video content, the electronic device detects a fourth operation on the second shooting preview interface. In response to the fourth operation, the electronic device starts shooting the first video content.
In a possible implementation, the second shooting preview interface further includes an option switching control. After the electronic device displays, in the second preview frame, the second part of the picture captured by the camera in real time, the electronic device detects a fifth operation on the option switching control. In response to the fifth operation, the electronic device displays the shooting option interface. This provides the user with an entry point to change the shooting option after one has been selected, making it convenient to shoot video content with a different option.
In a possible implementation, after the electronic device displays, in the second preview frame, the second part of the picture captured by the camera in real time, the method further includes: the electronic device detects a sixth operation on the second shooting preview interface. In response to the sixth operation, the electronic device saves the first part as the video picture at the first time in a video file and saves the second part as the video picture at the second time in the video file. Thus, the electronic device can capture a video with an automatic zoom or lens-panning effect.
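A minimal sketch of this saving step, continuing the previous example (codec, frame rate, and file name are assumptions): the cropped parts are written as the frames of the saved video file, so the file plays back with the auto-zoom effect already applied.

```python
# Continuation of the earlier sketch (codec, frame rate, and file name are
# assumptions): write each cropped-and-resized part as a frame of the saved
# video file, so playback already contains the auto-zoom effect.
import cv2

writer = cv2.VideoWriter("auto_zoom.mp4",
                         cv2.VideoWriter_fourcc(*"mp4v"),
                         30.0, (1280, 720))
# inside the capture loop, after the sixth operation starts recording:
#     writer.write(preview)   # first part at the first time, second at the second, ...
# when recording stops:
#     writer.release()
```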
In one possible implementation, the method further includes: the electronic device detects a seventh operation at the first time after shooting of the first video content starts. In response to the seventh operation, the electronic device saves the first part as a picture. In this way, during a preview with effects such as automatic zoom and lens panning, the electronic device can save a preview video picture as a still picture.
In one possible implementation, the second shooting preview interface further includes a first window. The electronic device displays, in the first window, the picture captured by the camera in real time. At the first time after shooting of the first video content starts, the electronic device displays the bounding box of the first part in the first window. At the second time after shooting of the first video content starts, the electronic device displays the bounding box of the second part in the first window. This lets the user see the full picture captured by the camera in real time and makes it easier to locate the photographed subject.
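A minimal sketch of such a first window, continuing the earlier example: the full camera picture is downscaled, the current crop box is drawn on it as the bounding box, and the result is composited into a corner of the preview. The window size and the corner chosen are assumptions.

```python
# Continuation of the earlier sketch: a small overview window showing the
# full camera picture with the current crop box drawn on it, composited
# into a corner of the preview (window size and corner are assumptions).
import cv2

def draw_overview(preview, full_frame, box, win_w=320, win_h=180):
    x, y, w, h = box
    overview = full_frame.copy()
    cv2.rectangle(overview, (x, y), (x + w, y + h), (0, 255, 0), 4)  # bounding box
    overview = cv2.resize(overview, (win_w, win_h))
    preview[0:win_h, 0:win_w] = overview        # upper-left corner placement
    return preview
```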
In one possible implementation, the second shooting preview interface further includes a first window. At the first time after shooting of the first video content starts, the electronic device displays, in the first window, the video picture at the first time of the video sample corresponding to the first shooting option. At the second time after shooting of the first video content starts, the electronic device displays, in the first window, the video picture at the second time of that video sample. This lets the user compare the current shot against the video sample in real time.
In an implementation of the present application, the display positions of the first window and the second preview frame are any one of the following: at least a partial area of the first window overlaps the display area of the second preview frame; or the first window is displayed at a position outside the second preview frame in the second shooting preview interface; or the first window is displayed in the upper right corner area of the second preview frame; or the first window is displayed in the upper left corner area of the second preview frame.
In a possible implementation, the first window includes a window closing control. The electronic device detects an eighth operation on the window closing control in the first window. In response to the eighth operation, the electronic device stops displaying the first window. Therefore, when the user does not need the first window, the user can close it manually so that it does not block the shooting picture displayed on the electronic device.
In a possible implementation, after the electronic device stops displaying the first window, the electronic device displays a window opening control in the second shooting preview interface. The electronic device detects a ninth operation on the window opening control. In response to the ninth operation, the electronic device displays the first window on the second shooting preview interface. In this way, when the user wants the small window back, the user can trigger the electronic device to display the first window again.
In one possible implementation, the second shooting preview interface further includes a mirror-moving mode closing control. The electronic device detects a tenth operation on the mirror-moving mode closing control. In response to the tenth operation, the electronic device displays the first shooting preview interface. Thus, the user can turn off this shooting mode at any time.
In a possible implementation, the second shooting preview interface further includes a second window, and the electronic device displays, in the second window, a waveform diagram of the first music piece corresponding to the first shooting option. The audio information in the waveform diagram at the first time after shooting of the first video content starts differs from the audio information in the waveform diagram at the second time, and the audio information includes one or more of the following: rhythm information, amplitude information, and style information.
Wherein the electronic device can play the first music piece after the electronic device starts shooting the first video content.
In one possible implementation manner, when the electronic device stores the first part as a video picture at a first time in a video file and stores the second part as a video picture at a second time in the video file, the electronic device stores the first music piece in the audio data of the video file.
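As one hypothetical way to obtain the rhythm and amplitude information mentioned above, the sketch below derives a per-frame amplitude envelope and beat times from a music clip; librosa and the mapping from beats to zoom steps are assumptions for illustration, not part of the patent.

```python
# Hypothetical derivation of amplitude and rhythm information from the
# first music piece; librosa and the beat-to-zoom mapping are assumptions
# for illustration, not part of the patent.
import librosa

y, sr = librosa.load("first_music_piece.mp3", mono=True)
hop = int(sr / 30)                              # one value per 30 fps video frame
# amplitude information: RMS envelope sampled per video frame
rms = librosa.feature.rms(y=y, frame_length=2 * hop, hop_length=hop)[0]
# rhythm information: estimated tempo and beat positions in seconds
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr, hop_length=hop)
beat_times = librosa.frames_to_time(beat_frames, sr=sr, hop_length=hop)
# e.g. trigger a crop-box zoom step on each beat, scaled by the local RMS
```

Storing the music piece in the audio data of the saved video file could then be a separate muxing step, for example `ffmpeg -i auto_zoom.mp4 -i first_music_piece.mp3 -c:v copy -shortest out.mp4`.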
In a possible implementation, before displaying the picture captured by the camera in real time, the electronic device performs image stabilization on the picture, where the image stabilization includes one or more of: electronic image stabilization (EIS), optical image stabilization, and mechanical image stabilization. In this way, the captured picture is smooth, with natural transitions.
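The patent does not specify the EIS algorithm; the following is a generic sketch of one common electronic image stabilization approach: track features between consecutive frames, low-pass filter the estimated shift, and warp each frame by the correction so the displayed picture moves smoothly.

```python
# Generic EIS sketch (not the device's actual algorithm): estimate the
# frame-to-frame shift from tracked features, low-pass filter it, and warp
# the frame by the difference so the displayed picture moves smoothly.
import cv2
import numpy as np

def stabilize(prev_gray, cur_gray, cur_frame, sm_dx, sm_dy, alpha=0.9):
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=30)
    if pts is None:
        return cur_frame, sm_dx, sm_dy
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, pts, None)
    good_old = pts[status.flatten() == 1]
    good_new = nxt[status.flatten() == 1]
    m, _ = cv2.estimateAffinePartial2D(good_old, good_new)
    if m is None:
        return cur_frame, sm_dx, sm_dy
    dx, dy = m[0, 2], m[1, 2]                   # raw inter-frame shift
    sm_dx = alpha * sm_dx + (1 - alpha) * dx    # smoothed trajectory
    sm_dy = alpha * sm_dy + (1 - alpha) * dy
    correction = np.float32([[1, 0, sm_dx - dx],
                             [0, 1, sm_dy - dy]])
    h, w = cur_frame.shape[:2]
    return cv2.warpAffine(cur_frame, correction, (w, h)), sm_dx, sm_dy
```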
In a possible implementation, before the electronic device displays, in the second preview frame, the first part of the picture captured by the camera in real time, the electronic device performs image super-resolution reconstruction on the first part; likewise for the second part before it is displayed in the second preview frame, and for the third and fourth parts before they are displayed in the third preview frame. Thus, the captured picture can be made clearer.
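A minimal sketch of the super-resolution step, assuming the opencv-contrib package and a pre-trained FSRCNN model file (the patent does not name a specific model): the cropped part from the earlier example is upscaled by a learned model rather than plain bicubic resizing.

```python
# Super-resolution sketch (assumes the opencv-contrib package and a
# pre-trained FSRCNN model file; the patent does not name a model):
# upscale the cropped part with a learned model before display.
import cv2

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("FSRCNN_x2.pb")                    # pre-trained weights (assumed)
sr.setModel("fsrcnn", 2)                        # 2x upscaling

def super_resolve(part):
    return sr.upsample(part)                    # sharper than bicubic resize
```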
In one possible implementation, the camera of the electronic device includes a wide-angle camera and a non-wide-angle camera. The first preview frame displays the picture captured by the non-wide-angle camera in real time, while the second and third preview frames display the picture captured by the wide-angle camera in real time. In this way, the captured picture can have a larger field of view.
In a second aspect, the present application provides an electronic device, including a processor, a camera, and a touch screen. The touch screen is configured to display a first shooting preview interface, where the first shooting preview interface includes a first preview frame, and the first preview frame displays the picture captured by the camera in real time. The processor is configured to instruct the touch screen to display a shooting option interface after detecting a first operation on the first shooting preview interface, where the shooting option interface includes a first shooting option and a second shooting option. The processor is further configured to instruct the touch screen to display a second shooting preview interface after detecting a second operation on the first shooting option, where the second shooting preview interface includes a second preview frame, and the second preview frame displays the picture captured by the camera in real time. The processor is further configured to instruct the camera to start shooting first video content; to instruct the touch screen to display, in the second preview frame, a first part of the picture captured by the camera in real time at a first time after shooting of the first video content starts; and to instruct the touch screen to display, in the second preview frame, a second part of that picture at a second time after shooting starts. The touch screen is further configured to display the shooting option interface. The processor is further configured to instruct the touch screen to display a third shooting preview interface after detecting a third operation on the second shooting option, where the third shooting preview interface includes a third preview frame, and the third preview frame displays the picture captured by the camera in real time. The processor is further configured to instruct the camera to start shooting second video content; to instruct the touch screen to display, in the third preview frame, a third part of the picture captured by the camera in real time at a first time after shooting of the second video content starts; and to instruct the touch screen to display, in the third preview frame, a fourth part of that picture at a second time after shooting starts, where the first part, the second part, the third part, and the fourth part are different.
With this electronic device, during photographing or video recording, the images captured by the camera are cropped using a crop box with a preset size and a preset motion track, and the image within the crop box is displayed in the preview interface or recording interface. In this way, the captured video or picture has an automatic zoom effect that highlights the detailed features of the photographed subject.
In one possible implementation, before the processor instructs the camera to start shooting the first video content, the processor is further configured to detect a fourth operation on the second shooting preview interface and, in response to the fourth operation, to instruct the camera to start shooting the first video content.
In one possible implementation, the second shooting preview interface further includes an option switching control. After the processor instructs the touch screen to display, in the second preview frame, the second part of the picture captured by the camera in real time, the processor is further configured to detect a fifth operation on the option switching control and, in response, to display the shooting option interface. This provides the user with an entry point to change the shooting option after one has been selected, making it convenient to shoot video content with a different option.
In a possible implementation, after the processor instructs the touch screen to display, in the second preview frame, the second part of the picture captured by the camera in real time, the processor is further configured to detect a sixth operation on the second shooting preview interface and, in response, to save the first part as the video picture at the first time in a video file and the second part as the video picture at the second time in the video file. Thus, the electronic device can shoot a video with effects such as automatic zoom and lens panning.
In a possible implementation, the processor is further configured to detect a seventh operation at the first time after shooting of the first video content starts and, in response, to save the first part as a picture. In this way, during a preview with effects such as automatic zoom and lens panning, the electronic device can save a preview video picture as a still picture.
In one possible implementation, the second shooting preview interface further includes a first window. The touch screen is further configured to display, in the first window, the picture captured by the camera in real time. The processor is further configured to instruct the touch screen to display the bounding box of the first part in the first window at the first time after shooting of the first video content starts, and the bounding box of the second part at the second time. This lets the user see the full picture captured by the camera in real time and makes it easier to locate the photographed subject.
In one possible implementation, the second shooting preview interface further includes a first window. The processor is further configured to instruct the touch screen to display, in the first window, the video picture at the first time of the video sample corresponding to the first shooting option at the first time after shooting of the first video content starts, and the video picture at the second time of that sample at the second time. This lets the user compare the current shot against the video sample in real time.
In an implementation of the present application, the display positions of the first window and the second preview frame are any one of the following: at least a partial area of the first window overlaps the display area of the second preview frame; or the first window is displayed at a position outside the second preview frame in the second shooting preview interface; or the first window is displayed in the upper right corner area of the second preview frame; or the first window is displayed in the upper left corner area of the second preview frame.
In a possible implementation, the first window includes a window closing control. The processor is further configured to detect an eighth operation on the window closing control in the first window and, in response, to stop displaying the first window. Therefore, when the user does not need the first window, the user can close it manually so that it does not block the shooting picture displayed on the electronic device.
In one possible implementation, after the electronic device stops displaying the first window, the touch screen is further configured to display a window opening control in the second shooting preview interface. The processor is further configured to detect a ninth operation on the window opening control and, in response, to instruct the touch screen to display the first window on the second shooting preview interface. In this way, when the user wants the small window back, the user can trigger the electronic device to display the first window again.
In one possible implementation, the second shooting preview interface further includes a mirror-moving mode closing control. The processor is further configured to detect a tenth operation on the mirror-moving mode closing control and, in response, to instruct the touch screen to display the first shooting preview interface. Thus, the user can turn off this shooting mode at any time.
In a possible implementation, the second shooting preview interface further includes a second window. The touch screen is further configured to display, in the second window, a waveform diagram of the first music piece corresponding to the first shooting option. The audio information in the waveform diagram at the first time after shooting of the first video content starts differs from the audio information at the second time, and the audio information includes one or more of the following: rhythm information, amplitude information, and style information. Therefore, the video picture can zoom automatically with the rhythm of the music, enhancing the visual effect of the captured picture.
The electronic device may further include a speaker. The processor is further configured to instruct the speaker to play the first music piece after shooting of the first video content starts.
In one possible implementation, when the processor stores the first portion as a video picture at a first time in a video file and stores the second portion as a video picture at a second time in the video file, the processor is further configured to store the first music piece in audio data of the video file.
In a possible implementation, the processor is further configured to perform image stabilization on the picture captured by the camera in real time before the picture is displayed, where the image stabilization includes one or more of: electronic image stabilization (EIS), optical image stabilization, and mechanical image stabilization. In this way, the captured picture is smooth, with natural transitions.
In a possible implementation, the processor is further configured to perform image super-resolution reconstruction on the first part before the touch screen displays it in the second preview frame, on the second part before it is displayed in the second preview frame, and on the third and fourth parts before they are displayed in the third preview frame. Thus, the captured picture can be made clearer.
In one possible implementation, the camera of the electronic device includes a wide-angle camera and a non-wide-angle camera.
The first preview frame displays the picture captured by the non-wide-angle camera in real time, while the second and third preview frames display the picture captured by the wide-angle camera in real time. In this way, the captured picture can have a larger field of view.
In a third aspect, the present application provides an electronic device comprising a touch screen, a camera, one or more processors, and one or more memories. The one or more processors are coupled with the touch screen, the camera, and the one or more memories for storing computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of image processing in any of the possible implementations of the above aspects.
In a fourth aspect, the present application provides an electronic device, comprising: one or more functional modules. One or more functional modules are used for executing the image processing method in any one of the possible implementation manners of the above aspects.
In a fifth aspect, an embodiment of the present application provides a computer storage medium including computer instructions that, when run on an electronic device, cause the electronic device to execute the image processing method in any one of the possible implementations of the foregoing aspects.
In a sixth aspect, the present application provides a computer program product, which when run on a computer, causes the computer to execute the image processing method in any one of the possible implementation manners of the foregoing aspects.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
FIGS. 3A-3Q are schematic diagrams of a set of interfaces provided by embodiments of the present application;
FIGS. 4A-4F are schematic diagrams of a set of interfaces provided by another embodiment of the present application;
FIGS. 5A-5E are a set of schematic interfaces provided in accordance with another embodiment of the present application;
FIGS. 6A-6H are schematic diagrams of a set of interfaces provided by another embodiment of the present application;
FIGS. 7A-7C are schematic diagrams illustrating position changes of a crop box on an original image according to an embodiment of the present application;
FIGS. 7D-7F are schematic diagrams illustrating a clipping effect of a set of original images by using a clipping frame according to an embodiment of the present application;
FIGS. 8A-8C are schematic diagrams illustrating position changes of a crop box on an original image according to another embodiment of the present application;
8D-8G are schematic diagrams illustrating a clipping effect of a set of original images by using a clipping frame according to an embodiment of the present application;
FIGS. 9A-9F are schematic diagrams illustrating the position change of the crop box on the original image according to the embodiment of the present application;
FIGS. 9G-9K are a set of schematic diagrams of interfaces provided by another embodiment of the present application;
FIGS. 10A-10K are schematic diagrams of a set of interfaces provided by another embodiment of the present application;
FIGS. 11A-11E are schematic diagrams of a set of interfaces provided by another embodiment of the present application;
FIGS. 12A-12D are a set of schematic diagrams of interfaces provided by another embodiment of the present application;
FIGS. 13A-13E are a set of schematic interfaces provided in accordance with another embodiment of the present application;
FIGS. 14A-14E are a set of schematic interfaces provided by another embodiment of the present application;
FIGS. 15A-15E are a set of schematic interfaces provided by another embodiment of the present application;
fig. 16 is a schematic diagram illustrating a comparison between waveforms of a sound signal and a rhythm signal provided in an embodiment of the present application;
FIGS. 17A-17J are a set of schematic interfaces provided in accordance with another embodiment of the present application;
fig. 18 is a schematic block diagram of a video stream clipping system according to an embodiment of the present disclosure;
fig. 19 is a schematic hardware architecture diagram of a video stream clipping system according to an embodiment of the present application;
fig. 20 is a schematic flowchart of an image processing method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. In the description of the embodiments herein, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" in the text merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the following, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the application, unless stated otherwise, "a plurality" means two or more.
Fig. 1 shows a schematic structural diagram of an electronic device 100.
The following specifically describes an embodiment by taking the electronic device 100 as an example. It should be understood that the electronic device 100 shown in fig. 1 is merely an example, and that the electronic device 100 may have more or fewer components than shown in fig. 1, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, and the like.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit the audio signal to the wireless communication module 160 through the PCM interface, so as to implement the function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect an earphone and play audio through the earphone. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, and the like. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
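As a small illustration of the format conversion described above (the buffer layout and dimensions are assumptions), the following converts a camera-style NV21 YUV buffer into an RGB image:

```python
# Illustrative format conversion (dimensions and buffer layout assumed):
# turn a camera-style NV21 YUV buffer into an RGB image.
import cv2
import numpy as np

h, w = 720, 1280
nv21 = np.fromfile("frame.nv21", dtype=np.uint8).reshape(h * 3 // 2, w)
rgb = cv2.cvtColor(nv21, cv2.COLOR_YUV2RGB_NV21)
```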
In some embodiments of the present application, camera 193 may include multiple cameras, among them wide-angle cameras and non-wide-angle cameras. In one possible implementation, when the electronic device 100 turns on the mirror-moving mode in the embodiments described below, the electronic device 100 may capture the video stream using a wide-angle camera. In this way, the electronic device 100 can obtain a video picture with a wider viewing angle.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
In some embodiments, the electronic device 100 needs to identify image content in the captured picture, detect and track the location of specified image content, and determine the outline of that content. The application processor may feed a preprocessed RGB map (or BGR map, single-color map, grayscale map, etc.) into the NPU. The NPU may detect and track the position of the specified image content in that map through an AI model and determine its outline, then output the detection box information or contour information to the application processor. The application processor may determine crop box information based on the detection box information and/or contour information, and may crop the video stream based on the crop box information or instruct the ISP to do so.
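A minimal sketch of how crop box information might be derived from the NPU's detection box (the margin factor and output aspect ratio are assumptions; the patent does not fix these values): center the crop on the detected subject, keep an aspect-correct size with some margin, and clamp the box to the frame.

```python
# Sketch of deriving a crop box from a detection box (margin factor and
# aspect ratio are assumptions; the patent does not fix these values).
def crop_from_detection(det, frame_w, frame_h, aspect=16 / 9, margin=1.4):
    dx, dy, dw, dh = det                        # detection box (x, y, w, h)
    cx, cy = dx + dw / 2, dy + dh / 2           # centre of the subject
    # smallest aspect-correct box containing the detection plus margin
    w = max(dw * margin, dh * margin * aspect)
    h = w / aspect
    w, h = min(w, frame_w), min(h, frame_h)     # never larger than the frame
    x = min(max(0.0, cx - w / 2), frame_w - w)  # clamp inside the frame
    y = min(max(0.0, cy - h / 2), frame_h - h)
    return int(x), int(y), int(w), int(h)
```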
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can input a sound signal by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device 100 may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of a conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but have different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the Messages application icon, an instruction for viewing a message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the Messages application icon, an instruction for creating a new message is executed.
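As a minimal sketch of the threshold dispatch described above, assuming a normalized pressure reading and hypothetical handler names (viewMessage, createNewMessage):

```kotlin
// Hypothetical threshold and handlers; illustrative only.
const val FIRST_PRESSURE_THRESHOLD = 0.5f

fun onMessagesIconTouch(pressureIntensity: Float) {
    if (pressureIntensity < FIRST_PRESSURE_THRESHOLD) {
        viewMessage()      // intensity below the first pressure threshold: view the message
    } else {
        createNewMessage() // intensity at or above the threshold: create a new message
    }
}

fun viewMessage() { /* execute the "view message" instruction */ }
fun createNewMessage() { /* execute the "new message" instruction */ }
```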
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined through the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during shooting. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate for according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used in navigation and motion-sensing game scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates the altitude from the barometric pressure measured by the air pressure sensor 180C to assist in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flipping open can then be set according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to identify the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and the like.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some shooting scenarios, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The electronic device 100 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from a nearby object. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-triggered photographing, fingerprint-based call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
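The temperature processing strategy amounts to a small policy table. The sketch below assumes illustrative threshold values and hypothetical actuation functions; actual thresholds are device-specific:

```kotlin
// Illustrative thresholds; the concrete values are assumptions.
const val HIGH_TEMP_C = 45f       // above this: thermal protection
const val LOW_TEMP_C = 0f         // below this: heat the battery
const val VERY_LOW_TEMP_C = -10f  // below this: also boost battery output voltage

fun applyThermalPolicy(tempC: Float) {
    when {
        tempC > HIGH_TEMP_C -> reduceNearbyProcessorPerformance() // cut power consumption
        tempC < VERY_LOW_TEMP_C -> boostBatteryOutputVoltage()    // avoid abnormal shutdown
        tempC < LOW_TEMP_C -> heatBattery()                       // avoid abnormal shutdown
        else -> Unit // normal operation
    }
}

fun reduceNearbyProcessorPerformance() { /* ... */ }
fun heatBattery() { /* ... */ }
fun boostBatteryOutputVoltage() { /* ... */ }
```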
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting thereon or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the electronic device 100 at a different position than the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the pulse of the human body to receive a blood-pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in an earphone, combined into a bone conduction earphone. The audio module 170 may parse out a voice signal based on the vibration signal of the vocal-part bone mass acquired by the bone conduction sensor 180M, to implement a voice function. The application processor may parse out heart rate information based on the blood-pressure beating signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects in response to touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminders, receiving information, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be attached to or detached from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards, and may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of the present application take an Android system with a layered architecture as an example to describe the software structure of the electronic device 100.
Fig. 2 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions for the electronic device 100. Such as management of call status (including connection, hangup, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction, for example, notifications of download completion or message alerts. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, or an indicator light flashes.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules, for example, a surface manager, media libraries, three-dimensional graphics processing libraries (e.g., OpenGL ES), and 2D graphics engines (e.g., SGL).
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes exemplary workflow of the software and hardware of the electronic device 100 in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation as a single-click operation whose corresponding control is the camera application icon as an example, the camera application calls an interface of the application framework layer to start the camera application, then starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 193.
This application provides an image processing method. During photographing or video recording, the electronic device 100 can crop the image captured by the camera through a crop box with a preset size and a preset motion trajectory, and display the image within the crop box in a preview interface or a recording interface. In this way, the shot video or picture has an automatic zooming effect, highlighting the detailed characteristics of the shot subject.
An image processing method according to an embodiment of the present application is described below with reference to an application scenario.
In some application scenarios, the electronic device 100 may receive a user-selected mirror-moving template before recording. The mirror-moving template may comprise a size change rule of the crop box and a motion trajectory of the crop box on the original image captured by the camera. After the electronic device 100 starts recording, the electronic device 100 may crop each frame of original image captured by the camera through the crop box defined in the mirror-moving template, and display the cropped images in the recording interface in frame order. In this way, the video shot by the electronic device 100 has an automatic zooming effect, improving its watchability.
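Conceptually, a mirror-moving template pairs a size change rule with a motion trajectory, both parameterized by the elapsed recording time. The following Kotlin sketch, with entirely hypothetical type names, shows one possible representation:

```kotlin
// Hypothetical representation of a mirror-moving template.
data class CropBox(val centerX: Float, val centerY: Float,
                   val width: Float, val height: Float)

class MirrorMovingTemplate(
    // Size change rule: crop-box scale at time t (1.0 = full original image).
    val sizeRule: (tMs: Long) -> Float,
    // Motion trajectory: crop-box center at time t, normalized to [0, 1].
    val trajectory: (tMs: Long) -> Pair<Float, Float>
) {
    // Position and size of the crop box in a frame of the given dimensions.
    fun cropBoxAt(tMs: Long, frameW: Int, frameH: Int): CropBox {
        val scale = sizeRule(tMs)
        val (cx, cy) = trajectory(tMs)
        return CropBox(cx * frameW, cy * frameH, scale * frameW, scale * frameH)
    }
}
```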
Illustratively, as shown in fig. 3A, the electronic device 100 may display an interface 310 of the home screen. The interface 310 displays a page in which application icons are placed, the page including a plurality of application icons (e.g., a weather application icon, a stocks application icon, a calculator application icon, a settings application icon, a mail application icon, a gallery application icon 312, a music application icon, a video application icon, a browser application icon, etc.). A page indicator is displayed below the application icons to indicate the position relationship between the currently displayed page and other pages. Below the page indicator are a plurality of tray icons (e.g., a dialing application icon, a messages application icon, a contacts application icon, a camera application icon 311) that remain displayed during page switching. In some embodiments, the page may also include a plurality of application icons and a page indicator; the page indicator may not be a part of the page and may exist alone, and the icons are also optional, which is not limited in the embodiments of this application.
The electronic device 100 may receive an input operation (e.g., a single click) by the user on the camera application icon 311, and in response to the input operation, the electronic device 100 may display the shooting interface 320 as shown in fig. 3B.
As shown in fig. 3B, the shooting interface 320 may include a playback control 321, a shooting control 322, a camera switching control 323, a camera-captured frame 324, a settings control 325, a zoom magnification control 326, and one or more shooting mode controls (e.g., a "night mode" control 327A, a "portrait mode" control 327B, a "large aperture" control 327C, a "normal shooting mode" control 327D, a "recording mode" control 327E, a "professional mode" control 327F, and a "more" mode control 327G). The playback control 321 may be used to display a captured image. The shooting control 322 is used to trigger saving of the image shot by the camera. The camera switching control 323 may be used to switch between cameras for shooting. The settings control 325 may be used to set the shooting function. The zoom magnification control 326 may be used to set the zoom magnification of the camera. The zoom magnification control 326 may trigger the electronic device 100 to display a zoom slider bar, which can receive a user's slide-up (or slide-down) operation, causing the electronic device 100 to increase (or decrease) the zoom magnification of the camera. Possibly, the zoom magnification control 326 may cause the electronic device 100 to display a zoom-in control and a zoom-out control; the zoom-in control may be used to receive and respond to a user input to trigger the electronic device 100 to increase the zoom magnification of the camera, and the zoom-out control may be used to receive and respond to a user input to trigger the electronic device 100 to decrease the zoom magnification of the camera. A shooting mode control may be used to trigger the image processing flow corresponding to that shooting mode. For example, the "night mode" control 327A may be used to trigger an increase in brightness and color richness of the captured image. The "portrait mode" control 327B may be used to trigger blurring of the background behind a person in the captured image. The "large aperture" control 327C may be used to trigger the electronic device 100 to invoke an image processing algorithm that highlights the subject and blurs the background. As shown in fig. 3B, the shooting mode currently selected by the user is the "normal shooting mode".
The electronic device 100 can receive an input operation (e.g., a single click) from the user on the "more" mode control 327G, and in response to the input operation, the electronic device 100 can display the mode selection page 330 as shown in fig. 3C.
As shown in fig. 3C, the mode selection page 330 includes one or more mode controls (e.g., an "HDR mode" control, a "time-lapse photography mode" control, a "watermark mode" control, a "slow motion mode" control, a "smart recognition mode" control, a "document correction mode" control, a "panorama mode" control, a "streamer shutter mode" control, a "3D dynamic mode" control, an "online translation mode" control, a "smart mirror-moving mode" control 331, etc.).
In a possible implementation, the mode selection page 330 may be transparently displayed over the camera-captured frame 324.
The electronic device 100 may receive an input operation (e.g., a single click) from the user acting on the "smart mirror-moving mode" control 331, and in response to the input operation, the electronic device 100 may display the mirror-moving template selection interface 340 as shown in fig. 3D.
As shown in fig. 3D, the mirror-moving template selection interface 340 includes one or more mirror-moving template options (e.g., a "travel" mirror-moving option 341A, a "comfortable" mirror-moving option 341B, a "dynamic" mirror-moving option 341C, etc.), a selection box 342, a mirror-moving template display area 351, a mirror-moving trajectory display area 353, an intelligent mirror-moving prompt box 343, a playback control 346, and a mirror-moving shooting control 345. A mirror-moving template play control 352 is displayed on the mirror-moving template display area 351. The intelligent mirror-moving prompt box 343 includes a close control 344, which may be used to trigger the electronic device 100 to exit the mirror-moving template selection interface 340. Different mirror-moving template options correspond to different crop box size change rules or motion trajectories. As shown in fig. 3D, the selection box 342 has selected the "travel" mirror-moving option 341A, and a video sample corresponding to the "travel" mirror-moving option 341A is displayed in the mirror-moving template display area 351. The mirror-moving trajectory display area 353 can display the crop box size change rule corresponding to the "travel" mirror-moving option 341A and/or the motion trajectory of the crop box in the original image. For example, the center point of the crop box corresponding to the "travel" mirror-moving option 341A may be located at the same position as the center point of the original image, and the size of the crop box may shrink proportionally over time from the size of the original image to a preset minimum size, and then grow proportionally over time from the preset minimum size back to the size of the original image. The size of the crop box corresponding to the "travel" mirror-moving option 341A changes in proportion to the size of the original image, and the preset minimum size is smaller than the size of the original image.
The crop box size change rules and/or the motion trajectories of the crop box in the original image corresponding to different mirror-moving templates are described in detail in the following embodiments, and are not repeated here.
In a possible implementation manner, the mirror-moving template selection interface 340 may be transparently displayed over the camera-captured frame 324.
The electronic device 100 may receive an input operation (e.g., a single click) from the user acting on the mirror-moving template play control 352, and in response to the input operation, the electronic device 100 may display a template picture, a crop box 355, and a center point 354 of the crop box 355 in the mirror-moving trajectory display area 353, as shown in fig. 3E. The electronic device 100 may crop the template picture based on the crop box 355 to obtain a cropped picture. The electronic device 100 may adjust the size of the cropped picture to the display size of the mirror-moving template display area 351 and display it in the mirror-moving template display area 351. In one possible implementation, the template picture may be each frame of a segment of a video sample.
As shown in fig. 3F, the crop box 355 shrinks to the preset minimum size. The smaller the crop box 355, the larger the magnification of the picture displayed in the mirror-moving template display area 351.
As shown in fig. 3G, after shrinking to the preset minimum size, the crop box 355 may grow larger again. The magnification of the picture displayed in the mirror-moving template display area 351 then gradually decreases from its highest value.
As shown in fig. 3H, the crop box 355 may gradually grow to the size of the original picture, and after reaching the size of the original picture, may shrink again to a predetermined size.
The electronic device 100 may receive an input operation (e.g., a click) applied by the user to the mirror-moving shooting control 345, and in response to the input operation, the electronic device 100 may display the video recording interface 360 as shown in fig. 3I, and crop the video stream obtained by the camera during video shooting using the crop box size change rule and/or the motion trajectory of the crop box in the original image corresponding to the selected "travel" mirror-moving option 341A.
As shown in fig. 3I, the video recording interface 360 may include a preview image 371 captured by the camera, a playback control 346, a video capture control 361, a mirror-moving template switching control 362, an intelligent mirror-moving prompt box 343, and a mirror-moving template preview window 363. A cropping-effect picture corresponding to the selected mirror-moving template (for example, the template corresponding to the "travel" mirror-moving option 341A) is displayed in the mirror-moving template preview window 363. The mirror-moving template preview window 363 includes a close control 364, which can be used to trigger the electronic device 100 to close the mirror-moving template preview window 363. The display position of the mirror-moving template preview window 363 may be at the upper right corner or the upper left corner of the display screen, which is not limited here. The name of the mirror-moving template selected by the electronic device 100 (e.g., "travel") may be displayed on the mirror-moving template switching control 362. The mirror-moving template switching control 362 can be used to trigger the electronic device 100 to switch the mirror-moving template. In one possible implementation, the electronic device 100 may receive an input operation (e.g., press and drag) applied by the user to the mirror-moving template preview window 363 and, in response, move the display position of the window.
The electronic device 100 may receive an input operation (e.g., a click) from the user on the video capture control 361, and in response to the input operation, the electronic device 100 may start cropping each frame of image in the video stream captured by the camera according to the crop box size change rule corresponding to the selected mirror-moving template and/or the motion trajectory of the crop box in the original image. The electronic device 100 may adjust each cropped frame to a specified size and display the cropped frames sequentially in the recording interface 360 in frame order. The electronic device 100 may save the cropped video stream.
As shown in fig. 3J, after the electronic device 100 starts recording, the electronic device 100 may display a recording time box 366 and a recording end control 365 on the recording interface 360, and replace the playback control 346 and the mirror-moving template switching control 362 with a pause recording control 367 and a photographing control 368. The recording time box 366 can be used to display the recording duration of the electronic device 100. The recording end control 365 may be used to end the recording. The pause recording control 367 may be used to trigger the electronic device 100 to pause video recording. The photographing control 368 can be used to trigger the electronic device 100 to save one or more frames of the recording as pictures during the recording process. The electronic device 100 may crop each frame of original image captured by the camera through the crop box and intercept the image within the crop box. The electronic device 100 may adjust the image within the crop box to a specified size for display on the recording interface 360.
For example, the center point of the crop box may be at the same position as the center point of the original image. In the first 8 s after the electronic device 100 receives the user's input operation on the video capture control 361, the size of the crop box may be gradually scaled down from the size of the original image to the preset minimum size; from the 8th to the 16th second after the input operation, the size of the crop box may be gradually scaled up back to the size of the original image. As shown in fig. 3J, at the 3rd second after the electronic device 100 receives the user's input operation on the video capture control 361, the electronic device 100 may acquire the original image P1 through the camera and determine the position of the crop box in the original image P1. The electronic device 100 may intercept the image within the crop box on the original image P1 and adjust the intercepted image to the specified size, obtaining a cropped image 372. The electronic device 100 may display the cropped image 372 on the recording interface 360, and may display the effect picture of the selected mirror-moving template at the 3rd second in the mirror-moving template preview window 363. The image content in the cropped image 372 is magnified more than the image content in the preview image 371 shown in fig. 3I.
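The "travel" timing described here (shrink for 8 s, then grow back by the 16th second) can be written as a size rule in the sense of the earlier sketch; the minimum scale value below is an assumption for illustration only:

```kotlin
// Hypothetical preset minimum size, as a fraction of the original image.
const val MIN_SCALE = 0.4f

// Scale of the crop box at t milliseconds after recording starts:
// 1.0 -> MIN_SCALE over the first 8 s, then MIN_SCALE -> 1.0 by 16 s.
fun travelSizeRule(tMs: Long): Float = when {
    tMs <= 8_000L -> 1f - (1f - MIN_SCALE) * (tMs / 8_000f)
    tMs <= 16_000L -> MIN_SCALE + (1f - MIN_SCALE) * ((tMs - 8_000L) / 8_000f)
    else -> 1f
}
// e.g. travelSizeRule(3_000) ≈ 0.78 (fig. 3J), travelSizeRule(8_000) = 0.4 (fig. 3K),
// travelSizeRule(12_000) = 0.7 (fig. 3L): magnification peaks at the 8th second.
```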
As shown in fig. 3K, at the 8th second after the electronic device 100 receives the user's input operation on the video capture control 361, the crop box shrinks to the preset minimum size. The electronic device 100 may acquire the original image P2 through the camera and determine the position of the crop box in the original image P2. The electronic device 100 may intercept the image within the crop box on the original image P2 and adjust the intercepted image to the specified size, obtaining a cropped image 373. The electronic device 100 may display the cropped image 373 on the recording interface 360, and may display the effect picture of the selected mirror-moving template at the 8th second in the mirror-moving template preview window 363. The magnification of the image content in the cropped image 373 is greater than that in the cropped image 372 shown in fig. 3J.
As shown in fig. 3L, at the 12th second after the electronic device 100 receives the user's input operation on the video capture control 361, the size of the crop box is larger than the preset minimum size and smaller than the size of the original image. The electronic device 100 may acquire the original image P3 through the camera and determine the position of the crop box in the original image P3. The electronic device 100 may intercept the image within the crop box on the original image P3 and adjust the intercepted image to the specified size, obtaining a cropped image 374. The electronic device 100 may display the cropped image 374 on the recording interface 360, and may display the effect picture of the selected mirror-moving template at the 12th second in the mirror-moving template preview window 363. The magnification of the image content in the cropped image 374 is smaller than that in the cropped image 373 shown in fig. 3K.
The electronic device 100 may receive an input operation (e.g., a single click) from the user on the recording end control 365, and in response to the input operation, the electronic device 100 may end the recording and save the video obtained by cropping the video stream through the crop box corresponding to the mirror-moving template. As shown in fig. 3M, after the electronic device 100 finishes recording, the electronic device 100 may replace the pause recording control 367 and the photographing control 368 with the playback control 346 and the mirror-moving template switching control 362, and display an image frame of the recorded video on the playback control 346.
The electronic device 100 may receive an input operation (e.g., a single click) from the user acting on the playback control 346, and in response to the input operation, the electronic device 100 may display the video browsing interface 380 as shown in fig. 3N.
As shown in fig. 3N, the video browsing interface 380 may include a video 381 captured in the smart mirror-moving mode, a video play control 382, a total video duration 383, and a menu 384. The menu 384 may include a share button, a favorites button, an edit button, a delete button, and a "more" button. The share button may be used to trigger sharing of the recorded video 381. The favorites button may be used to trigger collecting the recorded video 381 into a favorites folder. The edit button may be used to trigger editing functions such as rotating, cropping, adding filters, and blurring the video 381. The delete button may be used to trigger deletion of the recorded video 381. The "more" button may be used to trigger opening of more functions related to the recorded video 381.
The electronic device 100 may receive an input operation (e.g., a single click) from the user on the video play control 382, and in response to the input operation, the electronic device 100 may play the recorded video 381.
As shown in fig. 3O, after the electronic device 100 starts playing the video 381, a pause play control 386 and a play progress bar 385 may be displayed on the video browsing interface 380. The pause play control 386 can be used to pause the video 381. When the recorded video 381 has played to the 3rd second, the electronic device 100 may display a video frame 391 on the video browsing interface 380.

As shown in fig. 3P, when the recorded video 381 has played to the 8th second, the electronic device 100 may display a video frame 392 on the video browsing interface 380. Since the crop box reached its preset minimum size at the 8th second of recording, the magnification of the image content in the video frame 392 is greater than that in the video frame 391 shown in fig. 3O.

As shown in fig. 3Q, when the recorded video 381 has played to the 12th second, the electronic device 100 may display a video frame 393 on the video browsing interface 380. The magnification of the image content in the video frame 393 is smaller than that in the video frame 392 shown in fig. 3P.
In a possible implementation manner, after the electronic device 100 selects the mirror-moving template, it may receive an input from the user at any time that triggers display of the mirror-moving template selection interface. In this way, the user can conveniently switch to another mirror-moving template.
Illustratively, as shown in fig. 4A, the electronic device 100 may display the video recording interface 360. For the text description of the video recording interface 360, reference may be made to the embodiment shown in fig. 3I, and details are not repeated here.
The electronic device 100 may receive an input operation (e.g., a single click) from the user on the mirror-moving template switching control 362, and in response to the input operation, the electronic device 100 may display the mirror-moving template selection interface 340 as shown in fig. 4B. For the text description of the mirror-moving template selection interface 340, reference may be made to the embodiment shown in fig. 3D, and details are not repeated here.
As shown in fig. 4B, the "travel" mirror-moving option 341A has been selected by the selection box 342. The electronic device 100 may receive an input operation (e.g., a single click) by the user acting on the "dynamic" mirror-moving option 341C, and in response to the input operation, the electronic device 100 may select the "dynamic" mirror-moving option 341C.
As shown in fig. 4C, the selection box 342 has selected the "dynamic" mirror-moving option 341C. The electronic device 100 may receive an input operation (e.g., a single click) from the user on the mirror-moving shooting control 345, and in response to the input operation, the electronic device 100 may display the shooting interface 410 as shown in fig. 4D and crop the video frames captured by the camera during video shooting using the crop box information (including the crop box size change rule and/or the motion trajectory of the crop box in the original image) corresponding to the selected "dynamic" mirror-moving option 341C.
As shown in fig. 4D, the shooting interface 410 may include a preview image 441 captured by the camera, a playback control 346, a video capture control 361, a mirror-moving template switching control 362, an intelligent mirror-moving prompt box 343, and a mirror-moving template preview window 431. A cropping-effect video corresponding to the selected mirror-moving template (for example, the template corresponding to the "dynamic" mirror-moving option 341C) is displayed in the mirror-moving template preview window 431. The mirror-moving template preview window 431 includes a close control 432, which can be used to trigger the electronic device 100 to close the mirror-moving template preview window 431. The mirror-moving template switching control 362 may display the name (e.g., "dynamic") of the mirror-moving template selected by the electronic device 100, and can be used to trigger the electronic device 100 to switch the mirror-moving template.
In one possible implementation, the electronic device 100 may receive an input from the user to close or open the mirror-moving template preview window on the shooting interface. In this way, the electronic device 100 can open or close the mirror-moving template preview window at any time according to the user's needs, so that the window does not always block the shot picture.
For example, as shown in fig. 4D, the electronic device 100 may receive an input operation (e.g., a single click) from the user on the close control 432, and in response to the input operation, the electronic device 100 may close the mirror-moving template preview window 431 and display a picture-in-picture open control 433 as shown in fig. 4E.
As shown in fig. 4E, the picture-in-picture open control 433 may be displayed at the top of the right side of the display screen, or may be displayed in another position, which is not limited here. The electronic device 100 may receive an input operation (e.g., a single click) from the user on the picture-in-picture open control 433, and in response to the input operation, the electronic device 100 may redisplay the mirror-moving template preview window 431 as shown in fig. 4F.
In one possible implementation, the electronic device 100 may display a viewfinder display window on the shooting interface after determining the mirror-moving template selected by the user. The viewfinder display window contains the original image captured by the camera and the crop box. In this way, the electronic device 100 can present the dynamic change of the crop box to the user, making it easy for the user to perceive the position of the crop box in the original image.
Illustratively, as shown in fig. 5A, the electronic device 100 may display the mirror-moving template selection interface 340. For the text description of the mirror-moving template selection interface 340, reference may be made to the embodiment shown in fig. 3D, and details are not repeated here. The electronic device 100 may receive an input operation (e.g., a single click) by the user on the mirror-moving shooting control 345, and in response to the input operation, the electronic device 100 may display the shooting interface 510 as shown in fig. 5B.
As shown in fig. 5B, the shooting interface 510 includes an original image 511 captured by the camera, a playback control 346, a video capture control 361, a mirror-moving template switching control 362, an intelligent mirror-moving prompt box 343, and a viewfinder display window 521. An original image 511a (with the same image content as the original image 511, only at a different display scale) and a close control 522 are displayed in the viewfinder display window 521. The close control 522 can be used to trigger the electronic device 100 to close the viewfinder display window 521. The viewfinder display window 521 may be located at the upper right corner or the upper left corner of the display screen, which is not limited here.
In one possible implementation, the electronic device 100 may receive an input operation (e.g., press and drag) applied by the user to the viewfinder display window 521 and, in response, move the display position of the viewfinder display window 521.
The electronic device 100 may receive an input operation (e.g., a single click) from the user on the video capture control 361, and in response to the input operation, the electronic device 100 may start cropping each frame of image in the video stream captured by the camera according to the crop box size change rule corresponding to the selected mirror-moving template and/or the motion trajectory of the crop box in the original image. The electronic device 100 may adjust each cropped frame to a specified size, display the cropped frames sequentially in the recording interface 360 in frame order, and display the position of the crop box on the original image in the viewfinder display window 521. The electronic device 100 may save the cropped video stream.
As shown in fig. 5C, when the electronic device 100 starts recording, the electronic device 100 may display a recording time box 366 and a recording end control 365 on the recording interface 360. The electronic device 100 may display the original image captured by the camera in the viewfinder display window 521, together with the position of the crop box 523 in that original image. The electronic device 100 may intercept the image within the crop box 523 on the original image and adjust it to a specified size to obtain a cropped image, which is displayed as the recording picture on the recording interface 360. For example, at the 3rd second after the electronic device 100 receives the user's input operation on the video capture control 361, the electronic device 100 may acquire the original image 512a through the camera and determine the position of the crop box 523 in the original image 512a. The electronic device 100 may intercept the image within the crop box 523 on the original image 512a and adjust the intercepted image to the specified size, obtaining a cropped image 512. The electronic device 100 may display the original image 512a in the viewfinder display window 521, with the position of the crop box 523 marked on it, and display the cropped image 512 on the recording interface 360. The image content in the cropped image 512 is magnified more than the image content in the original image 511 shown in fig. 5B.
In a possible implementation manner, the electronic device 100 may have a plurality of cameras. The electronic device 100 may display the image captured by one camera (e.g., a wide-angle camera) in the viewfinder display window 521, and display the zoom-adjusted image captured by another camera (e.g., a depth-of-field camera) in the shooting interface 510 as the preview or recording picture.
In a possible implementation manner, the electronic device 100 may adjust an original image captured by the camera to a specified magnification and display the adjusted original image in the viewfinder display window 521. The electronic device 100 may crop the original image adjusted to the specified magnification through the crop box to obtain a cropped image, and display the cropped image as the preview picture or the recording picture in the shooting interface 510.
When the electronic device 100 records a video, the electronic device 100 acquires an image stream through the camera. The size of each frame of original image in the image stream is fixed. The electronic device 100 may determine the position of the crop box in each frame of original image according to the crop box size change rule corresponding to the selected mirror-moving template and/or the motion trajectory of the crop box in the original image. Then, the electronic device 100 may crop each frame of original image through the crop box to obtain the cropped image stream within the crop box. The electronic device 100 may adjust the cropped image stream to a specified size and display it on the recording interface 360 in frame order.
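Under the same hypothetical MirrorMovingTemplate representation sketched earlier, the per-frame loop described here might look as follows in Kotlin on Android. Bitmap.createBitmap and Bitmap.createScaledBitmap are standard Android APIs; everything else is an illustrative assumption, not the claimed implementation (which may instead run on the ISP):

```kotlin
import android.graphics.Bitmap

// Crop one frame with the template's crop box at time tMs,
// then resize the result to the specified output size.
fun processFrame(frame: Bitmap, template: MirrorMovingTemplate, tMs: Long,
                 outW: Int, outH: Int): Bitmap {
    val box = template.cropBoxAt(tMs, frame.width, frame.height)
    // Clamp the crop rectangle to the frame bounds.
    val left = (box.centerX - box.width / 2f).toInt().coerceIn(0, frame.width - 1)
    val top = (box.centerY - box.height / 2f).toInt().coerceIn(0, frame.height - 1)
    val w = box.width.toInt().coerceIn(1, frame.width - left)
    val h = box.height.toInt().coerceIn(1, frame.height - top)
    val cropped = Bitmap.createBitmap(frame, left, top, w, h)   // image within the crop box
    return Bitmap.createScaledBitmap(cropped, outW, outH, true) // adjust to the specified size
}
```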
For example, in the first 8 s after the electronic device 100 receives the user's input operation on the video capture control 361, the size of the crop box may be gradually scaled down from the size of the original image to the preset minimum size; from the 8th to the 16th second after the input operation, the size of the crop box may be gradually scaled up back to the size of the original image.
As shown in fig. 5D, at the 8th second after the electronic device 100 receives the user's input operation on the video capture control 361, the crop box 523 shrinks to the preset minimum size. The electronic device 100 may acquire the original image 513a through the camera and determine the position of the crop box 523 in the original image 513a. The electronic device 100 may intercept the image within the crop box 523 on the original image 513a and adjust the intercepted image to the specified size, obtaining a cropped image 513. The electronic device 100 may display the original image 513a in the viewfinder display window 521, with the position of the crop box 523 marked on it, and display the cropped image 513 on the recording interface 360. The magnification of the image content in the cropped image 513 is greater than that in the cropped image 512 shown in fig. 5C.
As shown in fig. 5E, at the 12th second after the electronic device 100 receives the user's input operation on the video capture control 361, the size of the crop box 523 is larger than the preset minimum size and smaller than the size of the original image. The electronic device 100 may capture the 12th-second frame through the camera and determine the position of the crop box 523 in the original image 514a. The electronic device 100 may intercept the image within the crop box 523 on the original image 514a and adjust the intercepted image to the specified size, obtaining a cropped image 514. The electronic device 100 may display the original image 514a in the viewfinder display window 521, with the position of the crop box 523 marked on it, and display the cropped image 514 on the recording interface 360. The magnification of the image content in the cropped image 514 is smaller than that in the cropped image 513 shown in fig. 5D.
The electronic device 100 may receive an input operation (e.g., a single click) from the user on the recording end control 365, and in response to the input operation, the electronic device 100 may end the recording and save the video obtained by cropping the video stream with the crop box information corresponding to the mirror-moving template.
In some application scenarios, the electronic device 100 may be set to the mirror-moving mode for taking pictures. After the electronic device 100 switches to the mirror-moving mode, the electronic device 100 may receive the mirror-moving template selected by the user. Different mirror-moving templates may correspond to different crop box information (including the size change rule of the crop box and the motion trajectory of the crop box on the original image captured by the camera). The electronic device 100 may crop each frame of image captured by the camera using the crop box information corresponding to the mirror-moving template, and display the cropped image as the preview picture in the shooting interface. When the electronic device 100 receives an input (e.g., a single click) of the user acting on the shooting control, the electronic device 100 may save the preview picture displayed at that moment as a picture. In this way, the electronic device 100 can implement automatic zooming through the user-selected mirror-moving template, making it easy for the user to frame a subject or a scene background.
For example, as shown in fig. 6A, the electronic device 100 may receive an input operation (e.g., a single click) of the user selecting the mirror-moving mode control 327H, and in response to the input operation, the electronic device 100 may switch from the "normal shooting mode" to the "smart mirror-moving mode". In the smart mirror-moving mode, the electronic device 100 may crop each frame of image captured by the camera according to the crop box size change rule corresponding to the mirror-moving template selected by the user and/or the motion trajectory of the crop box in the original image, and display the cropped image as the preview picture on the shooting interface.
As shown in fig. 6B, the electronic device 100 may display a mirror-moving template selection interface 610 in response to an input operation (e.g., a single click) by the user selecting the mirror-moving mode control 327H. The mirror-moving template selection interface 610 may include one or more mirror-moving template options (e.g., a "travel" mirror-moving option 611A, a "comfortable" mirror-moving option 611B, a "dynamic" mirror-moving option 611C, etc.), a selection box 612, a confirmation control 613, a mirror-moving template display area 614, and a mirror-moving trajectory display area 616. A mirror-moving template play control 615 may be displayed on the mirror-moving template display area 614. The mirror-moving trajectory display area 616 may display the center point 617, in the original image, of the crop box corresponding to the "travel" mirror-moving option 611A, as well as its size change. For example, the center point of the crop box corresponding to the "travel" mirror-moving option 611A may be located at the same position as the center point of the original image, and the size of the crop box may shrink proportionally from the size of the original image to a preset minimum size and then grow proportionally from the preset minimum size back to the size of the original image. Optionally, the mirror-moving template selection interface 610 may further display one or more shooting mode controls (e.g., the "night mode" control 327A, the "portrait mode" control 327B, the "large aperture" control 327C, the mirror-moving mode control 327H, the "normal shooting mode" control 327D, the "recording mode" control 327E, the "professional mode" control 327F, etc.). For the text description of the one or more shooting mode controls, reference may be made to the embodiment shown in fig. 3B, and details are not repeated here.
The electronic device 100 may receive an input operation (e.g., a single click) from the user on the mirror-moving template play control 615, and in response to the input operation, the electronic device 100 may display a template picture, a crop box 618, and a center point 617 of the crop box 618 in the mirror-moving trajectory display area 616, as shown in fig. 6C. The electronic device 100 may crop the template picture based on the crop box 618 to obtain a cropped picture. The electronic device 100 may resize the cropped picture to the display size of the mirror-moving template display area 614 and display it in the mirror-moving template display area 614.
The electronic device 100 may receive an input operation (e.g., a single click) from the user on the confirmation control 613, and in response to the input operation, the electronic device 100 may display the shooting interface 620 as shown in fig. 6D and crop the original preview picture captured by the camera using the crop box size change rule and/or the motion trajectory of the crop box in the original image corresponding to the selected "travel" mirror-moving option 611A.
As shown in fig. 6D, the shooting interface 620 may include a mirror-moving template switching control 621, a viewfinder display window 622, a cropped image 631, one or more shooting mode controls (e.g., the "night mode" control 327A, the "portrait mode" control 327B, the "large aperture" control 327C, the mirror-moving mode control 327H, the "normal shooting mode" control 327D, the "recording mode" control 327E, the "professional mode" control 327F, etc.), a playback control 321, a shooting control 322, and a camera switching control 323. For the text description of the one or more shooting mode controls, the playback control 321, the shooting control 322, and the camera switching control 323, reference may be made to the embodiment shown in fig. 3B, and details are not repeated here. The mirror-moving template switching control 621 may be configured to trigger the electronic device 100 to display the mirror-moving template selection interface 610 and switch the selected mirror-moving template based on the user's input. An original image 631a may be displayed in the viewfinder display window 622. A close control 623 may be used to trigger the electronic device 100 to close the viewfinder display window 622. As shown in fig. 6D, the size of the current crop box is the same as the size of the original image 631a; therefore, the image content in the cropped image 631 is the same as the image content in the original image 631a.
For example, during the first 8 s after the electronic device 100 receives the input operation of the user on the determination control 613 illustrated in fig. 6C, the size of the crop box may be gradually scaled down from the size of the original image to the preset minimum size; during the 8th to 16th seconds after the input operation, the size of the crop box may be gradually scaled up back to the size of the original image.
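This size schedule is, in effect, a linear interpolation over time. The following Python sketch is purely illustrative; the image size, minimum size, and 16-second period are assumptions taken from this example rather than values fixed by the method:

    # Illustrative sketch of the 16 s shrink-then-grow size schedule.
    # FULL is the assumed original image size, MIN the assumed preset
    # minimum size (same aspect ratio); PERIOD is taken from this example.
    FULL_W, FULL_H = 1920, 1080
    MIN_W, MIN_H = 480, 270
    PERIOD = 16.0  # seconds for one shrink-and-grow cycle

    def crop_size(t):
        """Return (w, h) of the crop box t seconds after the cycle starts."""
        t = t % PERIOD                      # the schedule repeats in a loop
        if t < PERIOD / 2:                  # first 8 s: shrink to the minimum
            s = t / (PERIOD / 2)
        else:                               # next 8 s: grow back to full size
            s = (PERIOD - t) / (PERIOD / 2)
        w = FULL_W + (MIN_W - FULL_W) * s   # linear interpolation keeps
        h = FULL_H + (MIN_H - FULL_H) * s   # w/h equal to FULL_W/FULL_H
        return int(w), int(h)

At t = 3 s this yields a box between the full and minimum sizes, and at t = 8 s the minimum size, matching the states shown in figs. 6E and 6F.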
As shown in fig. 6E, at the 3rd second after the electronic device 100 receives the input operation of the user on the determination control 613, the electronic device 100 may acquire an original image 632a through the camera and determine the position of the crop box 624 in the original image 632a. The electronic device 100 may cut out the image within the crop box 624 on the original image 632a and adjust the cut-out image to a specified size, resulting in a cropped image 632. The electronic device 100 may display the original image 632a in the viewfinder presentation window 622, together with the position of the crop box 624 on the original image 632a. The electronic device 100 may display the cropped image 632 on the shooting interface 620. The magnification of the image content in the cropped image 632 is greater than the magnification of the image content in the cropped image 631 shown in fig. 6D.
As shown in fig. 6F, at the 8th second after the electronic device 100 receives the input operation of the user on the determination control 613, the electronic device 100 may acquire an original image 633a through the camera and determine the position of the crop box 624 in the original image 633a. The electronic device 100 may cut out the image within the crop box 624 on the original image 633a and adjust the cut-out image to a specified size, resulting in a cropped image 633. The electronic device 100 may display the original image 633a in the viewfinder presentation window 622, together with the position of the crop box 624 on the original image 633a. The electronic device 100 may display the cropped image 633 on the shooting interface 620. The magnification of the image content in the cropped image 633 is greater than the magnification of the image content in the cropped image 632 shown in fig. 6E.
The electronic device 100 may receive an input operation (e.g., a single click) by the user on the shooting control 322, and in response, the electronic device 100 may save the cropped image 633 displayed on the shooting interface 620 as a picture and display a thumbnail of the cropped image 633 on the playback control 321.
In a possible implementation, while displaying the shooting interface 620, the electronic device 100 may display snapshot prompt information on the shooting interface 620, where the snapshot prompt information may be used to prompt the user to trigger saving of the preview picture displayed on the shooting interface 620 as a picture.
In one possible implementation, when the electronic device 100 receives an input operation (e.g., a click) of the user on the shooting control 322 on the shooting interface 620, in response to the input operation, the electronic device 100 may save a plurality of frames of cropped images from before and after receiving the input operation as video frames of a video file.
In one possible implementation, the electronic device 100 may determine, based on the mirror movement template selected by the user, one frame of cropped image that satisfies a certain condition from the plurality of frames of cropped images produced by the electronic device 100, and save that frame as a picture.
In one possible implementation, after the electronic device 100 receives an input (e.g., a single click) from the user acting on the determination control 613 illustrated in fig. 6C, the electronic device 100 may automatically save a plurality of frames of cropped images within a preset time as video frames of a video file.
As shown in fig. 6G, at the 12th second after the electronic device 100 receives the input operation of the user on the determination control 613, the electronic device 100 may acquire an original image 634a through the camera and determine the position of the crop box in the original image 634a. The electronic device 100 may cut out the image within the crop box 624 on the original image 634a and adjust the cut-out image to a specified size, resulting in a cropped image 634. The electronic device 100 may display the original image 634a in the viewfinder presentation window 622, together with the position of the crop box 624 on the original image 634a. The electronic device 100 may display the cropped image 634 on the shooting interface 620. The magnification of the image content in the cropped image 634 is smaller than the magnification of the image content in the cropped image 633 shown in fig. 6F. A thumbnail of the cropped image 633 can be displayed on the playback control 321.
The electronic device 100 may receive an input operation (e.g., a single click) from the user on the playback control 321, and in response to the input operation, the electronic device 100 may display a picture browsing interface 640 as shown in fig. 6H.
As shown in fig. 6H, the picture browsing interface 640 may include a picture 641 and a menu 642. The picture 641 is the same as the cropped image 633 shown in fig. 6F. For the text description of the menu 642, reference may be made to the embodiment shown in fig. 3N, which is not repeated herein.
In one possible implementation, the electronic device 100 may receive an input operation (e.g., a long press) by the user on the shooting control 322, and in response, the electronic device 100 may save the cropped image 633 displayed on the shooting interface 620 together with the content of the viewfinder presentation window 622 as a single picture.
The mirror movement templates referred to in the embodiments of the present application are described below.
In some embodiments, the crop box may be rectangular, the center of the crop box may be co-located with the center of the original image, and the size of the crop box is proportional to the size of the original image. The size of the crop box may be gradually scaled down from the size of the original image to a preset minimum size and then gradually scaled up from the preset minimum size to the size of the original image over time. The crop box may repeat the above-described variation process in a loop.
In one possible implementation, the size of the crop box may be gradually scaled up from a preset minimum size to the size of the original image and then gradually scaled down from the size of the original image to the preset minimum size over time. The crop box may repeat the above-described variation process in a loop.
Illustratively, as shown in fig. 7A, the original image 701 is rectangular. The center of the original image 701 may be co-located with the center of the crop box 702, i.e., both at location 704. The center of the preset minimum size area 703 is also at location 704. The size of the crop box 702 may vary proportionally between the size of the original image 701 and the preset minimum size. The crop box 702 may have a width x and a height y. The original image 701 may have a width A and a height B. The preset minimum size area 703 may have a width a and a height b. The width x and the height y of the crop box 702 satisfy the relationship x/y = A/B, with a ≤ x ≤ A (and correspondingly b ≤ y ≤ B). That is, the width x of the crop box 702 may be reduced from the width value A to the width value a, while the height y is reduced from the height value B to the height value b.
As shown in fig. 7B, the crop box 702 has been reduced in size to the preset minimum size (width a, height b).
As shown in fig. 7C, after the size of the crop box 702 is reduced to the preset minimum size (width a, height b), the size of the crop box 702 may gradually increase back to the size of the original image 701 (width A, height B).
The electronic device 100 may acquire a plurality of frames of original images through the camera, and may determine the position of the crop box in each frame of original image according to the crop box information (including the size change rule of the crop box and/or the motion track of the crop box in the original image). The electronic device 100 may cut out the image within the crop box in each frame of original image according to the position of the crop box in that frame, and adjust it to a specified size, thereby obtaining a plurality of frames of cropped images. The electronic device 100 may display the plurality of frames of cropped images in the shooting interface or the video recording interface in the above embodiments in frame order. The electronic device 100 may display each cropped image on the shooting interface or the video recording interface as soon as it is cropped, or may first crop the plurality of frames and then display them in frame order.
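The per-frame crop-and-resize loop described here can be sketched in a few lines, assuming OpenCV for the resize; camera_frames and crop_rect_for_frame are hypothetical placeholders for the camera stream and the crop-box schedule:

    # Minimal per-frame crop-and-resize loop, assuming OpenCV (cv2).
    # camera_frames is any iterable of frames (numpy arrays);
    # crop_rect_for_frame(i) stands in for the crop-box schedule above.
    import cv2

    OUT_W, OUT_H = 1920, 1080  # assumed "specified size" of the preview

    def render_preview(camera_frames, crop_rect_for_frame):
        for i, frame in enumerate(camera_frames):
            x, y, w, h = crop_rect_for_frame(i)        # box in this frame
            cropped = frame[y:y + h, x:x + w]          # cut out the box
            yield cv2.resize(cropped, (OUT_W, OUT_H))  # display in frame order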
For example, as shown in fig. 7D, when the electronic device 100 acquires an original image 721 through the camera, the electronic device 100 may determine, according to the crop box information, that the center point of the crop box 712 on the original image 721 is located at the same position as the center point of the original image 721, i.e., both at the position 714. At this time, the size of the crop box 712 has been scaled down from the size of the original image 721 to a size between the size of the original image 721 and the preset minimum size. The electronic device 100 may cut out the image within the crop box 712 on the original image 721 and adjust it to a specified size to obtain a cropped image 722. The electronic device 100 may display the cropped image 722 on the shooting interface or the video recording interface.
As shown in fig. 7E, after acquiring the original image 721, the electronic device 100 acquires an original image 731 through the camera. The electronic device 100 can determine, according to the crop box information, that the center point of the crop box 712 on the original image 731 is at the same position as the center point of the original image 731, i.e., both at the position 714. The size of the crop box 712 has now been scaled down to the preset minimum size, i.e., the crop box 712 coincides with the preset minimum size area 713. The electronic device 100 may cut out the image within the crop box 712 on the original image 731 and adjust it to a specified size to obtain a cropped image 732. The electronic device 100 may display the cropped image 732 on the shooting interface or the video recording interface. The magnification of the subject in the cropped image 732 is larger than the magnification of the subject in the cropped image 722 shown in fig. 7D.
As shown in fig. 7F, after acquiring the original image 731, the electronic device 100 acquires an original image 741 through the camera. The electronic device 100 may determine, according to the crop box information, that the center point of the crop box 712 on the original image 741 is located at the same position as the center point of the original image 741, i.e., both at the position 714. At this time, the size of the crop box 712 has been scaled up from the preset minimum size to a size between the size of the original image 741 and the preset minimum size. The electronic device 100 can cut out the image within the crop box 712 on the original image 741 and adjust it to the specified size to obtain a cropped image 742. The electronic device 100 may display the cropped image 742 on the shooting interface or the video recording interface. The magnification of the subject in the cropped image 742 is smaller than the magnification of the subject in the cropped image 732 shown in fig. 7E.
In some embodiments, the center of the crop box may not be co-located with the center of the original image. The size of the crop box varies in proportion to the size of the original image. Over time, the size of the crop box may be gradually scaled down from a maximum size to a preset minimum size and then gradually scaled up from the preset minimum size back to the maximum size, and the crop box may repeat this variation in a loop. The electronic device 100 may determine the maximum size of the crop box according to the center point of the crop box and the size of the original image. The maximum size keeps the same proportions as the original image; when the crop box is at the maximum size, its coverage does not exceed the original image, and at least one border of the crop box contacts a border of the original image.
In one possible implementation, the size of the crop box may be gradually scaled up from a preset minimum size to a maximum size and then gradually scaled down from the maximum size to the preset minimum size over time. The crop box may repeat the above-described variation process in a loop.
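Under these constraints, the maximum size for a given center point can be computed directly. The following Python sketch is illustrative only; the function name and the convention that the crop box keeps the image's aspect ratio while one border touches an image border follow the description above:

    # Hypothetical computation of the maximum crop-box size for a given
    # center point (cx, cy): the box keeps the image's aspect ratio, stays
    # inside the image, and one of its borders touches an image border.
    def max_crop_size(cx, cy, img_w, img_h):
        half_w = min(cx, img_w - cx)   # widest half-width that still fits
        half_h = min(cy, img_h - cy)   # tallest half-height that still fits
        aspect = img_w / img_h         # crop box keeps this width/height ratio
        if half_w / half_h <= aspect:
            c = 2 * half_w             # width is the binding constraint
            d = c / aspect
        else:
            d = 2 * half_h             # height is the binding constraint
            c = d * aspect
        return c, d

For a center point like 805 in fig. 8A, the lower border binds, so the returned height d is twice the distance from the center point to the lower image border.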
Illustratively, as shown in fig. 8A, the original image 801 is rectangular. The center of the original image 801 may be at a different location than the center of the crop box 802: the center of the original image 801 may be at location 806, while the center of the crop box 802, the center of the preset minimum size area 803, and the center of the maximum size area 804 may all be at location 805. Any border (e.g., the lower border) of the maximum size area 804 contacts the border of the original image. The crop box 802 may have a width x and a height y. The original image 801 may have a width A and a height B. The preset minimum size area 803 may have a width a and a height b. The maximum size area 804 may have a width c and a height d. The width x and the height y of the crop box 802 satisfy the relationship x/y = c/d = A/B, with a ≤ x ≤ c (and correspondingly b ≤ y ≤ d). That is, the width x of the crop box 802 may be reduced from the width value c to the width value a, and the height y from the height value d to the height value b.
As shown in fig. 8B, the crop box 802 has been reduced in size to the preset minimum size (width a, height b).
As shown in fig. 8C, after the size of the crop box 802 is reduced to the preset minimum size (width a, height b), the size of the crop box 802 may gradually increase back to the maximum size (width c, height d).
The electronic device 100 may acquire a plurality of frames of original images through the camera, and may determine the position of the crop box in each frame of original image according to the crop box information (including the size change rule of the crop box and/or the motion track of the crop box in the original image). The electronic device 100 may cut out the image within the crop box in each frame of original image according to the position of the crop box in that frame, and adjust it to a specified size, obtaining a plurality of frames of cropped images. The electronic device 100 may display the plurality of frames of cropped images on the shooting interface or the video recording interface in the above embodiments in frame order, either frame by frame as each is cropped or after all of them have been cropped.
For example, as shown in fig. 8D, the electronic device 100 acquires an original image 821 through the camera, and may determine, according to the crop box information, that the center point of the crop box 812 on the original image 821 is at the position 815. At this time, the size of the crop box 812 has been scaled down from the maximum size to a size between the maximum size and the preset minimum size. The electronic device 100 may cut out the image within the crop box 812 on the original image 821 and adjust it to a specified size to obtain a cropped image 822. The electronic device 100 may display the cropped image 822 on the shooting interface or the video recording interface.
As shown in fig. 8E, after acquiring the original image 821, the electronic device 100 acquires an original image 831 through the camera. The electronic device 100 may determine, according to the crop box information, that the center point of the crop box 812 on the original image 831 is at the position 815. The size of the crop box 812 has now been scaled down from the maximum size to the preset minimum size, i.e., the crop box 812 coincides with the preset minimum size area 813. The electronic device 100 may cut out the image within the crop box 812 on the original image 831 and adjust it to a specified size to obtain a cropped image 832. The electronic device 100 may display the cropped image 832 on the shooting interface or the video recording interface. The magnification of the subject in the cropped image 832 is larger than the magnification of the subject in the cropped image 822 shown in fig. 8D.
As shown in fig. 8F, after acquiring the original image 831, the electronic device 100 acquires an original image 841 through the camera. The electronic device 100 may determine, according to the crop box information, that the center point of the crop box 812 on the original image 841 is at the position 815. At this time, the size of the crop box 812 has been scaled up from the preset minimum size to a size between the maximum size and the preset minimum size. The electronic device 100 may cut out the image within the crop box 812 on the original image 841 and adjust it to the specified size to obtain a cropped image 842. The electronic device 100 may display the cropped image 842 on the shooting interface or the video recording interface. The magnification of the subject in the cropped image 842 is smaller than the magnification of the subject in the cropped image 832 shown in fig. 8E.
In one possible implementation, the center of the crop box may not be at the same position as the center of the original image, and the size of the crop box varies in proportion to the size of the original image. First, the center point of the crop box may be fixed, and the size of the crop box may gradually increase over time from a preset minimum size until a first edge of the crop box touches the boundary of the original image. Then, with the first edge kept in contact with the boundary of the original image, the center point of the crop box may move and the size of the crop box may continue to increase until a second edge of the crop box also touches the boundary of the original image. Finally, with both the first edge and the second edge kept in contact with the boundary of the original image, the center point of the crop box may move and the size of the crop box may continue to increase up to the size of the original image. The crop box may repeat this variation process.
Illustratively, as shown in fig. 8G, at time t1 the center point of the crop box 803 is at the position point 805 on the original image 801, while the center point of the original image 801 is at the position point 806; the two are not at the same position. The size of the crop box 803 is the preset minimum size. At time t2, the center point of the crop box 803 is still at the position point 805, and the size of the crop box 803 has increased proportionally relative to time t1. At time t3, the center point of the crop box 803 is still at the position point 805, and the size of the crop box 803 has increased proportionally until the lower border of the crop box 803 contacts the lower boundary of the original image 801. In the period from t3 to t4, the lower border of the crop box 803 remains in contact with the lower boundary of the original image 801, the center point of the crop box 803 moves, and the size of the crop box 803 continues to increase proportionally. At time t4, the size of the crop box 803 has increased until the lower border of the crop box 803 contacts the lower boundary of the original image 801 and the left border of the crop box 803 contacts the left boundary of the original image 801. In the period from t4 to t5, the lower and left borders of the crop box 803 remain in contact with the lower and left boundaries of the original image 801, and the size of the crop box 803 continues to increase proportionally. At time t5, the size of the crop box 803 has increased to the size of the original image.
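A minimal sketch of this three-phase behavior, under the assumption that clamping a proportionally growing box into the image bounds reproduces the fixed-center, edge-sliding, and full-size phases described for fig. 8G (all names and the step count are illustrative):

    # Sketch of the three-phase growth of fig. 8G: the box grows around a
    # fixed center until a border hits the image boundary, then the center
    # slides so the box stays inside while growth continues, until the box
    # reaches full size. Clamping the growing box into the image reproduces
    # all three phases with one rule.
    def clamp_box(cx, cy, w, h, img_w, img_h):
        x = min(max(cx - w / 2, 0), img_w - w)  # shift the box back inside
        y = min(max(cy - h / 2, 0), img_h - h)
        return x, y, w, h                       # top-left corner plus size

    def grow_box(cx, cy, img_w, img_h, steps=5):
        aspect = img_w / img_h
        for i in range(1, steps + 1):           # roughly t1 .. t5
            w = img_w * i / steps               # proportional growth
            h = w / aspect
            yield clamp_box(cx, cy, w, h, img_w, img_h)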
In some embodiments, the size of the crop box may be a fixed size, and the center point of the crop box may move on the original image according to a preset trajectory. The original image may be a rectangle, the crop box may also be a rectangle, and the ratio of the width to the height of the crop box is equal to the ratio of the width to the height of the original image.
Illustratively, as shown in FIG. 9A, the center point 913 of the crop box 912 may move left or right on the original image 911 in a straight line trajectory 921 between the location 922 and the location 923. When the center point 913 of the crop frame 912 is at the position 922, the left frame of the crop frame 912 may coincide with the left frame of the original image 911. When the center point 913 of the crop frame 912 is at the position 923, the right frame of the crop frame 912 may coincide with the right frame of the original image 911.
As shown in fig. 9B, the center point 913 of the crop box 912 may move up and down on the original image 911 along a straight-line trajectory 931 between a position 932 and a position 933. When the center point 913 of the crop box 912 is at the position 932, the top border of the crop box 912 may coincide with the top border of the original image 911. When the center point 913 of the crop box 912 is at the position 933, the lower border of the crop box 912 may coincide with the lower border of the original image 911.

As shown in fig. 9C, the center point 913 of the crop box 912 can move between a position 942 and a position 943 along a straight-line trajectory 941 from the upper left corner to the lower right corner of the original image 911. When the center point 913 of the crop box 912 is at the position 942, the top border of the crop box 912 may coincide with the top border of the original image 911 and the left border of the crop box 912 may coincide with the left border of the original image 911. When the center point 913 of the crop box 912 is at the position 943, the lower border of the crop box 912 may coincide with the lower border of the original image 911, and the right border of the crop box 912 may coincide with the right border of the original image 911.

As shown in fig. 9D, the center point 913 of the crop box 912 can move between a position 952 and a position 953 along a straight-line trajectory 951 from the lower left corner to the upper right corner of the original image 911. When the center point 913 of the crop box 912 is at the position 952, the lower border of the crop box 912 may coincide with the lower border of the original image 911 and the left border of the crop box 912 may coincide with the left border of the original image 911. When the center point 913 of the crop box 912 is at the position 953, the top border of the crop box 912 may coincide with the top border of the original image 911 and the right border of the crop box 912 may coincide with the right border of the original image 911.
As shown in fig. 9E, the motion trajectory 961 of the center point 913 of the crop box 912 may be a heart-shaped figure, and the crop box 912 may move clockwise or counterclockwise along the motion trajectory 961. When moving clockwise or counterclockwise along the motion trajectory 961, the crop box 912 never exceeds the boundary of the original image 911.

As shown in fig. 9F, the motion trajectory 971 of the center point 913 of the crop box 912 may be a circular figure, and the crop box 912 may move clockwise or counterclockwise along the motion trajectory 971. When moving clockwise or counterclockwise along the motion trajectory 971, the crop box 912 never exceeds the boundary of the original image 911.
It should be noted that the motion trajectory of the center point 913 of the crop box 912 in the original image 911 is not limited to the figures above; other figures are possible, such as a horizontal figure-eight curve, a Bezier curve, a circular arc curve, and the like.
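For a concrete instance, a circular trajectory like the one in fig. 9F can be parameterized directly. The sketch below is an illustration under stated assumptions; the radius choice simply guarantees the fixed-size box stays inside the image:

    # Hypothetical circular trajectory for a fixed-size crop box; the
    # radius is chosen so the box never leaves the original image.
    import math

    def circular_track(img_w, img_h, box_w, box_h, n_frames):
        cx0, cy0 = img_w / 2, img_h / 2                 # trajectory center
        r = min((img_w - box_w) / 2, (img_h - box_h) / 2)
        for i in range(n_frames):
            a = 2 * math.pi * i / n_frames              # one full pass
            cx = cx0 + r * math.cos(a)                  # crop-box center
            cy = cy0 + r * math.sin(a)
            yield cx - box_w / 2, cy - box_h / 2        # top-left corner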
In a possible implementation, when the electronic device 100 displays a viewfinder presentation window on the shooting interface or the video recording interface in a shooting or video recording mode, the electronic device 100 may display the motion trajectory of the crop box on the original image in the viewfinder presentation window. When the crop box has completed one pass along the motion trajectory, the electronic device 100 may display a special effect (e.g., a firework special effect, a kaleidoscope special effect, a heartbeat special effect, etc.) in the viewfinder presentation window.
Illustratively, as shown in fig. 9G, the electronic device 100 may display a shooting interface 980. The shooting interface 980 may include a mirror movement template switching control 981, a viewfinder presentation window 982, a cropped image 991, and one or more shooting mode controls. For the text description of the one or more shooting mode controls, reference may be made to the embodiment shown in fig. 3B, which is not described herein again. The name of the mirror movement template currently used by the electronic device 100 (for example, "circular mirror movement") is displayed on the mirror movement template switching control 981. The electronic device 100 may display, in the viewfinder presentation window 982, the original image 991a acquired at this time, a close control 983, a crop box 984, and the motion trajectory of the crop box 984. The electronic device 100 crops the original image 991a based on the position of the crop box 984 in the original image 991a to obtain the cropped image 991.
As shown in fig. 9H, the crop box 984 has completed one pass along the preset motion trajectory (e.g., a circular trajectory). The electronic device 100 may acquire an original image 992a through the camera and display the original image 992a in the viewfinder presentation window 982. The electronic device 100 may cut out the image within the crop box 984 on the original image 992a and adjust the cut-out image to a specified size, resulting in a cropped image 992. The electronic device 100 may display the cropped image 992 on the shooting interface 980 and display a special effect picture 986 (e.g., a firework special effect picture, etc.).
In one possible implementation, after the crop box has completed one pass along the preset motion trajectory (e.g., a circular trajectory), the electronic device 100 may display a special effect picture on the shooting interface and, while displaying the special effect picture, gradually increase the size of the crop box over time to the size of the original image, displaying the cropped image within the crop box on the shooting interface or the video recording interface.
When the electronic device 100 displays the viewfinder presentation window on the video recording interface in the mirror movement video recording mode, the electronic device 100 may receive a user input that ends the recording. In response, the electronic device 100 may save the cropped images displayed on the video recording interface during the recording time together with the special effect picture as the video frames of a video file. Alternatively, the electronic device 100 may save the cropped images displayed during the recording time together with the viewfinder presentation window as the video frames of the video file. Alternatively, the electronic device 100 may save the cropped images, the special effect picture, and the viewfinder presentation window displayed during the recording time together as the video frames of the video file.
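A rough sketch of how the cropped picture and the small viewfinder window might be composited into one video file, assuming OpenCV is available; the output size, picture-in-picture placement, frame rate, and codec are all assumptions of this example, not values specified by the method:

    # Sketch of saving the cropped preview together with the small
    # viewfinder window as one video, assuming OpenCV.
    import cv2

    def write_video(frames_with_windows, path="out.mp4", fps=30,
                    size=(1920, 1080)):
        out = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"mp4v"),
                              fps, size)
        for cropped, window in frames_with_windows:
            frame = cv2.resize(cropped, size)
            small = cv2.resize(window, (320, 180))  # viewfinder thumbnail
            frame[20:200, 20:340] = small           # paste into the corner
            out.write(frame)
        out.release()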
For example, as shown in fig. 9I, after saving the cropped images displayed on the video recording interface during the recording time, the special effect picture, and the viewfinder presentation window together as the video frames of a video 995, the electronic device 100 may display a video preview interface 993. The video preview interface 993 may include the video 995 shot in the circular mirror movement mode and a video play control 994a. A video frame of the video 995 includes a cropped image cut from an original image acquired by the camera and a viewfinder presentation window 994b. The viewfinder presentation window 994b includes a crop box 994c and the original image 995a acquired by the camera.
The electronic device 100 receives an input operation (e.g., a single click) from a user on the video play control 994a, and in response to the input operation, the electronic device 100 may play the video 995.
As shown in fig. 9J, when the video 995 is played to the 3 rd second, the electronic device 100 may display a video screen 996 on the video preview interface 993. The video screen 996 includes a viewfinder display window 994b, a position of the crop box 994c on the original image 996a acquired by the camera, and a crop image cut out from the original image 996a based on the crop box 994 c. The viewfinder display window 994b may also display the movement locus of the crop box 994 c.
As shown in fig. 9K, when the video 995 is played for the 12 th second, the electronic apparatus 100 may display a video screen 997 on the video preview interface 993. The video screen 997 includes a viewfinder display window 994b, a position of a crop box 994c on an original image 997a acquired by a camera, and a cut image and a special effect screen (e.g., a firework special effect screen) cut from the original image 997a based on the crop box 994 c.
In some embodiments, the size of the crop box changes in proportion to the size of the original image while, at the same time, the center point of the crop box moves on the original image along a preset trajectory. The size of the crop box may be gradually scaled down from the size of the original image to the preset minimum size and then gradually scaled up from the preset minimum size back to the size of the original image. The original image may be a rectangle, the crop box may also be a rectangle, and the ratio of the width to the height of the crop box is equal to the ratio of the width to the height of the original image.
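Combining the two mechanisms, one pass of such a template could be sketched as below; the image size, minimum size, and the 100-pixel circular path radius are assumptions for illustration:

    # Illustrative combination: the box size follows the shrink-then-grow
    # schedule while the center follows a circular path, clamped so the
    # box stays inside the image.
    import math

    def combined_box(i, n, img_w=1920, img_h=1080, min_w=480, min_h=270):
        phase = i / n                            # 0 -> 1 over one pass
        s = 1 - abs(2 * phase - 1)               # 0 -> 1 -> 0: shrink, grow
        w = img_w + (min_w - img_w) * s          # proportional size change
        h = img_h + (min_h - img_h) * s
        a = 2 * math.pi * phase                  # circular center path
        cx = img_w / 2 + 100 * math.cos(a)       # assumed 100 px radius
        cy = img_h / 2 + 100 * math.sin(a)
        x = min(max(cx - w / 2, 0), img_w - w)   # clamp box into the image
        y = min(max(cy - h / 2, 0), img_h - h)
        return int(x), int(y), int(w), int(h)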
In some application scenarios, when the original image acquired by the camera is displayed as a preview picture in the shooting interface, the electronic device 100 may receive a user input that sets a user-defined center point for the crop box in the original image. The electronic device 100 may determine the maximum size and the minimum size of the crop box based on the size of the original image and the center point of the crop box. The electronic device 100 may determine the position and size of the crop box in each frame of original image based on the maximum size and the minimum size of the crop box and the speed at which the size of the crop box changes proportionally between them. The electronic device 100 may cut out the image within the crop box on each frame of original image based on the position and size of the crop box in that frame, and display the cut-out image as the preview picture in the shooting interface. In this way, the user can conveniently customize the center point of the crop box, and the electronic device 100 can align the center of the captured picture with the subject, so that the subject is clearer.
Illustratively, as shown in fig. 10A, the electronic device 100 may display the mirror movement template selection interface 610. For the text description of the template selection interface 610, reference may be made to the embodiment shown in fig. 6B, which is not described herein again. Here, the "travel" mirror movement template option 611A has been selected in the template selection interface 610 by the selection box 612.
As shown in fig. 10B, the electronic device 100 may receive a user input (e.g., a single click) on a "custom" mirror movement template option 611D, in response to which the electronic device 100 may move the selection box 612 to select the "custom" mirror movement template option 611D. When the selection box 612 selects the "custom" mirror movement template option 611D, the electronic device 100 may stop displaying the mirror movement template presentation area 614 and the mirror movement trajectory presentation area 616, and display a prompt box 1011. The prompt box 1011 can be used to display a functional description of the "custom" mirror movement template, and the functional description can be text (for example, "you can customize the mirror movement center point", etc.), a picture, a video, and the like.
The electronic apparatus 100 may receive an input operation (e.g., a single click) by the user acting on the determination control 613, and in response to the input operation, the electronic apparatus 100 may display a photographing interface 1020 as shown in fig. 10C.
As shown in fig. 10C, the shooting interface 1020 may include a viewfinder presentation window 1021, an original image 1031, prompt information 1025, a determination control 1026, a mirror movement template switching control 1027, a position mark 1028, one or more shooting mode controls (e.g., the "night view mode" control 327A, the "portrait mode" control 327B, the "large aperture" control 327C, the "mirror movement" mode control 327H, the "normal shooting mode" control 327D, the "recording mode" control 327E, the "professional mode" control 327F, etc.), a playback control 321, a shooting control 322, and a camera conversion control 323. For the text description of the one or more shooting mode controls, the playback control 321, the shooting control 322, and the camera conversion control 323, reference may be made to the embodiment shown in fig. 3B, which is not described herein again. The mirror movement template switching control 1027 may be configured to trigger the electronic device 100 to switch the mirror movement template selected by the user, and the name of the selected template (for example, "custom mirror movement") is displayed on the control 1027. The viewfinder presentation window 1021 may display a close control 1022, an original image 1031a (the same image content as the original image 1031, at a different display scale), a crop box 1023, and the center point 1024 of the crop box 1023 on the original image 1031a. The close control 1022 can be used to trigger the electronic device 100 to close the viewfinder presentation window 1021. The prompt information 1025 can be used to prompt the user to adjust the position of the center point 1024 of the crop box 1023 by operating on the original image 1031. The position mark 1028 may be used to indicate the position on the original image 1031 corresponding to where the center point 1024 of the crop box 1023 is currently located.
As shown in fig. 10D, the electronic device 100 may receive an input (e.g., a drag) on the position mark 1028 from the user, and in response, the electronic device 100 may adjust the display position of the position mark 1028 on the original image 1031. After the position mark 1028 is moved on the original image 1031, the electronic device 100 may adjust the position of the center point 1024 of the crop box 1023 on the original image 1031a in the viewfinder presentation window 1021 accordingly.
As shown in fig. 10E, the electronic device 100 may receive an input (e.g., a single click) from the user on the determination control 1026, and in response, the electronic device 100 may cut out the image within the crop box on the original image and adjust the cut-out image to a specified size, resulting in a cropped image. The electronic device 100 may display the cropped image as a preview picture on the shooting interface. In response to the user's input on the determination control 1026, the electronic device 100 may also stop displaying the prompt information 1025 and the determination control 1026.
As shown in fig. 10F, the center point 1024 of the crop box 1023 may be fixed at a position on the original image, and the size of the crop box 1023 varies in proportion to the size of the original image. The size of the crop box 1023 may gradually scale up from the preset minimum size to the maximum size over time, and then gradually scale down from the maximum size back to the preset minimum size. For the process of changing the size of the crop box 1023 between the preset minimum size and the maximum size, reference may be made to the embodiments illustrated in fig. 8A to 8F, which are not described herein again. At this time, the size of the crop box 1023 is the preset minimum size. The electronic device 100 may display the original image 1032a acquired by the camera in the viewfinder presentation window 1021. The electronic device 100 may cut out the image within the crop box 1023 on the original image 1032a and adjust the cut-out image to a specified size, resulting in a cropped image 1032. The electronic device 100 may display the cropped image 1032 as a preview picture on the shooting interface 1020. The electronic device 100 can also display a control 1029 on the shooting interface 1020; the control 1029 can be used to trigger the electronic device 100 to reselect the center point of the crop box on the original image based on user input.

As shown in fig. 10G, the size of the crop box 1023 has now increased proportionally from the preset minimum size to a size between the maximum size and the preset minimum size. The electronic device 100 can display the original image 1033a acquired by the camera at this time in the viewfinder presentation window 1021. The electronic device 100 may cut out the image within the crop box 1023 on the original image 1033a and adjust the cut-out image to a specified size to obtain a cropped image 1033, which the electronic device 100 may display as a preview picture on the shooting interface 1020. The magnification of the image content in the cropped image 1033 is smaller than the magnification of the image content in the cropped image 1032 shown in fig. 10F.

As shown in fig. 10H, the size of the crop box 1023 has now increased proportionally from a size between the maximum size and the preset minimum size to the maximum size. The electronic device 100 may display the original image 1034a acquired by the camera at this time in the viewfinder presentation window 1021. The electronic device 100 may cut out the image within the crop box 1023 on the original image 1034a and adjust the cut-out image to a specified size to obtain a cropped image 1034, which the electronic device 100 may display as a preview picture on the shooting interface 1020. The magnification of the image content in the cropped image 1034 is smaller than the magnification of the image content in the cropped image 1033 shown in fig. 10G.
The electronic device 100 may receive an input (e.g., a single click) from the user on the control 1029, and in response, the electronic device 100 may display the original image 1035, the prompt information 1025, the position mark 1028, and the determination control 1026 on the shooting interface 1020, as shown in fig. 10I.
As shown in fig. 10I, the electronic device 100 may receive an input (e.g., a drag) on the position mark 1028 from the user, in response to which the electronic device 100 may adjust the display position of the position mark 1028 in the original image 1035. The electronic device 100 may determine the position of the center point 1024 of the crop box 1023 on the original image 1035a (the same image content as the original image 1035, at a different display scale) in the viewfinder presentation window 1021 based on the display position of the position mark 1028 on the original image 1035.
As shown in fig. 10J, electronic device 100 may receive an input (e.g., a single click) from a user with respect to the determination control 1026, and in response to the input, electronic device 100 may crop out an image within the crop box on the original image and resize the cropped image to a specified size, resulting in a cropped image. The electronic apparatus 100 may display the cut image as a preview screen on the photographing interface 1020.
As shown in fig. 10K, the position of the center point 1024 of the crop box 1023 in the original image is fixed, and the size of the crop box 1023 changes in proportion to the size of the original image. The electronic device 100 may display the original image 1036a captured by the camera in the viewfinder presentation window 1021, and may display a cropped image 1036 cut out by the crop box 1023 as a preview picture on the shooting interface 1020.
In some application scenarios, when the original image acquired by the camera is displayed as a preview picture in the shooting interface, the electronic device 100 may receive a user input that sets a user-defined center point for the crop box in the original image. The electronic device 100 may determine the maximum size and the preset minimum size of the crop box based on the size of the original image and the center point of the crop box. The electronic device 100 may receive user input and adjust the size of the crop box at any time during the preview or recording process. The electronic device 100 may cut out the image within the crop box on each frame of original image based on the position and size of the crop box in that frame, and display the cut-out image as the preview picture in the shooting interface. In this way, the user can conveniently customize both the center point and the size of the crop box; the electronic device 100 can align the center of the captured picture with the subject and enlarge or reduce the subject at any time, so that the subject is clearer.
Illustratively, as shown in FIG. 11A, the electronic device 100 may display a capture interface 1020. For the text description of the shooting interface 1020, reference may be made to the embodiment shown in fig. 10C, which is not described herein again.
After the electronic device 100 receives the user's input and adjusts the display position of the position mark 1028, the electronic device 100 may receive an input (e.g., a single click) from the user on the determination control 1026. In response, as shown in fig. 11B, the electronic device 100 may display the original image 1112a captured by the camera at this time in the viewfinder presentation window 1021. The electronic device 100 may cut out the image within the crop box 1023 on the original image 1112a and adjust the cut-out image to a specified size, resulting in a cropped image 1112. The electronic device 100 may display the cropped image 1112 as a preview picture on the shooting interface.
As shown in fig. 11B, the electronic device 100 may also display a magnification increase control 1121 and a magnification reduction control 1122 on the shooting interface 1020. The magnification increase control 1121 can be used to reduce the size of the crop box 1023 (a smaller crop box yields a larger magnification), and the magnification reduction control 1122 can be used to increase the size of the crop box 1023. The magnification increase control 1121 and the magnification reduction control 1122 may also be implemented as a slider bar: when the electronic device 100 receives a user sliding up on the slider bar, the electronic device 100 may reduce the size of the crop box 1023, and when the electronic device 100 receives a user sliding down on the slider bar, the electronic device 100 may increase the size of the crop box 1023.
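The mapping from these controls to the crop-box size can be sketched as follows; the 1.1 step factor and the function shape are assumptions, with the only fixed idea being that zooming in shrinks the box within the preset minimum and maximum bounds:

    # Hypothetical mapping from the magnification controls to the box size:
    # zooming in shrinks the crop box, zooming out grows it, bounded by the
    # preset minimum width and the maximum width for the chosen center.
    def apply_zoom(cur_w, zoom_in, min_w, max_w, aspect, step=1.1):
        """aspect is height/width; returns the new (w, h) of the crop box."""
        w = cur_w / step if zoom_in else cur_w * step  # smaller box = zoom in
        w = min(max(w, min_w), max_w)                  # clamp to allowed range
        return w, w * aspect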
The electronic device 100 may receive an input (e.g., a single click or a long press) from the user on the magnification reduction control 1122, and in response, the electronic device 100 may increase the size of the crop box 1023, as shown in fig. 11C. The electronic device 100 may display the original image 1113a acquired by the camera at this time in the viewfinder presentation window 1021. The electronic device 100 may cut out the image within the crop box 1023 on the original image 1113a and adjust it to a specified size to obtain a cropped image 1113.
As shown in fig. 11C, the electronic apparatus 100 may display the cut image 1113 as a preview screen on the photographing interface 1020. The magnification of the image content in the cropped image 1113 is smaller than the magnification of the image content in the cropped image 1112 shown in fig. 11B.
The electronic device 100 may continue to receive an operation (e.g., a single click or a long press) by the user on the magnification reduction control 1122, and in response, the electronic device 100 may increase the size of the crop box 1023 to the maximum size. For the description of the maximum size, reference may be made to the embodiments shown in fig. 8A to fig. 8F, which are not repeated herein.
As shown in fig. 11D, the electronic apparatus 100 adjusts the crop box 1023 to the maximum size based on the user input operation for the magnification reduction control 1122. The electronic device 100 may display the original image 1114a acquired by the camera at this time in the viewfinder presentation window 1021. The electronic device 100 may cut out the image in the cropping frame 1023 on the original image 1114a, and adjust the image to a specified size to obtain the cropped image 1114. The electronic device 100 may display the cut image 1114 as a preview screen on the photographing interface 1020. The magnification of the image content in the cropped image 1114 is smaller than the magnification of the image content in the cropped image 1113 shown in fig. 11C.
The electronic device 100 may receive an operation (e.g., a single click or a long press) by the user on the magnification increase control 1121, and in response to the operation, the electronic device 100 may reduce the size of the crop box 1023 as shown in fig. 11E. The electronic apparatus 100 may display the original image 1115a acquired by the camera at this time in the viewfinder presentation window 1021. The electronic device 100 may cut out the image in the cropping frame 1023 on the original image 1115a, and adjust the image to a designated size to obtain the cropped image 1115.
As shown in fig. 11E, the electronic device 100 may display the cropped image 1115 as a preview picture on the shooting interface 1020. The magnification of the image content in the cropped image 1115 is greater than the magnification of the image content in the cropped image 1114 shown in fig. 11D.
In some application scenarios, when the original image acquired by the camera is displayed as a preview picture in the shooting interface, the electronic device 100 may receive a user input that sets a user-defined center point for the crop box in the original image. The size of the crop box may be fixed. During the preview or video recording process, the crop box can move on the original image along a fixed direction, and the electronic device 100 can receive user input and adjust the moving direction and moving speed of the crop box on the original image at any time. The electronic device 100 can determine the position and size of the crop box in each frame of original image according to the moving direction and moving speed of the crop box, cut out the image within the crop box on each frame of original image based on that position and size, and display the cut-out image as a preview picture in the shooting interface or the video recording interface. In this way, the user can conveniently adjust, at any time during preview or recording, where the captured picture is taken from; the captured picture follows the direction the user selects, which improves the user experience.
Illustratively, as shown in fig. 12A, the electronic device 100 may display a capture interface 1020. For the text description of the shooting interface 1020, reference may be made to the embodiment shown in fig. 10C, which is not described herein again.
After the electronic device 100 receives the input of the user and adjusts the display position of the position mark 1028, it may receive the input (e.g., clicking) from the user with respect to the determination control 1026, and in response to the input, as shown in fig. 12B, the electronic device 100 may display an original image 1211a acquired through the camera at this time in the viewfinder display window 1021. The electronic apparatus 100 may clip out an image within the clipping box 1023 on the original image 1211a, and adjust the clipped-out image to a specified size, resulting in the clipped-out image 1211. The electronic apparatus 100 may display the clip image 1211 as a preview screen on the photographing interface.
As shown in fig. 12B, the electronic device 100 may display a plurality of directional movement controls (including a move-up control 1221, a move-down control 1222, a move-left control 1223, and a move-right control 1224) of the crop box 1023 on the capture interface 1020. Among other things, the move-up control 1221 can be used to trigger the electronic device 100 to move the position of the crop box 1023 upward on the original image. The move down control 1222 may be used to trigger the electronic device 100 to move the position of the crop box 1023 down on the original image. The left control 1223 can be used to trigger the electronic device 100 to move the position of the crop box 1023 to the left over the original image. The right movement control 1224 may be used to trigger the electronic device 100 to move the position of the crop box 1023 to the right on the original image. In one possible implementation, the plurality of directional-movement controls may be cross-shaped sliders.
As shown in fig. 12B and 12C, the crop box 1023 may be moved right from the user-selected position on the original image. As shown in fig. 12B, the electronic device 100 may intercept the image in the cropping frame 1023 on the original image 1211a, and adjust the image to a specified size to obtain the cropped image 1211. As shown in fig. 12C, the cropping frame 1023 moves to the right on the original image, and the electronic device 100 can cut out the cropped image 1212 from the original image 1212a through the cropping frame 1023 and display the cropped image 1212 as a preview screen on the photographing interface 1020. The image content in the cropped image 1212 is to the right of the image content in the cropped image 1211 shown in fig. 12B.
The electronic device 100 may receive an input (e.g., a single click) from the user on the move-up control 1221, and in response to the input, the electronic device 100 may move the position of the crop box 1023 upward on the original image. The electronic device 100 may intercept the image in the crop box 1023 on the original image, resize to a specified size, resulting in a cropped image 1213.
As shown in fig. 12D, the electronic apparatus 100 may display the original image 1213a acquired by the camera at this time in the viewfinder presentation window 1021. The electronic apparatus 100 may clip out an image within the cropping frame 1023 on the original image 1213a and adjust the clipped-out image to a specified size, resulting in a cropped image 1213. The electronic apparatus 100 can display the cropped image 1213 as a preview screen on the photographing interface 1020. Wherein the image content in the cropped image 1213 is above the image content in the cropped image 1212 shown in fig. 12C above.
In one possible implementation, when the electronic device 100 does not receive user input on the direction movement controls for a period of time (e.g., 3 seconds), the electronic device 100 may stop displaying the direction movement controls. When the electronic device 100 receives an input (e.g., a double-click operation) from the user on the shooting interface, the electronic device 100 may display the direction movement controls again.
In some embodiments, the electronic device 100 may receive a long-press operation by the user on the direction movement control, and in response to the long-press operation, the electronic device 100 may increase the movement speed of the crop box on the original image.
For example, the electronic apparatus 100 may acquire the original image P1 at the 1 st second, the original image P2 at the 2 nd second, and the original image P3 at the 3 rd second in this order by using a camera. Wherein the position of the crop box on the original image P2 is shifted rightward by 100 pixels from the position on the original image P1. When the electronic device 100 receives the long-press operation of the user on the right movement control in the 2 nd to 3 rd seconds, the electronic device 100 may increase the speed at which the crop box moves to the right on the original image. Where the position of the crop box on the original image P3 is shifted rightward by 200 pixels compared to the position on the original image P2. The above examples are merely illustrative of the present application and should not be construed as limiting.
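A sketch of this directional movement, with the 100- and 200-pixel steps taken from the example above (the control-to-direction mapping and the function signature are assumptions):

    # Sketch of the directional movement: each frame the crop box shifts by
    # a per-frame step, and a long press doubles the step, following the
    # 100- and 200-pixel values in the example above.
    DIRS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

    def move_box(x, y, w, h, img_w, img_h, direction, long_press=False):
        step = 200 if long_press else 100          # pixels per frame
        dx, dy = DIRS[direction]
        x = min(max(x + dx * step, 0), img_w - w)  # keep the box inside
        y = min(max(y + dy * step, 0), img_h - h)
        return x, y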
In some embodiments, when the original image acquired by the camera is displayed as a preview picture in the shooting interface, the electronic device 100 may receive a user input that sets a user-defined center point for the crop box in the original image. The size of the crop box may be fixed. The electronic device 100 may receive an input from the user (for example, an operation on the direction movement controls described above) during the photo preview or video recording process and adjust the position of the crop box on the original image at any time. The electronic device 100 may cut out the image within the crop box on each frame of original image based on the position and size of the crop box in that frame, and display the cut-out image as a preview picture in the shooting interface or the video recording interface. In this way, the user can conveniently adjust, at any time during preview or recording, where the captured picture is taken from; the captured picture follows the direction the user selects, which improves the user experience.
In some application scenarios, there may be moving image content (e.g., a flying bird) in the preview or video picture. The electronic device 100 may receive user input, identify the moving image content (e.g., the flying bird), and determine the location of the moving image content in each frame of original image. The electronic device 100 may adjust the position of the crop box in each frame of original image so that the moving image content always stays within the crop box. The electronic device 100 may cut out the image within the crop box on each frame of original image based on the position and size of the crop box in that frame, and display the cut-out image as a preview picture in the shooting interface. In this way, when shooting moving image content, the electronic device 100 can keep the moving image content enlarged on the captured picture.
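A minimal tracking sketch under these assumptions: a hypothetical detect_subject() stands in for whatever detector or tracker locates the moving content, and the fixed-size crop box is re-centered on it each frame and clamped inside the image:

    # Minimal tracking sketch: detect_subject() is a hypothetical stand-in
    # for the detector/tracker; the fixed-size crop box is re-centered on
    # the subject each frame and clamped so it stays inside the image.
    def track_crop(frames, detect_subject, box_w, box_h):
        for frame in frames:
            img_h, img_w = frame.shape[:2]              # numpy array assumed
            sx, sy = detect_subject(frame)              # subject center point
            x = min(max(sx - box_w / 2, 0), img_w - box_w)
            y = min(max(sy - box_h / 2, 0), img_h - box_h)
            yield frame[int(y):int(y + box_h), int(x):int(x + box_w)]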
For example, as shown in fig. 13A, the electronic device 100 may display a shooting interface 1310. The shooting interface 1310 may include a viewfinder presentation window 1311, an original image 1321, one or more shooting mode controls (e.g., a "night view mode" control 327A, a "portrait mode" control 327B, a "large aperture" control 327C, a "mirror-moving" mode control 327H, a "general shooting mode" control 327D, a "video recording mode" control 327E, a "professional mode" control 327F, etc.), a redisplay control 321, a shooting control 322, and a camera conversion control 323. For the text description of the one or more shooting mode controls, the redisplay control 321, the shooting control 322, and the camera conversion control 323, reference may be made to the embodiment shown in fig. 3B, which is not described herein again. The viewfinder presentation window 1311 can display a closing control 1312 and an original image 1321a (the same image content as the original image 1321, at a different display scale).
The electronic device 100 may receive a user's click operation (e.g., a single click, long press, double click, etc.) on the original image 1321, and in response to the input, based on the position of the click in the original image 1321, the electronic device 100 may identify that the image content 1316 selected by the user is a bird and that the image content 1316 is in motion.
As shown in fig. 13B, the electronic device 100 then acquires an original image 1322, where the image content 1316 is at a different position in the original image 1322 than in the original image 1321. The electronic device 100 displays the original image 1322 on the shooting interface 1310 as the preview picture, and displays an original image 1322a (the same image content as the original image 1322, at a different display scale) in the viewfinder presentation window 1311. Upon recognizing that the image content 1316 is in motion, the electronic device 100 may display prompt information 1318 and a determination control 1319, and display a marker box 1317 in the area where the image content 1316 is located. The position of the determination control 1319 on the shooting interface 1310 is not limited; for example, the determination control 1319 may be above the "mirror-moving" mode control 327H, or near the image content 1316. The electronic device 100 can also display a crop box 1313 in the viewfinder presentation window 1311, where the center point of the crop box 1313 is on the image content 1316. The prompt information 1318 can be used to ask the user whether to track the image content 1316. The center point of the marker box 1317 is on the image content 1316, and the marker box 1317 indicates the size of the crop box and the position of the crop box relative to the image content 1316. In one possible implementation, the electronic device 100 may receive an input (e.g., a drag) from the user on the marker box 1317 to resize the crop box.
The electronic device 100 may receive an input (e.g., a single click) from the user on the determination control 1319, and in response to the input, as shown in fig. 13C, the electronic device 100 may display, in the viewfinder presentation window 1311, an original image 1323a acquired through the camera at this time. The electronic device 100 may crop out the image within the crop box 1313 on the original image 1323a and adjust the cropped image to a specified size, resulting in a cropped image 1323. The electronic device 100 may display the cropped image 1323 as the preview picture on the shooting interface 1310. The image content 1316 is centered within the cropped image 1323, and its display size in the cropped image 1323 is larger than its display size in the original image 1322 shown in fig. 13B.
In one possible implementation, the electronic device 100 may receive a double-click operation or the like from the user on the original image 1321 and directly identify the image content 1316 selected by the user according to the position of the operation in the original image 1321. Upon recognizing the user-selected image content 1316, the electronic device 100 may display the original image captured by the camera in the viewfinder presentation window 1311, crop an image that includes the image content 1316 from the original image, adjust the cropped image to a specified size to obtain a cropped image, and display the cropped image on the shooting interface 1310.
As shown in fig. 13D, the position of the image content 1316 in the original image has changed, and the crop box 1313 moves to follow it. The electronic device 100 can display, as the preview picture on the shooting interface 1310, the cropped image 1324 cropped from the original image 1324a acquired at this time. The image content 1316 is still at the center of the cropped image 1324.
In one possible implementation, as shown in fig. 13E, the electronic device 100 may display, in the viewfinder presentation window 1311, an original image 1325a acquired by the camera at this time. The electronic device 100 may crop the image within the crop box 1313 on the original image 1325a and adjust the cropped image to a specified size, resulting in a cropped image 1325. The electronic device 100 may display the cropped image 1325 as the preview picture on the shooting interface 1310, and may further display a motion trajectory 1331 of the image content 1316 in the viewfinder presentation window 1311.
In some embodiments, the electronic device 100 may identify the contour of specified image content (e.g., a scene or a person) in the preview or video picture. The electronic device 100 may use the contour of the specified image content as the motion trajectory of the center point of the crop box, and move the crop box along the contour at a preset speed. The size of the crop box may be fixed. The electronic device 100 may determine the position of the crop box on each frame of original image based on the contour of the specified image content and the preset speed, crop the image within the crop box on each frame of original image based on that position, and display the cropped image as the preview picture in the shooting interface or as the video picture in the video recording interface. In this way, when recording a video or displaying a preview picture, the electronic device 100 can automatically focus on the shot object along its contour and highlight its detailed features.
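A minimal sketch of moving the crop-box center along a contour at a preset speed, assuming the contour is given as an (N, 2) array of pixel coordinates; constant-arc-length interpolation is one plausible reading of "moving on the contour at a preset speed", not the embodiment's fixed algorithm.

```python
import numpy as np

def contour_trajectory(contour: np.ndarray, speed_px: float, n_frames: int):
    """Yield one crop-box center per frame, advancing along the (closed)
    contour at a preset speed in pixels per frame. `contour` is (N, 2)."""
    pts = np.vstack([contour, contour[:1]])        # close the contour
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])  # cumulative arc length
    total = cum[-1]
    for i in range(n_frames):
        d = (i * speed_px) % total                 # distance travelled so far
        j = min(np.searchsorted(cum, d, side="right") - 1, len(seg) - 1)
        t = (d - cum[j]) / max(seg[j], 1e-9)       # position within segment j
        yield (1 - t) * pts[j] + t * pts[j + 1]
```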
In some application scenarios, the electronic device 100 may receive a piece of music selected by the user before photographing or recording. After recording starts, the electronic device 100 may control the size change of the crop box and/or the motion trajectory of the crop box on the original image according to the audio information of the music piece. The electronic device 100 may determine the position and size of the crop box on each frame of original image acquired by the camera, crop each frame based on that position and size, and display the cropped images in the photo preview interface or the video recording interface in frame order. In this way, the shot picture of the electronic device 100 automatically zooms with the rhythm of the music, enhancing its visual effect.
Illustratively, as shown in fig. 14A, when the electronic device 100 receives an input operation (e.g., a single click) from the user on the "smart mirror-moving mode" control 331 shown in fig. 3C above, in response to the input operation, the electronic device 100 may display a mirror-moving template selection interface 1410. The mirror-moving template selection interface 1410 may include one or more mirror-moving template options (e.g., a "travel" mirror-moving option, a "comfortable" mirror-moving option 1411A, a "dynamic" mirror-moving option 1411B, etc.), a music mode option 1411C, a selection frame 1412, an intelligent mirror-moving prompt box 1413, a mirror-moving shooting control 1415, and a redisplay control 1416, among other things. In the intelligent mirror-moving prompt box 1413, a closing control 1414 can be used to trigger the electronic device 100 to exit the mirror-moving template selection interface 1410. For the text descriptions of the intelligent mirror-moving prompt box 1413, the mirror-moving shooting control 1415, and the redisplay control 1416, reference may be made to the embodiment shown in fig. 3D, which is not described herein again.
The electronic device 100 can receive a user operation (e.g., a single click) on the music mode option 1411C; in response to this operation, the electronic device 100 can move the selection frame 1412 to select the music mode option 1411C and display a music template selection box 1420 on the mirror-moving template selection interface 1410. The music template selection box 1420 may include one or more music piece options (e.g., a music piece 1 option 1421, a music piece 2 option, a music piece 3 option, a music piece 4 option, etc.). The audio information of each music piece is different, and different music pieces correspond to different crop box information (including the crop box size change rule and/or the motion trajectory of the crop box on the original image). Each music piece option has a play control and a selection control (for example, a play control 1423 and a selection control 1422 are displayed on the music piece 1 option 1421); the play control may be used to trigger the electronic device 100 to play the corresponding music piece, and the selection control may be used to trigger the electronic device 100 to select the corresponding music piece.
The electronic device 100 can receive an input (e.g., a single click) from the user on the selection control 1422, and in response to the input, as shown in fig. 14B, the electronic device 100 can select the music piece 1 corresponding to the music piece 1 option 1421 and display a selection mark on the selection control 1422. The selection mark on the selection control 1422 indicates that the electronic device 100 will use the crop box information corresponding to the music piece 1 to crop the original images acquired by the camera during video recording.
As shown in fig. 14B, after the music piece 1 has been selected, the electronic device 100 may receive an input (e.g., a single click) from the user on the mirror-moving shooting control 1415, and in response to the input, the electronic device 100 may display a video recording interface 1430 as shown in fig. 14C.
As shown in fig. 14C, the video recording interface 1430 may include an original image 1461 captured by the camera, a redisplay control 1416, a video shooting control 1431, a mirror-moving template switching control 1432, the intelligent mirror-moving prompt box 1413, a viewfinder presentation window 1441, and a sound waveform display window 1445. The mirror-moving template switching control 1432 can be used to trigger the electronic device 100 to switch the selected music piece, and the name of the currently selected music piece (for example, "music piece 1") is displayed on the mirror-moving template switching control 1432. The viewfinder presentation window 1441 displays an original image 1461a acquired by the camera (the same image content as the original image 1461, at a different display scale), a closing control 1442, and a crop box 1443. The sound waveform display window 1445 can be used to display a sound waveform diagram of the selected music piece (e.g., the music piece 1), which may be a graph of the sound amplitude of the selected music piece over time.
The electronic device 100 can receive an input (e.g., a single click) from the user on the video shooting control 1431, and in response to the input, the electronic device 100 can begin playing the music piece 1 through the speaker and parse the audio information of the selected music piece. The electronic device 100 may determine the crop box information (including the size change of the crop box and/or the motion trajectory of the crop box on the original image) according to the audio information (including the sound amplitude, the sound frequency, etc.) of the selected music piece. The electronic device 100 may crop the original images acquired by the camera according to the crop box information to obtain cropped images, and display the cropped images as the shot picture on the video recording interface 1430. For the relationship between the audio information of a music piece and the crop box information, reference may be made to the following embodiments, which are not described herein again.
As shown in fig. 14D, when the electronic device 100 starts recording in response to an input (e.g., a single click) on the video shooting control 1431, the electronic device 100 may display a recording time frame 1434 and a recording end control 1435, replace the redisplay control 1416 and the mirror-moving template switching control 1432 with a pause recording control 1435 and a photographing control 1436, and display the sound waveform diagram of the music piece 1 in the sound waveform display window 1445. The pause recording control 1435 may be used to trigger the electronic device 100 to pause recording. The photographing control 1436 may be used to trigger the electronic device 100 to save one or more frames of the recorded picture as pictures during recording. The electronic device 100 determines the position of the crop box 1443 on each frame of original image acquired by the camera based on the audio information of the music piece 1. The electronic device 100 may display each frame of original image acquired by the camera in the viewfinder presentation window 1441 in frame order, together with the position of the crop box 1443 on that frame. The electronic device 100 may cut out the image in the crop box 1443 on the original image, adjust the cut-out image to a specified size to obtain a cropped image, and display the cropped image as the shot picture on the video recording interface 1430.
As shown in figs. 14D and 14E, the size and/or the center point position of the crop box 1443 on the original image changes with the rhythm of the music piece 1. For example, the size of the crop box 1443 may increase as the sound amplitude of the music piece 1 increases. The sound amplitude of the music piece 1 at the 3rd second is larger than at the 8th second, so the size of the crop box 1443 at the 3rd second after recording starts is larger than at the 8th second after recording starts. The electronic device 100 acquires an original image 1462a through the camera at the 3rd second after recording starts, and may cut a cropped image 1462 out of the original image 1462a via the crop box 1443. The electronic device 100 acquires an original image 1463a through the camera at the 8th second after recording starts, and cuts a cropped image 1463 out of the original image 1463a via the crop box 1443. The magnification of the image content in the cropped image 1462 is therefore smaller than that in the cropped image 1463.
The electronic device 100 may receive a user input (e.g., a single click) on the recording end control 1435, and in response to the input, the electronic device 100 may save, as a video file, the cropped images displayed from the start of recording to the end of recording together with the sound of the selected music piece.
In some application scenarios, for example, a user holds up the electronic device 100 at a concert to shoot a singer on the stage, while the singer's voice and music play live at the scene. The electronic device 100 may collect the sound signal through the microphone during photo preview or video recording, and control the size change of the crop box and/or the motion trajectory of the crop box on the original image using the audio information (including the sound amplitude, the sound frequency, and the like) of the collected sound signal. The electronic device 100 may determine the position and size of the crop box on each frame of original image acquired by the camera, crop each frame based on that position and size, and display the cropped images in the photo preview interface or the video recording interface in frame order. In this way, the shot picture of the electronic device 100 automatically zooms with the rhythm of the sound in the environment, enhancing its visual effect.
Illustratively, as shown in fig. 15A, when the electronic device 100 receives an input operation (e.g., a single click) from the user on the "smart mirror-moving mode" control 331 shown in fig. 3C above, in response to the input operation, the electronic device 100 may display the mirror-moving template selection interface 1410. In addition to the controls shown in fig. 14A discussed above, the mirror-moving template selection interface 1410 may also display an ambient sound option 1424, on which a selection control 1425 is displayed.
The electronic device 100 may receive an input (e.g., a single click) from the user on the selection control 1425, and in response to the input, as shown in fig. 15B, the electronic device 100 may display a selection mark on the selection control 1425. The selection mark on the selection control 1425 indicates that, during recording, the electronic device 100 will collect ambient sound through the microphone and parse its audio information (e.g., sound amplitude, sound rhythm, etc.) to crop the original images captured by the camera.
As shown in fig. 15B, after the ambient sound option 1424 has been selected, the electronic device 100 may receive an input (e.g., a single click) from the user on the mirror-moving shooting control 1415, and in response to the input, the electronic device 100 may display a video recording interface 1530 as shown in fig. 15C.
As shown in fig. 15C, the video recording interface 1530 may include an original image 1561 acquired by the camera, a redisplay control 1516, a video shooting control 1531, a mirror-moving template switching control 1532, an intelligent mirror-moving prompt box 1513, a viewfinder presentation window 1541, and a sound waveform display window 1545. The mirror-moving template switching control 1532 may be used to trigger the electronic device 100 to switch the source of the music, and the words "ambient sound" are displayed on the mirror-moving template switching control 1532. The viewfinder presentation window 1541 includes an original image 1561a (the same image content as the original image 1561, at a different display scale), a closing control 1542, and a crop box 1543. The sound waveform display window 1545 may be used to display a sound waveform diagram of the collected ambient sound, which may be a graph of the sound amplitude of the ambient sound over time.
The electronic device 100 can receive an input (e.g., a single click) from the user on the video shooting control 1531, and in response to the input, the electronic device 100 can begin collecting ambient sound in real time through the microphone and parse the audio information in the ambient sound. The electronic device 100 may determine the crop box information (including the size change of the crop box and/or the motion trajectory of the crop box on the original image) according to the audio information (including the sound amplitude, the sound frequency, etc.) of the ambient sound, crop the original images captured by the camera according to the crop box information to obtain cropped images, and display the cropped images as the shot picture on the video recording interface 1530. For the relationship between the audio information of the ambient sound and the crop box information, reference may be made to the following embodiments, which are not described herein again.
As shown in fig. 15D, when the electronic device 100 starts recording in response to an input (e.g., a single click) on the video shooting control 1531, the electronic device 100 may display a recording time frame 1534 and a recording end control 1535, and replace the redisplay control 1516 and the mirror-moving template switching control 1532 with a pause recording control 1535 and a photographing control 1536. The electronic device 100 may display the sound waveform diagram of the collected ambient sound in the sound waveform display window 1545. The pause recording control 1535 may be used to trigger the electronic device 100 to pause recording. The photographing control 1536 may be used to trigger the electronic device 100 to save one or more frames of the recorded picture as pictures during recording. The electronic device 100 determines the position of the crop box 1543 on each frame of original image acquired by the camera based on the audio information of the ambient sound. The electronic device 100 may display each frame of original image acquired by the camera in the viewfinder presentation window 1541 in frame order, together with the position of the crop box 1543 on that frame. The electronic device 100 may cut out the image within the crop box 1543 on the original image, adjust the cut-out image to a specified size to obtain a cropped image, and display the cropped image as the shot picture on the video recording interface 1530.
As shown in figs. 15D and 15E, the size and/or the center point position of the crop box 1543 on the original image changes with the rhythm of the ambient sound. For example, the size of the crop box 1543 may increase as the amplitude of the ambient sound increases. The amplitude of the ambient sound at the 3rd second is greater than at the 8th second, so the size of the crop box 1543 at the 3rd second after recording starts is greater than at the 8th second after recording starts. The electronic device 100 acquires an original image 1562a through the camera at the 3rd second after recording starts and may cut a cropped image 1562 out of the original image 1562a through the crop box 1543; at the 8th second after recording starts, the electronic device 100 acquires an original image 1563a and may cut a cropped image 1563 out of it through the crop box 1543. The magnification of the image content in the cropped image 1562 is therefore less than that in the cropped image 1563.
The electronic device 100 may receive an input (e.g., a single click) from the user on the recording end control 1535, and in response to the input, the electronic device 100 may save, as a video file, the cropped images displayed from the start of recording to the end of recording together with the ambient sound collected by the microphone during that period.
In some embodiments, the electronic device 100 has a touch screen that can be folded, which may be referred to as a folding screen. When the folding screen of the electronic device 100 is in a half-folded state (for example, the folding angle is about 90 degrees), the touch screen of the electronic device 100 may be divided into at least a screen A and a screen B. Some applications on the electronic device 100 may have a customized interface layout and function control layout pre-configured for the half-folded state of the folding screen. Taking the camera application as an example, when the folding screen of the electronic device 100 is in the unfolded state, the electronic device 100 may display the shooting interface (or the video recording interface) of the camera application; after the folding screen changes into the half-folded state, the electronic device 100 may display the preview picture in the shooting interface (or the video picture in the video recording interface) on the screen A, and display the function controls in the shooting interface (or the video recording interface) on the screen B.
For example, with the folding screen in the unfolded state, the electronic device 100 may display the video recording interface 1530 shown in fig. 15C on the touch screen. When the folding screen is switched to the half-folded state, the electronic device 100 may display the video picture in the video recording interface 1530 (e.g., the original image 1561 in fig. 15C or the cropped image 1562 in fig. 15D) on the screen A, and display the function controls in the video recording interface 1530 (e.g., the redisplay control 1516, the video shooting control 1531, the mirror-moving template switching control 1532, the intelligent mirror-moving prompt box 1513, the viewfinder presentation window 1541, and the sound waveform display window 1545 in fig. 15C) on the screen B. In this way, the controls in the application interface can be rearranged according to the physical form of the folding screen, and the user obtains a better visual experience when viewing and using the folding screen.
In some embodiments, the electronic device 100 may display different content of the same application interface in split screens. When application split screen is enabled, the touch screen of the electronic device 100 may be divided into a plurality of display areas, including a display area 1 and a display area 2. Taking the camera application as an example, when application split screen is not enabled, the electronic device 100 displays the shooting interface (or the video recording interface) of the camera application as a whole. After the electronic device 100 enables application split screen, the electronic device 100 may display the preview picture in the shooting interface (or the video picture in the video recording interface) in the display area 1 of the touch screen, and display the function controls in the shooting interface (or the video recording interface) in the display area 2 of the touch screen.
For example, when application split screen is not enabled, the electronic device 100 may display the video recording interface 1530 shown in fig. 15C on the touch screen. After the electronic device 100 enables application split screen, the electronic device 100 may display the video picture in the video recording interface 1530 (e.g., the original image 1561 in fig. 15C or the cropped image 1562 in fig. 15D) in the display area 1 of the touch screen, and display the function controls in the video recording interface 1530 (e.g., the redisplay control 1516, the video shooting control 1531, the mirror-moving template switching control 1532, the intelligent mirror-moving prompt box 1513, the viewfinder presentation window 1541, and the sound waveform display window 1545 in fig. 15C) in the display area 2 of the touch screen. In this way, the controls in the application interface can be rearranged according to whether application split screen is enabled, and the user obtains a better visual experience when viewing and using the application in split screen.
In the embodiments of the present application, the way the user triggers the electronic device 100 to start recording, end recording, take a picture, and the like is not limited to touch input such as clicking controls displayed on the touch screen (for example, a start recording control, an end recording control, a shooting control, and the like). Optionally, the user may trigger these actions by voice input: after the electronic device 100 receives a voice input, it may recognize the semantics of the voice input (for example, "eggplant", "take a picture", "start recording", "end recording", and the like) and accordingly start recording, end recording, or take a picture. Optionally, the electronic device 100 may also trigger taking a picture or starting or ending a recording by recognizing a designated gesture or facial expression of a person in the captured preview picture. The electronic device 100 may also start recording, end recording, or take a picture upon detecting a user input on a physical key (e.g., simultaneously pressing the volume up key and the volume down key, or pressing a power key).
The following describes how an embodiment of the present application determines the above crop box information (including the crop box size change rule and/or the motion trajectory of the crop box on the original image) from the audio information of a sound signal.
The electronic device 100 acquires a sound signal during video recording or photo preview. The sound signal may be the audio data of the music piece selected by the user in the above embodiments, or the ambient sound collected by the electronic device 100 through the microphone in the above embodiments. The electronic device 100 may parse the audio information of the sound signal, which includes rhythm information, amplitude information, style information, and the like. Based on this audio information, the electronic device 100 may adjust the size of the crop box, the change speed of the crop box (including the size change speed of the crop box and/or the moving speed of its center point), the filter effect (including light effects of different colors), and so on.
After acquiring the sound signal, the electronic device 100 may sample and quantize the sound signal to obtain windowed data, perform difference processing on the windowed data to obtain difference data, and then perform a Fourier transform on the difference data and differentiate again to obtain a rhythm signal. As shown in fig. 16, the electronic device 100 may obtain rhythm starting points and rhythm ending points from the rhythm signal; for example, the electronic device 100 identifies the rhythm starting point of the i-th rhythm, the rhythm ending point of the i-th rhythm, the rhythm starting point of the (i+1)-th rhythm, the rhythm ending point of the (i+1)-th rhythm, and so on, where i is a positive integer. The interval between the rhythm starting point and the rhythm ending point of one rhythm indicates how fast that rhythm is.
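A minimal sketch of the pipeline just described, using spectral flux as the "differentiate again" step; the window and hop sizes are assumptions, and the embodiment's actual algorithm may differ.

```python
import numpy as np

def rhythm_signal(samples: np.ndarray, win: int = 1024, hop: int = 512) -> np.ndarray:
    """Return per-window onset strength; peaks approximate rhythm starting
    points, and the spacing between peaks indicates how fast the rhythm is."""
    diff = np.diff(samples, prepend=samples[0])  # difference processing
    window = np.hanning(win)
    prev_mag = np.zeros(win // 2 + 1)
    flux = []
    for start in range(0, len(diff) - win, hop):
        mag = np.abs(np.fft.rfft(diff[start:start + win] * window))
        # "Differentiate again": keep the positive change in spectral energy
        # between consecutive windows (spectral flux).
        flux.append(np.sum(np.maximum(mag - prev_mag, 0.0)))
        prev_mag = mag
    return np.asarray(flux)
```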
In some embodiments, the electronic device 100 may control the size change of the crop box on the original image through the tempo of the sound signal. When the rhythm of the sound signal becomes faster, the electronic device 100 may enlarge the crop box on the original image; when the rhythm becomes slower, the electronic device 100 may shrink it. When the crop box has been proportionally enlarged to the maximum size, the electronic device 100 may reverse direction and proportionally shrink it; when the crop box has shrunk to the preset minimum size, the electronic device 100 may reverse direction and proportionally enlarge it.
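A minimal sketch of this bounce rule, with illustrative scale bounds; the per-frame step would itself be derived from the rhythm, and all values here are assumptions.

```python
def step_box_scale(scale: float, grow: bool, step: float,
                   min_scale: float = 0.3, max_scale: float = 1.0):
    """Advance the crop-box scale by one frame and bounce at the bounds.
    `grow` holds the current direction (initially set by whether the rhythm
    is fast); the function flips it when a bound is reached.
    Returns (new_scale, new_grow)."""
    scale += step if grow else -step
    if scale >= max_scale:    # enlarged to the maximum size: shrink next
        scale, grow = max_scale, False
    elif scale <= min_scale:  # shrunk to the preset minimum: grow next
        scale, grow = min_scale, True
    return scale, grow
```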
In one possible implementation, the electronic device 100 may recognize the rhythm ending time points of the sound signal and, at each such time point, enlarge or shrink the crop box by a specified multiple.
In some embodiments, the electronic device 100 may control the crop box to move over the original image, and control its moving speed through the tempo of the sound signal. When the rhythm of the sound signal becomes faster, the electronic device 100 may increase the moving speed of the crop box on the original image; when the rhythm becomes slower, the electronic device 100 may decrease it. When the crop box moves along one direction to the boundary of the original image, the electronic device 100 can reverse its movement, or turn its movement direction by 90 degrees clockwise or counterclockwise.
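A minimal sketch of the reflection rule for horizontal movement (the vertical case is symmetric); the velocity vx is assumed to have been derived from the tempo.

```python
def step_box_position(x: float, vx: float, box_w: int, image_w: int):
    """Advance the crop box horizontally by one frame; |vx| grows with the
    tempo. Reflect the movement at the image boundary. Returns (x, vx)."""
    x += vx
    if x < 0 or x > image_w - box_w:
        vx = -vx  # reverse on hitting a boundary
        x = max(0.0, min(x, float(image_w - box_w)))
    return x, vx
```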
In one possible implementation, the electronic device 100 may recognize the rhythm ending time points (or rhythm starting time points) of the sound signal and switch the moving direction of the crop box once at each such time point.
In some embodiments, the electronic device 100 may control the size change of the crop box on the original image through the amplitude of the sound signal. When the amplitude of the sound signal becomes larger, the electronic device 100 may enlarge the crop box on the original image; when the amplitude becomes smaller, the electronic device 100 may shrink it. When the crop box has been proportionally enlarged to the maximum size, the electronic device 100 may reverse direction and proportionally shrink it; when the crop box has shrunk to the preset minimum size, the electronic device 100 may reverse direction and proportionally enlarge it.
In some embodiments, the electronic device 100 may control the crop box to move over the original image, and control its moving speed through the amplitude of the sound signal. When the amplitude of the sound signal becomes larger, the electronic device 100 may increase the moving speed of the crop box on the original image; when the amplitude becomes smaller, the electronic device 100 may decrease it. When the crop box moves along one direction to the boundary of the original image, the electronic device 100 can reverse its movement, or turn its movement direction by 90 degrees clockwise or counterclockwise.
In some embodiments, the electronic device 100 may control the size change of the crop box on the original image through the tempo of the sound signal, while controlling the movement of the crop box on the original image and its moving speed through the amplitude of the sound signal.
In some embodiments, the electronic device 100 may control the size change of the crop box on the original image through the amplitude of the sound signal, while controlling the movement of the crop box on the original image and its moving speed through the tempo of the sound signal.
In some embodiments, the electronic device 100 may determine a filter color (e.g., a light effect in red, orange, yellow, green, cyan, blue, violet, etc.) from the musical style of the sound signal. After the electronic device 100 cuts out the image within the crop box on the original image, it may apply a filter effect in that color to the cut-out image and display the filtered cropped image in the shooting interface or the video recording interface. For example, the electronic device 100 may switch to another filter color when it recognizes a switch between the verse and the chorus of the sound signal.
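A minimal sketch of applying a style-dependent tint; the style-to-color table and the 80/20 blend are purely illustrative assumptions, since the embodiment does not enumerate concrete styles or colors.

```python
import numpy as np

# Illustrative mapping only; not specified by this embodiment.
STYLE_TO_FILTER = {"pop": (0, 120, 255), "classical": (255, 160, 0)}  # BGR tints

def apply_filter(img: np.ndarray, style: str) -> np.ndarray:
    """Blend a style-dependent color tint into the cropped image."""
    tint = np.array(STYLE_TO_FILTER.get(style, (255, 255, 255)), np.float32)
    return (img.astype(np.float32) * 0.8 + tint * 0.2).astype(np.uint8)
```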
In one possible implementation, the electronic device 100 may recognize the rhythm starting time points (or rhythm ending time points) of the sound signal and change the filter color once at each such time point.
In some application scenarios, the electronic device 100 may have a video file stored locally. The electronic device 100 may determine the crop box information (including the crop box size change rule and/or the motion trajectory of the crop box on the video picture) corresponding to a mirror-moving template selected by the user, crop each frame of video picture in the video file using that crop box information, and adjust the cropped video pictures to a specified size to obtain a cropped video. In this way, the electronic device 100 can crop a saved video file using the crop box information corresponding to the mirror-moving template selected by the user, re-zoom the video picture, and highlight the shot subject or the scene background in it.
Illustratively, as shown in fig. 17A, the electronic device 100 displays a home screen interface 310. For the text description of the interface 310, reference may be made to the embodiment shown in fig. 3A, which is not described herein again.
The electronic device 100 may receive an input (e.g., a single click) from a user on the gallery application icon 312, in response to which the electronic device 100 may display a gallery application interface 1710 as shown in FIG. 17B.
As shown in fig. 17B, the gallery application interface 1710 may display one or more albums (e.g., an all-photos album, a video album 1716, a camera album, a WeChat album, a Weibo album, etc.). The electronic device 100 may display a gallery menu 1711 below the gallery application interface 1710. The gallery menu 1711 includes a photo control 1712, an album control 1713, a time control 1714, and a discovery control 1715. The photo control 1712 is used to trigger the electronic device 100 to display all local pictures as thumbnails. The album control 1713 is used to trigger the electronic device 100 to display the albums to which the local pictures belong. As shown in fig. 17B, the album control 1713 is currently selected, and the electronic device 100 displays the gallery application interface 1710. The time control 1714 may be used to trigger the electronic device 100 to display locally stored curated pictures. The discovery control 1715 may be used to trigger the electronic device 100 to display pictures in categorized albums.
The electronic device 100 may receive an input (e.g., a single click) from a user on the video album 1716, in response to which the electronic device 100 may display a video album interface 1720 as shown in fig. 17C.
As shown in FIG. 17C, the video album interface 1720 may include one or more video file options (e.g., a video file option 1721 and a video file option 1722).
Electronic device 100 may receive an input (e.g., a single click) from a user on the video file option 1721, in response to which electronic device 100 may display a video browsing interface 1730 as shown in FIG. 17D.
As shown in fig. 17D, the video browsing interface 1730 includes a video 1731, a video play control 1732, a video total duration 1733, a menu 1734, and a mirror-moving mode control 1735. For the text description of the menu 1734, reference may be made to the foregoing embodiments, which are not described herein again.
The electronic device 100 may receive an input (e.g., a single click) from the user on the mirror-moving mode control 1735, and in response to the input, the electronic device 100 may display a mirror-moving template selection interface 1740 as shown in fig. 17E.
As shown in fig. 17E, the mirror-moving template selection interface 1740 includes one or more mirror-moving template options (e.g., a "travel" mirror-moving option 1741A, a "comfortable" mirror-moving option 1741B, a "dynamic" mirror-moving option 1741C, etc.), a selection box 1742, a determination control 1743, a cancel control 1744, a mirror-moving template display area 1751, and a mirror-moving trajectory display area 1753. The cancel control 1744 can be used to trigger the electronic device 100 to exit the mirror-moving template selection interface 1740. Different mirror-moving template options correspond to different crop box size change rules or motion trajectories. As shown in fig. 17E, the selection box 1742 has selected the "travel" mirror-moving option 1741A, and the mirror-moving template display area 1751 displays the video sample corresponding to the "travel" mirror-moving option 1741A. The mirror-moving trajectory display area 1753 may display the crop box size change rule corresponding to the "travel" mirror-moving option 1741A and/or the motion trajectory of the crop box on the original pictures of the video sample. For example, the center point of the crop box corresponding to the "travel" mirror-moving option 1741A may coincide with the center point of the original image, and its size may shrink proportionally over time from the size of the original image down to a preset minimum size, and then grow proportionally over time from the preset minimum size back to the size of the original image. The crop box corresponding to the "travel" mirror-moving option 1741A changes size in proportion to the size of the original image, and the preset minimum size is smaller than the size of the original image.
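A minimal sketch of the size rule described for the "travel" template: the crop box stays centered while its scale shrinks linearly to a preset minimum and then grows back; the minimum scale of 0.4 is an assumption.

```python
def travel_scale(frame_idx: int, total_frames: int, min_scale: float = 0.4) -> float:
    """Crop-box scale relative to the original image: shrinks linearly from
    1.0 to `min_scale` over the first half, then grows back to 1.0."""
    half = total_frames / 2
    t = frame_idx / half if frame_idx <= half else (total_frames - frame_idx) / half
    return 1.0 - (1.0 - min_scale) * t  # 1.0 -> min_scale -> 1.0
```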
The electronic device 100 can receive an input operation (e.g., a click) from the user on the determination control 1743, and in response, the electronic device 100 can crop the video picture stream in the video 1731 using the crop box size change rule and/or the crop box motion trajectory corresponding to the selected "travel" mirror-moving option 1741A, obtain a cropped video 1761, and display the cropped video 1761 on a video browsing interface 1760 shown in fig. 17F.
As shown in fig. 17F, the video browsing interface 1760 may include the cropped video 1761, a video play control 1762, a cropped video total duration 1763, a mirror-moving template switching control 1764, and a save control 1765. The video play control 1762 may be used to trigger the electronic device 100 to play the cropped video 1761, and the save control 1765 may be used to trigger the electronic device 100 to save the cropped video 1761 to a specified storage path. The mirror-moving template switching control 1764 may be used to trigger the electronic device 100 to switch mirror-moving templates.
In some application scenarios, the electronic device 100 may have a video file stored locally. The electronic device 100 may receive a music piece selected by the user, or collect ambient sound through the microphone. The electronic device 100 may control the size change of the crop box and/or the motion trajectory of the crop box on the original video pictures in the video file according to the audio information of the music piece or the ambient sound. The electronic device 100 may determine the position and size of the crop box on each frame of original video picture, crop each frame based on that position and size, and adjust the cropped video pictures to a specified size to obtain a cropped picture stream. The electronic device 100 may save the cropped picture stream and the music piece (or the ambient sound) as a video file. In this way, the video picture automatically zooms with the rhythm of the sound, enhancing the visual effect of the picture.
Illustratively, as shown in fig. 17G, the electronic device 100 may display the mirror-moving template selection interface 1740. When the electronic device 100 receives a user input and, in response, moves the selection box 1742 onto the music mode option 1742C, the electronic device 100 may display, on the mirror-moving template selection interface 1740, a music template selection box 1770 that includes one or more music piece options (e.g., a music piece 1 option 1771, a music piece 2 option, a music piece 3 option, a music piece 4 option, etc.) and an ambient sound option 1774. Each music piece option has a play control and a selection control (e.g., a play control 1773 and a selection control 1772 are displayed on the music piece 1 option 1771); the play control may be used to trigger the electronic device 100 to play the corresponding music piece, and the selection control may be used to trigger the electronic device 100 to select the corresponding music piece. A selection control 1775 is displayed on the ambient sound option 1774.
The electronic device 100 can receive an input (e.g., a single click) from the user on the selection control 1775, and in response to the input, as shown in fig. 17G, the electronic device 100 can display a selection mark on the selection control 1775. The selection mark on the selection control 1775 indicates that the electronic device 100 will collect ambient sound through the microphone and use it to crop each frame of video picture in the video 1731.
As shown in fig. 17G, after the ambient sound option 1774 has been selected, the electronic device 100 can receive an input (e.g., a single click) from the user on the determination control 1743, and in response to the input, the electronic device 100 can begin collecting ambient sound through the microphone and display an ambient sound collection interface 1780 as shown in fig. 17H.
As shown in fig. 17H, the ambient sound collection interface 1780 includes a waveform diagram 1781 of the ambient sound, a collection pause control 1782, a collection time bar 1783, and a re-collection control 1784. The waveform diagram 1781 displays the signal waveform of the ambient sound. The collection pause control 1782 may be used to trigger the electronic device 100 to pause collecting ambient sound through the microphone. The collection time bar 1783 indicates the total duration of ambient sound the electronic device 100 needs to collect (e.g., 12 seconds) and the duration already collected (e.g., 8 seconds of ambient sound have been collected so far); the total duration to collect equals the playback duration of the video to be cropped.
As shown in fig. 17I, when the duration of the collected ambient sound reaches the playback duration of the video to be cropped, the electronic device 100 may display an ambient sound play control 1785 and a determination control 1786 on the ambient sound collection interface 1780. The ambient sound play control 1785 may be used to trigger the electronic device 100 to play the collected ambient sound, so that the user can listen and judge whether the collected ambient sound is suitable.
The electronic device 100 may receive an input (e.g., a single click) from the user on the determination control 1786, and in response to the input, the electronic device 100 may crop each frame of video picture in the video to be cropped using the collected ambient sound, and enlarge the cropped pictures to a specified size to obtain a cropped picture stream. The electronic device 100 may combine the cropped picture stream and the collected ambient sound into a cropped video 1791 and display the cropped video 1791 on a video browsing interface 1790 as shown in fig. 17J.
As shown in fig. 17J, the video browsing interface 1790 may include the cropped video 1791, a video play control 1792, a cropped video total duration 1793, a mirror-moving template switching control 1794, and a save control 1795. The video play control 1792 can be used to trigger the electronic device 100 to play the cropped video 1791, and the save control 1795 can be used to trigger the electronic device 100 to save the cropped video 1791 to a specified storage path. The mirror-moving template switching control 1794 may be used to trigger the electronic device 100 to switch mirror-moving templates.
A video stream cropping system provided in an embodiment of the present application is described below.
Fig. 18 shows a schematic diagram of a video stream cropping system 1800 provided in an embodiment of the present application. The video stream cropping system 1800 can be applied to the electronic device 100 to perform the image processing method in the above embodiments.
As shown in fig. 18, the video stream cropping system 1800 may include an image stabilization module 1801, a mirror-moving control module 1802, and a super-resolution processing module 1803.
When the above mirror-moving mode is turned on for video recording, the electronic device 100 acquires a video stream through the camera. Alternatively, the electronic device 100 may acquire the video stream through the camera when the mirror-moving mode is turned on and the shooting interface is displayed. Alternatively, the electronic device 100 stores a video file locally and may acquire the video stream from that video file.
In one possible implementation, there may be multiple cameras on the electronic device 100, including a wide-angle camera. When the electronic device 100 turns on the mirror-moving mode, the electronic device 100 acquires the video stream using the wide-angle camera. In this way, the electronic device 100 can capture a video picture with a wider field of view.
When the electronic device 100 acquires the video stream through the camera, the electronic device 100 may perform image stabilization on the video stream through the image stabilization module 1801.
In one possible implementation, the image stabilization module 1801 may perform electronic image stabilization (EIS) processing on the video stream. Specifically, the image stabilization module 1801 may obtain the shake information of the electronic device 100 detected by a sensor such as a gyroscope while the video stream is acquired through the camera. The image stabilization module 1801 may calculate the motion between adjacent frames in the video stream based on the shake information, and perform motion compensation on each pixel in a frame based on that motion, thereby stabilizing the image.
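A minimal EIS sketch, assuming the gyroscope readings have already been converted into a per-frame pixel shift (dx, dy) between adjacent frames; real EIS typically estimates a full homography rather than the pure translation shown here.

```python
import cv2
import numpy as np

def stabilize_frame(frame: np.ndarray, dx: float, dy: float) -> np.ndarray:
    """Shift the frame opposite to the measured inter-frame shake (dx, dy)."""
    M = np.float32([[1, 0, -dx], [0, 1, -dy]])  # translation-only compensation
    return cv2.warpAffine(frame, M, (frame.shape[1], frame.shape[0]))
```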
In other possible implementations, the image stabilization module 1801 may also perform optical image stabilization on the video stream. Specifically, the electronic device 100 may compensate for image motion caused by camera platform shake through adaptive adjustment of the optical path.
The image stabilization module 1801 may also perform mechanical image stabilization while the video stream is acquired through the camera. Specifically, the image stabilization module 1801 may obtain the shake information of the electronic device 100 detected by a device such as a gyroscope sensor, and then adjust the servomechanism of the camera, thereby stabilizing the image.
The image stabilization module 1801 is optional: the electronic device 100 may skip image stabilization of the acquired video stream and crop it directly through the mirror-moving control module.
The mirror-moving control module 1802 may crop the pictures of the video stream based on a preset mirror-moving template or a sound signal. Specifically, the mirror-moving control module 1802 may determine the crop box information through the mirror-moving template selected by the user or through the rhythm information of the sound signal. The crop box information includes the size and position of the crop box on each frame of picture in the video stream. The mirror-moving template defines the size change rule of the crop box and the motion trajectory of the crop box on the original images acquired by the camera. The sound signal may be a music piece stored locally on the electronic device 100, or may be collected by the microphone when the electronic device 100 starts recording or displays the shooting interface in the aforementioned mirror-moving mode.
After determining the size and position of the crop box on each frame of picture in the video stream, the mirror-moving control module 1802 may cut out the image within the crop box on each frame and adjust the cut-out image to a specified size, thereby obtaining the cropped video stream.
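A minimal sketch of this crop-and-resize step for one frame; the 1920 x 1080 output size is one of the sizes mentioned later for this pipeline and is an assumption here.

```python
import cv2
import numpy as np

OUT_W, OUT_H = 1920, 1080  # assumed output size

def crop_and_resize(frame: np.ndarray, box) -> np.ndarray:
    """Cut out the crop box (x, y, w, h) and scale it to the output size."""
    x, y, w, h = box
    patch = frame[y:y + h, x:x + w]
    return cv2.resize(patch, (OUT_W, OUT_H), interpolation=cv2.INTER_LINEAR)
```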
After obtaining the cropped video stream, the mirror-moving control module 1802 inputs it into the super-resolution processing module 1803. The super-resolution processing module 1803 may perform super-resolution image reconstruction on the cropped video stream, which improves picture clarity, enriches picture detail, and removes noise from the picture. The electronic device 100 may display the super-resolved cropped video stream in the video recording interface or the shooting interface in the above embodiments, and may also save it in response to a user input.
In one possible implementation, the super-resolution processing module 1803 is optional, and the electronic device 100 may directly display the cropped video stream processed by the mirror-moving control module 1802 in the video recording interface or the shooting interface in the foregoing embodiments. The electronic device 100 may also save the cropped video stream in response to a user input.
Another video stream cropping system provided in an embodiment of the present application is described below.
Fig. 19 shows a schematic diagram of a video stream cropping system 1900 provided in an embodiment of the present application. The video stream cropping system 1900 can be applied to the electronic device 100 to execute the image processing method in the above embodiments.
The video stream cropping system 1900 may include a camera 1901, an image signal processor (ISP) 1902, a memory 1903, and an application processor (AP) 1904.
The camera 1901 may receive an optical signal, convert the optical signal into an electrical signal, and input the electrical signal to the image signal processor 1902. The image signal processor 1902 may output YUV data through a signal processing algorithm and store the YUV data in the memory 1903.
In one possible implementation, there may be multiple cameras on the electronic device 100, including a wide-angle camera. When the electronic device 100 turns on the mirror-moving mode, the electronic device 100 acquires the video stream using the wide-angle camera. In this way, the electronic device 100 can capture a video picture with a wider field of view.
The application processor 1904 may read the YUV data (for example, of size 4608 x 2592 or 3840 x 2160) from the memory 1903, and determine the position and size of the crop box on the current frame based on the preset mirror-moving template or the sound signal. The application processor 1904 may cut out the image within the crop box on the current frame picture and adjust it to a specified size (e.g., 1920 x 1080 or 3840 x 2160). The application processor 1904 cuts out the image within the crop box on each frame of picture in the video stream and adjusts it to the specified size, obtaining the cropped video stream.
In one possible implementation, after determining the position and size of the crop box on the current frame picture, the application processor 1904 may send them to the image signal processor 1902. The image signal processor 1902 may cut out the image within the crop box on the current frame picture based on that position and size, and adjust it to the specified size. The image signal processor 1902 cuts out the image within the crop box on each frame of picture in the video stream and adjusts it to the specified size, obtaining the cropped video stream.
In one possible implementation, the electronic device 100 needs to identify the image content in the captured picture, detect and track the position of specified image content, and determine the outline of the specified image content. The video stream cropping system 1900 then also includes a neural network processor (not shown in fig. 19). The application processor 1904 may preprocess the YUV data after reading it. In the preprocessing flow, the application processor 1904 (or a digital signal processor directed by the application processor 1904) converts the YUV data into an RGB map (or a BGR map, single-channel map, grayscale map, etc.) and downsamples it to the input size of the AI model (e.g., 224 × 224, 288 × 288, etc.). The application processor 1904 may feed the preprocessed RGB map (or BGR map, single-channel map, grayscale map, etc.) to the neural network processor. The neural network processor may detect and track the position of the specified image content in the map through the AI model and determine the outline of the specified image content. The neural network processor may output detection box information or contour information of the specified image content to the application processor 1904. The application processor 1904 may determine the crop box information based on the detection box information and/or the outline information of the specified image content.
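The preprocessing step can be sketched roughly as below. This sketch assumes OpenCV for the YUV-to-RGB conversion and a hypothetical NV12 pixel layout; the actual pixel format and model input size depend on the device:

```python
import cv2
import numpy as np

def preprocess_for_ai(yuv_nv12, width, height, model_size=(224, 224)):
    """Convert a YUV (NV12) frame to RGB and downsample to the AI model's input size."""
    # NV12 stores the full-resolution Y plane followed by interleaved UV at half resolution
    yuv = yuv_nv12.reshape(height * 3 // 2, width)
    rgb = cv2.cvtColor(yuv, cv2.COLOR_YUV2RGB_NV12)
    return cv2.resize(rgb, model_size)  # e.g. 224x224 or 288x288 for the detector
```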
In some embodiments, the application processor 1904 may detect shake information of the electronic device 100 through a gyroscope sensor while the image signal processor 1902 outputs the YUV data. After acquiring the YUV data, the application processor 1904 may calculate motion information of adjacent frames in the video stream based on the shake information. The application processor 1904 may then perform motion compensation on each pixel in a frame based on the motion information of the adjacent frames, so as to stabilize the image.
In one possible implementation, the application processor 1904 may perform electronic image stabilization (EIS) processing on the video stream. Specifically, the application processor 1904 may acquire, through a sensor such as a gyroscope, the shake information of the electronic device 100 at the time the camera captured the video stream. The application processor 1904 may calculate motion information of adjacent frames in the video stream based on the shake information, and perform motion compensation on each pixel in a frame based on that motion information, so as to stabilize the image.
Alternatively, after calculating the motion information of adjacent frames in the video stream based on the shake information, the application processor 1904 may send the motion information to a graphics processor (not shown in fig. 19). The graphics processor can then perform motion compensation on each pixel in a frame based on the motion information of the adjacent frames, thereby stabilizing the image.
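A minimal sketch of this electronic stabilization step, assuming the gyroscope-derived motion between adjacent frames has already been reduced to a 2D translation (dx, dy); real EIS implementations estimate richer per-frame motion models:

```python
import cv2
import numpy as np

def stabilize_frame(frame, motion_dx, motion_dy):
    """Shift the frame opposite to the measured inter-frame motion (simplified EIS)."""
    h, w = frame.shape[:2]
    # Translation that compensates the motion computed from the shake information
    m = np.float32([[1, 0, -motion_dx],
                    [0, 1, -motion_dy]])
    return cv2.warpAffine(frame, m, (w, h))
```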
In other possible implementations, the application processor 1904 may also apply optical image stabilization to the video stream. Specifically, image motion caused by camera platform shake can be compensated by adaptively adjusting the optical path.
The application processor 1904 may also perform mechanical image stabilization while the video stream is being acquired by the camera. Specifically, the application processor 1904 may acquire the shake information of the electronic device 100 detected through a gyroscope sensor or the like, and then adjust a servomechanism of the camera, so as to stabilize the image.
In some embodiments, the application processor 1904 may perform super-resolution reconstruction on the frames of the cropped video stream. This improves picture clarity, enriches picture detail, and removes noise from the picture.
In a possible implementation manner, after cropping each frame in the video stream based on the crop box, the application processor 1904 outputs the cropped frame directly to the super-resolution neural network model, performs image super-resolution reconstruction on it through the model, and then adjusts it to the specified size, thereby obtaining the super-resolved cropped video stream.
In another possible implementation manner, after cropping each frame in the video stream based on the crop box, the application processor 1904 first adjusts the frame to the specified size, then outputs it to the super-resolution neural network model and performs image super-resolution reconstruction on it through the model, to obtain the super-resolved cropped video stream.
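The difference between the two implementations is only where the resizing step sits relative to the reconstruction; a minimal sketch, with super_resolve() as a hypothetical stand-in for the super-resolution neural network model:

```python
import cv2

def super_resolve(img):
    """Hypothetical placeholder for the super-resolution neural network model."""
    return img  # a real model would reconstruct high-frequency detail here

def sr_then_resize(frame, box, out_size=(1920, 1080)):
    """Ordering 1: crop, super-resolve the crop, then adjust to the specified size."""
    x, y, w, h = box
    return cv2.resize(super_resolve(frame[y:y + h, x:x + w]), out_size)

def resize_then_sr(frame, box, out_size=(1920, 1080)):
    """Ordering 2: crop, adjust to the specified size, then super-resolve."""
    x, y, w, h = box
    return super_resolve(cv2.resize(frame[y:y + h, x:x + w], out_size))
```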
In this embodiment, the electronic device 100 may acquire the video stream at a first frame rate through the camera when it receives and responds to a user input on the camera application icon and displays the shooting interface. After the electronic device 100 receives and responds to the user input turning on the camera movement mode, the electronic device 100 may acquire the video stream at a second frame rate through the camera, where the second frame rate is higher than the first frame rate. In this way, the electronic device 100 can fuse multiple frames (e.g., every 3 to 7 frames) during super-resolution processing, improving the picture clarity of the cropped video stream.
After acquiring multiple frames of the cropped video stream during super-resolution processing, the electronic device 100 may select one of them as a reference frame and register the remaining frames against the reference frame. The electronic device 100 may then input the registered frames into the super-resolution neural network model and fuse them through the model to obtain a high-definition frame.
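A simplified sketch of this multi-frame step, assuming translation-only registration via phase correlation and a plain average standing in for the fusion actually performed by the super-resolution neural network model:

```python
import cv2
import numpy as np

def fuse_frames(frames):
    """Register a burst of frames to a reference frame and fuse them (simplified)."""
    ref = frames[len(frames) // 2]                      # pick a reference frame
    ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY).astype(np.float32)
    aligned = [ref.astype(np.float32)]
    for f in frames:
        if f is ref:
            continue
        g = cv2.cvtColor(f, cv2.COLOR_BGR2GRAY).astype(np.float32)
        (dx, dy), _ = cv2.phaseCorrelate(ref_gray, g)   # translation between frames
        m = np.float32([[1, 0, -dx], [0, 1, -dy]])      # warp f onto the reference
        h, w = f.shape[:2]
        aligned.append(cv2.warpAffine(f.astype(np.float32), m, (w, h)))
    # A real pipeline feeds the registered frames to the SR network; here we average
    return np.mean(aligned, axis=0).astype(np.uint8)
```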
The electronic device 100 may increase the image capture frame rate of the camera, thereby increasing the frame rate of the video stream. For example, before the electronic device 100 turns on the camera movement mode, image frames may be captured by the camera at a frame rate of 30 frames per second. After the electronic device 100 turns on the camera movement mode, image frames may be captured by the camera at a frame rate of 90 frames per second.
In a possible implementation manner, the electronic device 100 may insert frames in software after the camera acquires the image frames, so as to raise the frame rate of the video stream. For example, before the electronic device 100 turns on the camera movement mode, the camera may capture image frames at a frame rate of 30 frames per second and, without frame insertion, produce a 30-frames-per-second video stream. After the electronic device 100 turns on the camera movement mode, the camera may capture image frames at a frame rate of 90 frames per second, and the software may insert a further 90 frames per second, so as to obtain a video stream of 180 frames per second.
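A naive sketch of software frame insertion, assuming simple midpoint blending between adjacent frames; production interpolators use motion-compensated frame synthesis instead:

```python
import numpy as np

def interpolate_stream(frames):
    """Insert one blended frame between every pair of frames, doubling the rate."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        # Midpoint blend stands in for motion-compensated frame synthesis
        out.append(((a.astype(np.uint16) + b.astype(np.uint16)) // 2).astype(np.uint8))
    out.append(frames[-1])
    return out  # e.g. 90 fps in, ~180 fps out
```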
The following describes a flow of an image processing method provided in an embodiment of the present application.
Fig. 20 is a flowchart illustrating an image processing method according to an embodiment of the present application.
As shown in fig. 20, the image processing method includes:
S2001, the electronic device 100 displays a first shooting preview interface, where the first shooting preview interface includes a first preview frame, and the first preview frame displays a picture acquired by a camera of the electronic device 100 in real time.
Here, the first photographing preview interface may be the photographing interface 320 in the embodiment shown in fig. 3B described above. In some embodiments, the first capture preview interface may also be the capture interface 320 in the embodiment illustrated in FIG. 6A described above. For details, reference may be made to the foregoing embodiments, which are not described herein again.
S2002, after the electronic device 100 detects a first operation on the first shooting preview interface, the electronic device 100 displays a shooting option interface. The shooting option interface comprises a first shooting option and a second shooting option.
The first operation may be an operation on the more mode control 327G in the embodiment shown in fig. 3B, and the shooting option interface may be the camera movement template selection interface 340 in the embodiment shown in fig. 3D. After the electronic device 100 detects an operation on the more mode control 327G, the electronic device 100 may display the mode selection page 330 shown in fig. 3C above. The electronic device 100 may detect an operation (e.g., a single click) on the "smart camera movement mode" control 331 in the mode selection page 330, and in response to that operation, the electronic device 100 may display the camera movement template selection interface 340. For example, the first shooting option may be the "travel" camera movement template option 341A in the camera movement template selection interface 340 shown in fig. 3D, and the second shooting option may be the "dynamic" camera movement template option 341C in the same interface.
In some embodiments, the first operation may be an operation on the more mode control 327G in the embodiment illustrated in fig. 3B, and the shooting option interface may be the camera movement template selection interface 1410 in the embodiment illustrated in fig. 14A. After the electronic device 100 detects an operation on the more mode control 327G, the electronic device 100 can display the mode selection page 330 illustrated in fig. 3C above. The electronic device 100 may detect an operation (e.g., a single click) on the "smart camera movement mode" control 331 in the mode selection page 330, and in response to that operation, the electronic device 100 may display the camera movement template selection interface 1410 shown in fig. 14A. For example, the first shooting option may be the music section 1 option 1421 shown in fig. 14A, and the second shooting option may be the music section 2 option shown in fig. 14B.
S2003, after the electronic device 100 detects a second operation on the first shooting option, the electronic device 100 displays a second shooting preview interface. The second shooting preview interface includes a second preview frame, and the second preview frame displays a picture acquired by a camera of the electronic device 100 in real time.
The second operation may be an operation (e.g., a single click) on the "travel" camera movement option 341A in the embodiment shown in fig. 3D, and the second shooting preview interface may be the video recording interface 360 in the embodiment shown in fig. 3I or the shooting interface 510 in the embodiment shown in fig. 5B. After the electronic device 100 detects an operation on the "travel" camera movement option 341A, the electronic device 100 may further detect an operation (e.g., a single click) on the camera movement shooting control 345 in the embodiment shown in fig. 3D, fig. 3E, or fig. 5A, and in response to that operation, the electronic device 100 may display the video recording interface 360 or the shooting interface 510.
S2004, the electronic apparatus 100 starts capturing the first video content.
In one possible implementation, before the electronic device 100 starts capturing the first video content, the electronic device 100 detects a fourth operation on the second shooting preview interface. In response to the fourth operation, the electronic device 100 starts capturing the first video content.
Illustratively, the second shooting preview interface may be the video recording interface 360 in the embodiment shown in FIG. 3I described above. The fourth operation may be an operation (e.g., a single click) with respect to the video capture control 361 in the recording interface 360, and in response to the operation with respect to the video capture control 361 in the recording interface 360, the electronic device 100 may start capturing the first video content.
S2005, at a first time after the first video content starts to be captured, the electronic device 100 displays, in the second preview frame, a first portion of a picture captured by a camera of the electronic device 100 in real time.
S2006, at a second time after starting to capture the first video content, the electronic device 100 displays a second part of the picture acquired by the camera of the electronic device 100 in real time in the second preview frame.
For example, the electronic device 100 displays the first portion, which may be the cropped image 372 in the embodiment shown in fig. 3J described above, in the second preview frame at the 3 rd second after the start of capturing the first video content. The electronic device 100 displays the second portion, which may be the cropped image 373 in the embodiment shown in fig. 3K described above, in the second preview frame at the 8 th second after the start of capturing the first video content.
The second shooting preview interface further includes a first window. The electronic device 100 displays the picture acquired by a camera of the electronic device 100 in real time in the first window. At a first time after starting to capture the first video content, the electronic device 100 displays a bounding box of the first portion in the first window. At a second time after starting to capture the first video content, the electronic device 100 displays a bounding box of the second portion in the first window. In this way, the user can conveniently view the picture acquired by the camera in real time and locate the photographed subject.
Illustratively, the first window may be the viewfinder presentation window 521 in the embodiment shown in fig. 5B-5E described above.
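Rendering the first window's content, i.e. the full camera picture with the bounding box of the currently displayed part drawn on it, can be sketched as follows (assuming OpenCV; the box color, thickness, and window size are arbitrary choices):

```python
import cv2

def draw_crop_box(full_frame, box, window_size=(320, 180)):
    """Downscale the full camera picture and outline the currently cropped part."""
    x, y, w, h = box
    preview = full_frame.copy()
    cv2.rectangle(preview, (x, y), (x + w, y + h), (255, 255, 255), 4)
    return cv2.resize(preview, window_size)  # small viewfinder window content
```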
In a possible implementation manner, at a first time after starting to capture the first video content, the electronic device 100 displays, in the first window, the video picture at the first time in the video sample corresponding to the first shooting option. At a second time after starting to capture the first video content, the electronic device 100 displays, in the first window, the video picture at the second time in the video sample corresponding to the first shooting option. In this way, the user can conveniently compare against the shooting effect in the video sample in real time.
Illustratively, the first window may be the foregoing camera movement template preview window 363 in the embodiment illustrated in FIGS. 3I-3M.
The display position of the first window relative to the second preview frame is any one of the following: at least a partial area of the first window overlaps with the display area of the second preview frame; or the first window is displayed at a position outside the second preview frame in the second shooting preview interface; or the first window is displayed in the upper right corner area of the second preview frame; or the first window is displayed in the upper left corner area of the second preview frame.
The first window includes a window closing control. The electronic device 100 detects an eighth operation on the window closing control in the first window; in response to the eighth operation, the electronic device 100 stops displaying the first window. In this way, when the user does not need the first window, the user can manually close it, ensuring that the shooting picture displayed on the electronic device 100 is not blocked.
After the electronic device 100 stops displaying the first window, the electronic device 100 displays a window opening control in the second shooting preview interface. The electronic device 100 detects a ninth operation on the window opening control; in response to the ninth operation, the electronic device 100 displays the first window on the second shooting preview interface. In this way, the user can trigger the electronic device 100 to redisplay the first window when the window is needed again.
For specific content, reference may be made to the embodiments shown in fig. 4D to fig. 4F, which are not repeated herein.
In some embodiments, the second shooting preview interface further includes a second window, and the electronic device 100 displays the waveform diagram of the first music piece corresponding to the first shooting option in the second window. The audio information in the waveform diagram at a first time after starting to capture the first video content differs from the audio information in the waveform diagram at a second time after starting to capture the first video content, where the audio information includes one or more of the following: rhythm information, amplitude information, and style information.
After the electronic device 100 starts shooting the first video content, the electronic device 100 can play the first music piece.
When the electronic device 100 saves the first portion as the video picture at the first time in a video file and saves the second portion as the video picture at the second time in the video file, the electronic device 100 saves the first music piece in the audio data of the video file.
Illustratively, the second window may be the sound waveform presentation window 1445 in the embodiment illustrated in FIGS. 14C-14E described above. For details, reference may be made to the embodiments shown in fig. 14C to fig. 14E, which are not described herein again.
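One way to produce such a waveform diagram is to plot the amplitude envelope of the first music piece; a minimal sketch assuming 16-bit PCM samples in a numpy array:

```python
import numpy as np

def amplitude_envelope(pcm, samples_per_bar=1024):
    """Reduce PCM audio to per-bar peak amplitudes for a simple waveform display."""
    n = len(pcm) // samples_per_bar * samples_per_bar
    bars = np.abs(pcm[:n].astype(np.int32)).reshape(-1, samples_per_bar).max(axis=1)
    return bars / 32768.0  # normalize 16-bit samples to [0, 1]
```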
After the electronic device 100 displays the second portion of the picture captured by the camera of the electronic device 100 in real time in the second preview frame, the electronic device 100 detects a sixth operation on the second shooting preview interface. In response to the sixth operation, the electronic device 100 saves the first portion as the video picture at the first time in a video file and saves the second portion as the video picture at the second time in the video file. In this way, the electronic device 100 can capture a video with effects such as automatic zooming and lens movement.
For example, the playing pictures of the video file may refer to the embodiments shown in fig. 3N to fig. 3Q, and are not described herein again.
In one possible implementation, the electronic device 100 detects the seventh operation at a first time after the start of shooting the first video content. In response to the seventh operation, the electronic device 100 saves the first portion as a picture. For example, the seventh operation may be an operation (e.g., a single click) with respect to the photographing control 368 in the embodiment shown in fig. 3J. For details, reference may be made to the embodiment shown in fig. 3J, which is not described herein again.
S2007, the electronic device 100 displays the shooting option interface.
In a possible implementation manner, the second shooting preview interface further includes an option switching control. After the electronic device 100 displays the second portion of the picture captured by the camera of the electronic device 100 in real time in the second preview frame, the electronic device 100 detects a fifth operation on the option switching control. In response to the fifth operation, the electronic device 100 displays the shooting option interface. In this way, after one shooting option has been selected, the user is given an entry point to switch shooting options, making it convenient to shoot video content with a different option.
Illustratively, the fifth operation may be an operation (e.g., a single click) on the camera movement template switching control 362 in the embodiment shown in fig. 4A described above. For details, reference may be made to the embodiments shown in fig. 4A to fig. 4D, which are not repeated herein.
S2008, after the electronic device 100 detects a third operation on the second shooting option, the electronic device 100 displays a third shooting preview interface, where the third shooting preview interface includes a third preview frame, and the third preview frame displays a picture acquired by a camera of the electronic device 100 in real time.
Illustratively, the third shooting preview interface may be the shooting interface 410 shown in fig. 4D described above. For the shot image effect corresponding to different shooting options, reference may be made to the embodiments shown in fig. 7A to 7F, the embodiments shown in fig. 8A to 8G, and the embodiments shown in fig. 9A to 9F, which are not described again here.
S2009, the electronic device 100 starts capturing the second video content.
S2010, at a first time after the second video content starts to be captured, the electronic device 100 displays, in the third preview frame, a third portion of a picture acquired by a camera of the electronic device 100 in real time.
S2011, at a second time after the second video content starts to be captured, the electronic device 100 displays, in the third preview frame, a fourth portion of the picture acquired by the camera of the electronic device 100 in real time. Wherein the first portion, the second portion, the third portion and the fourth portion are all different.
For the display effect of the first portion, the second portion, the third portion, and the fourth portion, reference may be made to the embodiments shown in fig. 7A to 7F, the embodiments shown in fig. 8A to 8G, and the embodiments shown in fig. 9A to 9F, which are not described herein again.
In this embodiment of the application, before displaying the picture acquired by the camera of the electronic device 100 in real time, the electronic device 100 performs image stabilization processing on that picture, where the image stabilization processing includes one or more of the following: electronic image stabilization (EIS) processing, optical image stabilization processing, and mechanical image stabilization processing. In this way, transitions in the captured picture can be made smooth.
In the embodiment of the present application, before the electronic device 100 displays the first portion of the picture acquired by the camera of the electronic device 100 in real time in the second preview frame, the electronic device 100 performs image super-resolution reconstruction on the first portion. Likewise, before displaying the second portion in the second preview frame, the electronic device 100 performs image super-resolution reconstruction on the second portion; before displaying the third portion in the third preview frame, it performs image super-resolution reconstruction on the third portion; and before displaying the fourth portion in the third preview frame, it performs image super-resolution reconstruction on the fourth portion. In this way, the captured picture can be made clearer.
In one possible implementation, the cameras of the electronic device include a wide-angle camera and a non-wide-angle camera. The first preview frame displays the picture acquired by the non-wide-angle camera in real time, while the second preview frame and the third preview frame display the picture acquired by the wide-angle camera in real time. In this way, the captured picture can have a wider field of view.
The above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be replaced with equivalents, and that such modifications or substitutions do not depart from the scope of the technical solutions of the embodiments of the present application.

Claims (22)

1. An image processing method, characterized by comprising:
the method comprises the steps that the electronic equipment displays a first shooting preview interface, the first shooting preview interface comprises a first preview frame, and the first preview frame displays a picture acquired by a camera of the electronic equipment in real time;
after the electronic equipment detects a first operation aiming at the first shooting preview interface, the electronic equipment displays a shooting option interface, wherein the shooting option interface comprises a first shooting option and a second shooting option;
after the electronic equipment detects a second operation aiming at the first shooting option, the electronic equipment displays a second shooting preview interface, the second shooting preview interface comprises a second preview frame, and the second preview frame displays a picture acquired by a camera of the electronic equipment in real time;
the electronic equipment starts to shoot first video content;
at a first moment after the first video content starts to be shot, the electronic equipment displays a first part of a picture acquired by a camera of the electronic equipment in real time in the second preview frame;
at a second moment after the first video content starts to be shot, the electronic equipment displays a second part of a picture acquired by a camera of the electronic equipment in real time in the second preview frame;
the electronic equipment displays the shooting option interface;
after the electronic equipment detects a third operation aiming at the second shooting option, the electronic equipment displays a third shooting preview interface, the third shooting preview interface comprises a third preview frame, and the third preview frame displays a picture acquired by a camera of the electronic equipment in real time;
the electronic equipment starts to shoot second video content;
at a first moment after the second video content starts to be shot, the electronic equipment displays a third part of a picture acquired by a camera of the electronic equipment in real time in the third preview frame;
and at a second moment after the second video content starts to be shot, the electronic equipment displays a fourth part of a picture acquired by a camera of the electronic equipment in real time in the third preview frame, wherein the first part, the second part, the third part and the fourth part are all different.
2. The method of claim 1, wherein before the electronic device begins capturing the first video content, the method further comprises:
the electronic equipment detects a fourth operation aiming at the second shooting preview interface;
the electronic device starts to shoot a first video content, and the method specifically includes:
in response to the fourth operation, the electronic device starts capturing the first video content.
3. The method of claim 1, wherein the second shooting preview interface further comprises an option switching control; after the electronic device displays a second part of a picture acquired by a camera of the electronic device in real time in the second preview frame, the method further comprises:
the electronic equipment detects a fifth operation aiming at the option switching control;
the electronic device displaying the shooting option interface specifically includes:
in response to the fifth operation, the electronic device displays the shooting option interface.
4. The method according to claim 1 or 2, wherein after the electronic device displays the second part of the picture acquired by the camera of the electronic device in real time in the second preview frame, the method further comprises:
the electronic equipment detects a sixth operation aiming at the second shooting preview interface;
in response to the sixth operation, the electronic device saves the first part as a video picture at a first time in a video file, and saves the second part as a video picture at a second time in the video file.
5. The method according to claim 1 or 2, characterized in that the method further comprises:
the electronic equipment detects a seventh operation at a first moment after the electronic equipment starts shooting the first video content;
in response to the seventh operation, the electronic device saves the first portion as a picture.
6. The method of claim 1 or 2, wherein the second shooting preview interface further comprises a first window; the method further comprises the following steps:
the electronic equipment displays a picture acquired by a camera of the electronic equipment in real time in the first window;
at a first moment after shooting of the first video content is started, the electronic equipment displays a bounding box of the first part in the first window;
and at a second moment after shooting of the first video content is started, the electronic equipment displays the bounding box of the second part in the first window.
7. The method of claim 1 or 2, wherein the second shooting preview interface further comprises a first window; the method further comprises the following steps:
at a first moment after the first video content starts to be shot, the electronic equipment displays a video picture at the first moment in the video sample corresponding to the first shooting option in the first window;
and at a second moment after the first video content starts to be shot, the electronic equipment displays a video picture at the second moment in the video sample corresponding to the first shooting option in the first window.
8. The method of claim 6, wherein the display positions of the first window and the second preview box are any one of:
at least a partial area of the first window and a display area of the second preview box overlap; or,
the first window is displayed at a position outside the second preview frame in the second shooting preview interface; or,
the first window is displayed in the upper right corner area of the second preview frame; or,
the first window is displayed in the upper left corner area of the second preview frame.
9. The method of claim 7, wherein the display positions of the first window and the second preview box are any one of:
at least a partial area of the first window and a display area of the second preview box overlap; or,
the first window is displayed at a position outside the second preview frame in the second shooting preview interface; or,
the first window is displayed in the upper right corner area of the second preview frame; or,
the first window is displayed in the upper left corner area of the second preview frame.
10. The method of claim 6, wherein a window closing control is included in the first window; the method further comprises the following steps:
the electronic device detects an eighth operation on the window closing control in the first window;
in response to the eighth operation, the electronic device stops displaying the first window.
11. The method of claim 7, wherein a window closing control is included in the first window; the method further comprises the following steps:
the electronic device detects an eighth operation on the window closing control in the first window;
in response to the eighth operation, the electronic device stops displaying the first window.
12. The method of claim 10, wherein after the electronic device stops displaying the first window, the method further comprises:
the electronic equipment displays a window opening control in the second shooting preview interface;
the electronic device detects a ninth operation of the window opening control;
in response to the ninth operation, the electronic device displays the first window on the second photographing preview interface.
13. The method of claim 11, wherein after the electronic device stops displaying the first window, the method further comprises:
the electronic equipment displays a window opening control in the second shooting preview interface;
the electronic device detects a ninth operation of the window opening control;
in response to the ninth operation, the electronic device displays the first window on the second shooting preview interface.
14. The method of claim 1 or 2, wherein the second shooting preview interface further comprises a camera movement mode closing control; the method further comprises the following steps:
the electronic equipment detects a tenth operation acting on the camera movement mode closing control;
in response to the tenth operation, the electronic device displays the first shooting preview interface.
15. The method of claim 1 or 2, wherein the second shooting preview interface further comprises a second window, and wherein the method further comprises:
the electronic equipment displays a waveform diagram of a first music piece corresponding to the first shooting option in the second window;
wherein the audio information in the waveform diagram at a first moment after the first video content begins to be captured is different from the audio information in the waveform diagram at a second moment after the first video content begins to be captured, the audio information comprising one or more of: rhythm information, amplitude information, and style information.
16. The method of claim 15, wherein after the electronic device begins capturing the first video content, the method further comprises:
the electronic equipment plays the first music piece.
17. The method of claim 15, further comprising:
and when the electronic equipment stores the first part as a video picture at a first moment in a video file and stores the second part as a video picture at a second moment in the video file, the electronic equipment stores the first music clip in the audio data of the video file.
18. The method of claim 1, further comprising:
before the electronic equipment displays a picture acquired by a camera of the electronic equipment in real time, image stabilization processing is performed on the picture acquired by the camera of the electronic equipment in real time, wherein the image stabilization processing comprises one or more of the following: electronic image stabilization (EIS) processing, optical image stabilization processing, and mechanical image stabilization processing.
19. The method of claim 1, further comprising:
before the electronic equipment displays a first part of a picture acquired by a camera of the electronic equipment in real time in the second preview frame, performing image super-resolution reconstruction on the first part by the electronic equipment;
before the electronic equipment displays a second part of a picture acquired by a camera of the electronic equipment in real time in the second preview frame, the electronic equipment carries out image super-resolution reconstruction on the second part;
before the electronic equipment displays a third part of a picture acquired by a camera of the electronic equipment in real time in the third preview frame, performing image super-resolution reconstruction on the third part by the electronic equipment;
before the electronic equipment displays a fourth part of a picture acquired by a camera of the electronic equipment in real time in the third preview frame, the electronic equipment carries out image super-resolution reconstruction on the fourth part.
20. The method of claim 1, wherein the cameras of the electronic device comprise a wide-angle camera and a non-wide-angle camera;
the first preview frame displays a picture acquired by a camera of the electronic equipment in real time, and specifically comprises the following steps:
the first preview frame displays a picture acquired by the non-wide-angle camera in real time;
the second preview frame displays a picture acquired by a camera of the electronic equipment in real time, and specifically comprises the following steps:
the second preview frame displays a picture acquired by the wide-angle camera in real time;
the third preview frame displays a picture acquired by a camera of the electronic equipment in real time, and specifically includes:
and the third preview frame displays the pictures acquired by the wide-angle camera in real time.
21. An electronic device, comprising: a touch screen, a camera, one or more processors, and one or more memories; wherein the one or more processors are coupled with the touch screen, the camera, and the one or more memories, and the one or more memories are configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 1-20.
22. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-20.
GR01 Patent grant