CN112637481B - Image scaling method and device - Google Patents


Info

Publication number
CN112637481B
CN112637481B (granted publication of application CN202011340842.3A)
Authority
CN
China
Prior art keywords
focal length
zoom
zooming
target
determining
Prior art date
Legal status
Active
Application number
CN202011340842.3A
Other languages
Chinese (zh)
Other versions
CN112637481A (en)
Inventor
Sun Xiaokang (孙晓康)
Simon Ekstrand (西蒙·埃克斯特兰德)
Wang Zongbo (王宗波)
Rong Shizhong (荣石中)
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202011340842.3A
Publication of CN112637481A
Application granted
Publication of CN112637481B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632: Graphical user interfaces [GUI] specially adapted for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/67: Focus control based on electronic image sensor signals
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628: Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation

Abstract

The application provides an image zooming method and device. The image zooming method includes: determining a zoom center point, where the zoom center point is associated with the initial touch position corresponding to a zoom operation performed by a user on a screen; determining a target focal length according to the zoom center point, where the target focal length is the maximum or minimum focal length at which the viewing range of a camera of the electronic device can still cover the complete target object; adjusting the shooting focal length of the camera from a first focal length to a second focal length, where the second focal length is greater than the first focal length and less than or equal to the target focal length, or the second focal length is smaller than the first focal length and greater than or equal to the target focal length; and displaying, on the screen, a preview picture acquired by the lens after the focal length is adjusted, where the preview picture takes the zoom center point as its center point. In this way, the target object desired by the user can be zoomed in or out in a targeted manner and placed in the central area of the preview picture, which improves the success rate of photographing.

Description

Image scaling method and device
Technical Field
The present application relates to image processing technologies, and in particular, to an image scaling method and apparatus.
Background
When a user takes a picture with the camera of a mobile phone, the user sometimes spreads or pinches two fingers on the screen of the mobile phone to adjust the focal length of the camera. The camera zooms the picture displayed on the screen in or out based on this operation to meet the user's photographing needs.
However, in the zoom process described above, the object the user wants to photograph may no longer appear completely in the view-finder frame, causing the shot to fail.
Disclosure of Invention
The application provides an image zooming method and device that zoom a target object desired by the user in or out in a targeted manner and place the target object in the central area of the preview picture, thereby improving the success rate of photographing.
In a first aspect, the present application provides an image zooming method, including: determining a zoom center point, where the zoom center point is located in the pixel area of the screen of an electronic device in which a target object is displayed, and the zoom center point is associated with the initial touch position corresponding to a zoom operation performed by a user on the screen; determining a target focal length according to the zoom center point, where the target focal length is the maximum or minimum focal length at which the viewing range of a camera of the electronic device can still cover the complete target object; adjusting the shooting focal length of the camera from a first focal length to a second focal length, where the second focal length is greater than the first focal length and less than or equal to the target focal length, or the second focal length is smaller than the first focal length and greater than or equal to the target focal length; and displaying, on the screen, a preview picture acquired by the lens after the focal length is adjusted, where the preview picture takes the zoom center point as its center point.
Generally, when a user takes a picture with an electronic device (e.g., a mobile phone), a camera application (APP) is first opened on the electronic device. A default lens (e.g., a rear lens) is then opened to obtain a real picture at a default focal length (if the lens is a fixed-focus lens, the default focal length is its fixed focal length; if the lens is a zoom lens, the default focal length may be a preset one of its multiple focal lengths). A preview image signal corresponding to the real picture is transmitted to the screen of the electronic device, and the screen displays the corresponding preview picture. Before pressing the shutter, the user can adjust the focal length in various ways. As the focal length increases, the viewing range of the lens shrinks, so the preview picture displayed on the screen becomes an enlarged picture of a local area of the original preview picture; as the focal length decreases, the viewing range of the lens expands, so the preview picture becomes a reduced picture that includes the original preview picture together with additional content.
The above-mentioned ways of adjusting the focal length may include: (1) the user's two fingers slide apart on the screen (indicating that the focal length is to be increased and the viewing range reduced, so that a local area of the current preview picture is enlarged and its details become richer) or slide together (indicating that the focal length is to be decreased and the viewing range enlarged, so that the current preview picture is reduced and contains more content); (2) the user double-taps the screen with one finger (indicating that the focal length is to be increased and the viewing range reduced, so that a local area of the current preview picture is enlarged and its details become richer) or double-taps the screen with two fingers (indicating that the focal length is to be decreased and the viewing range enlarged, so that the current preview picture is reduced and contains more content).
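The gesture distinctions above can be sketched as a simple classifier over raw touch coordinates. This is an illustrative sketch, not the patent's implementation; the function name, the jitter threshold, and the point format are assumptions.

```python
from math import hypot

def classify_pinch(p1_start, p1_end, p2_start, p2_end, threshold=10.0):
    """Classify a two-finger gesture from (x, y) touch positions in pixels:
    fingers sliding apart means zoom in (increase focal length),
    fingers sliding together means zoom out (decrease focal length).
    `threshold` (pixels) filters out jitter; its value is an assumption."""
    d_start = hypot(p2_start[0] - p1_start[0], p2_start[1] - p1_start[1])
    d_end = hypot(p2_end[0] - p1_end[0], p2_end[1] - p1_end[1])
    if d_end - d_start > threshold:
        return "zoom_in"    # fingers moved apart
    if d_start - d_end > threshold:
        return "zoom_out"   # fingers moved together
    return "none"           # movement too small to classify
```

For example, two fingers that start 100 px apart and end 200 px apart would be classified as "zoom_in".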
When adjusting the focal length, the user may want to zoom in or out on a specific object in the preview picture to obtain a desired shooting effect. For example, the user may want to zoom in on a person who appears small in the preview picture at the default focal length, and therefore slides two fingers apart on the screen to increase the focal length, reduce the viewing range, and enlarge a local area of the current preview picture. In the related art, the center point of the current preview picture is used as the zoom center point, that is, the enlarged local area surrounds the center of the current preview picture. If the person to be photographed is not in the middle of the current preview picture, part or all of the person may fall outside the viewing range after the focal length is adjusted, so the zoomed preview picture does not contain the complete person.
The electronic device can acquire touch data of a user through the touch sensor, analyze the touch data to determine what kind of zoom operation the user performs (for example, two fingers slide apart, two fingers slide together, one finger double-click on the screen, or two fingers double-click on the screen), and finally determine a zoom center point according to the zoom operation. The zoom center point in the present application is associated with an initial touch position corresponding to a zoom operation performed by a user on a screen.
In one possible implementation manner, when the zoom operation is two fingers sliding apart or sliding together, the initial touch positions of the two fingers on the screen are acquired, and the midpoint of the line connecting them is used as the zoom center point. For example, when a user wants to zoom in on a person in the preview picture, the user naturally places the two fingers near the pixel area where the person is displayed before sliding them apart, so the midpoint of the line between the initial touch positions is likely to fall within that pixel area. Zooming in around this midpoint keeps the person at the center of the preview picture, which both enables accurate focusing during zooming and makes it unlikely that the person leaves the viewing range of the lens.
In one possible implementation manner, when the zoom operation is a one-finger or two-finger double tap, the touch position of the double tap is determined as the zoom center point. For example, when a user wants to zoom in on a person in the preview picture, the user can double-tap (e.g., with a knuckle) a position inside the pixel area where the person is displayed and zoom in around that position as the zoom center point. This keeps the person at the center of the preview picture, which both enables accurate focusing during zooming and makes it unlikely that the person leaves the viewing range of the lens.
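The two implementations above reduce to one mapping from the zoom operation to its zoom center point. A minimal sketch, assuming illustrative operation names and (x, y) pixel coordinates:

```python
def determine_zoom_center(operation, touches):
    """Return the zoom center point for a zoom operation.

    `operation` is one of "two_finger_slide", "one_finger_double_tap",
    "two_finger_double_tap" (names are assumptions, not from the patent);
    `touches` lists the initial (x, y) touch positions.
    """
    if operation == "two_finger_slide":
        # midpoint of the line between the two fingers' initial positions
        (x1, y1), (x2, y2) = touches[:2]
        return ((x1 + x2) / 2, (y1 + y2) / 2)
    # double taps: the tapped position itself is the zoom center point
    return touches[0]
```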
Because the target object stays at the center of the preview picture, the possibility that the target object moves out of frame is reduced, unless the focal length becomes so large that the lens can no longer capture the complete target object. The target focal length can therefore be determined from whether the target object would leave the frame: it is the maximum focal length (for focal-length-increase scenarios) or the minimum focal length (for focal-length-decrease scenarios) at which the viewing range of the camera of the electronic device can still cover the complete target object.
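Under a simplified pinhole model in which the visible region scales inversely with focal length, the target focal length for a zoom-in can be sketched as the largest candidate focal length whose viewing range, centered on the zoom center point, still contains the target object's bounding box. The function names, the linear scaling model, and the candidate-list representation are assumptions for illustration.

```python
def covers_object(bbox, center, focal, base_focal, width, height):
    """True if, at `focal`, the viewing range centered on the zoom center
    still contains the whole target object.

    Assumed model: the visible region scales by base_focal / focal relative
    to the base preview of width x height pixels.
    bbox = (left, top, right, bottom) in base-preview pixels.
    """
    scale = base_focal / focal
    half_w, half_h = width * scale / 2, height * scale / 2
    cx, cy = center
    left, top, right, bottom = bbox
    return (cx - half_w <= left and right <= cx + half_w and
            cy - half_h <= top and bottom <= cy + half_h)

def target_focal_length(bbox, center, focals, base_focal, width, height):
    """Largest candidate focal length whose viewing range still covers the
    complete target object (the zoom-in case; zoom-out would take min)."""
    usable = [f for f in focals
              if covers_object(bbox, center, f, base_focal, width, height)]
    return max(usable) if usable else base_focal
```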
After the electronic device determines the target focal length, if the camera uses a plurality of fixed-focus lenses, the current lens is switched to the fixed-focus lens corresponding to the second focal length; if the camera uses a zoom lens, the current focal length of the zoom lens is switched to the second focal length. Camera zoom is thereby achieved.
When the zoom operation increases the focal length, the second focal length is greater than the first focal length and less than or equal to the target focal length. If the second focal length is smaller than the target focal length, the focal length may be increased step by step: each time the user performs a focal-length-increase operation (e.g., two fingers sliding apart or a one-finger double tap), the shooting focal length of the camera is adjusted from the current focal length (the first focal length) to the focal length one step larger (the second focal length). If the second focal length equals the target focal length, the focal length may be increased in one step: as soon as the user performs a focal-length-increase operation, the shooting focal length is adjusted from the current focal length (the first focal length) directly to the maximum focal length at which the viewing range of the camera can still cover the complete target object (the second focal length, i.e., the target focal length).
Alternatively, when the zoom operation decreases the focal length, the second focal length is smaller than the first focal length and greater than or equal to the target focal length. If the second focal length is larger than the target focal length, the focal length may be decreased step by step: each time the user performs a focal-length-decrease operation (e.g., two fingers sliding together or a two-finger double tap), the shooting focal length is adjusted from the current focal length (the first focal length) to the focal length one step smaller (the second focal length). If the second focal length equals the target focal length, the focal length may be decreased in one step: as soon as the user performs a focal-length-decrease operation, the shooting focal length is adjusted from the current focal length (the first focal length) directly to the minimum focal length at which the viewing range of the camera can still cover the complete target object (the second focal length, i.e., the target focal length).
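The step-by-step versus one-step choice of the second focal length described above can be sketched as follows (shown for the zoom-in direction; zoom-out is symmetric). The sorted list of supported focal-length steps is an illustrative assumption:

```python
def next_focal_length(current, target, steps, one_step=False):
    """Second focal length for a zoom-in operation.

    Either jump straight to the target focal length (one-step mode), or
    advance to the next supported focal length above the current one,
    never exceeding the target. `steps` lists the focal lengths the
    camera supports (an assumed representation).
    """
    if one_step:
        return target
    larger = [f for f in sorted(steps) if current < f <= target]
    return larger[0] if larger else current  # already at or beyond target
```

For instance, with supported steps 1x, 2x, 4x, 8x, 16x and a target of 8x, repeated zoom-in operations from 2x would move through 4x and then 8x, while one-step mode jumps directly to 8x.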
At the second focal length, the camera obtains the real picture within its viewing range and transmits the corresponding preview image signal to the screen of the electronic device, which displays the preview picture with the zoom center point as its center point.
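For digital zoom, centering the preview on the zoom center point amounts to choosing a crop rectangle of the full frame. The sketch below also clamps the crop to the frame edges, a common practical refinement that the patent text does not spell out; all names are illustrative.

```python
def preview_crop(center, zoom, width, height):
    """Crop rectangle (x, y, w, h) of the full width x height frame for a
    digital-zoom factor `zoom` >= 1, centered on the zoom center point and
    clamped so the crop stays inside the frame."""
    crop_w, crop_h = width / zoom, height / zoom
    x = min(max(center[0] - crop_w / 2, 0), width - crop_w)
    y = min(max(center[1] - crop_h / 2, 0), height - crop_h)
    return (x, y, crop_w, crop_h)
```

A zoom center near a frame edge still yields a valid crop; the center of the displayed preview then sits as close to the zoom center point as the frame allows.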
In this application, the zoom center point is determined from the user's operation, the target focal length is determined from the zoom center point, and the zoomed preview picture is displayed with the zoom center point as its center. The preview picture shown to the user after zooming is therefore zoomed around the position the user selected, matching the zoom effect the user expects: the target object desired by the user is zoomed in or out in a targeted manner and placed in the central area of the preview picture, improving the success rate of photographing.
In a possible implementation manner, when the camera includes a plurality of fixed-focus lenses (with different focal lengths), the electronic device may acquire the viewing ranges of the fixed-focus lenses and determine, for each of them, whether its viewing range completely covers the target object when centered on the zoom center point. When the zoom operation is a zoom-in operation, the largest focal length among the fixed-focus lenses whose viewing range completely covers the target object is determined as the target focal length; when the zoom operation is a zoom-out operation, the smallest such focal length is determined as the target focal length.
In a possible implementation manner, when the camera includes a zoom lens (a lens that can be adjusted to multiple focal lengths), the electronic device may acquire the viewing ranges of the multiple focal lengths of the zoom lens and determine, for each focal length, whether its viewing range completely covers the target object. When the zoom operation is a zoom-in operation, the largest focal length whose viewing range completely covers the target object is determined as the target focal length; when the zoom operation is a zoom-out operation, the smallest such focal length is determined as the target focal length.
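The lens or focal-length selection in the two implementations above follows one rule: among candidates whose viewing range fully covers the target object, pick the largest focal length for zoom-in and the smallest for zoom-out. A sketch, with an assumed lens-table structure and coverage predicate:

```python
def pick_fixed_focus_lens(lenses, covers, zoom_in=True):
    """Among fixed-focus lenses whose viewing range fully covers the target
    object (per the `covers(lens_id)` predicate), pick the one with the
    largest focal length for zoom-in, smallest for zoom-out.
    `lenses` maps lens id -> focal length (an illustrative structure)."""
    candidates = {lid: f for lid, f in lenses.items() if covers(lid)}
    if not candidates:
        return None  # no lens can cover the complete target object
    choose = max if zoom_in else min
    return choose(candidates, key=candidates.get)
```

The same rule applies to a zoom lens by treating each supported focal length as a candidate instead of each physical lens.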
In a second aspect, the present application provides an image zooming apparatus including a determining module, a focusing module, and a display module. The determining module is configured to determine a zoom center point, where the zoom center point is located in the pixel area of the screen of the electronic device in which a target object is displayed, and is associated with the initial touch position corresponding to a zoom operation performed by the user on the screen; and to determine a target focal length according to the zoom center point, where the target focal length is the maximum or minimum focal length at which the viewing range of the camera of the electronic device can still cover the complete target object. The focusing module is configured to adjust the shooting focal length of the camera from a first focal length to a second focal length, where the second focal length is greater than the first focal length and less than or equal to the target focal length, or the second focal length is smaller than the first focal length and greater than or equal to the target focal length. The display module is configured to display, on the screen, a preview picture acquired by the lens after the focal length is adjusted, where the preview picture takes the zoom center point as its center point.
In a possible implementation manner, when the camera includes a plurality of fixed-focus lenses, the determining module is specifically configured to: acquire the viewing ranges of the plurality of fixed-focus lenses, where the focal lengths of the fixed-focus lenses are different; determine, for each fixed-focus lens, whether its viewing range completely covers the target object; and, when the zoom operation is a zoom-in operation, determine the largest focal length among the fixed-focus lenses whose viewing range completely covers the target object as the target focal length, or, when the zoom operation is a zoom-out operation, determine the smallest such focal length as the target focal length.
In a possible implementation manner, when the camera includes a zoom lens, the determining module is specifically configured to: acquire the multiple focal lengths of the zoom lens; determine, for each focal length, whether its viewing range completely covers the target object; and, when the zoom operation is a zoom-in operation, determine the largest focal length whose viewing range completely covers the target object as the target focal length, or, when the zoom operation is a zoom-out operation, determine the smallest such focal length as the target focal length.
In a possible implementation manner, the determining module is specifically configured to acquire touch data of a user through a touch sensor; obtaining the zooming operation of the user according to the touch data; and determining the zooming central point according to the zooming operation.
In a possible implementation manner, the determining module is specifically configured to obtain initial touch positions of the two fingers on the screen when the zooming operation is sliding the two fingers apart or sliding the two fingers together; and taking the middle point of a connecting line between the initial touch positions of the two fingers on the screen as the zoom central point.
In a possible implementation manner, the determining module is specifically configured to determine, when the zooming operation is a one-finger double click or a two-finger double click, a touch position corresponding to the one-finger double click or the two-finger double click as the zooming center point.
In a possible implementation manner, the focusing module is specifically configured to switch the shooting lens of the camera from the lens corresponding to the first focal length to the lens corresponding to the second focal length, thereby implementing the second focal length.
In a possible implementation manner, the focusing module is specifically configured to adjust the focal length of the zoom lens from the first focal length to the second focal length.
In a third aspect, the present application provides an electronic device, including: one or more processors; and a memory for storing one or more programs; where the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium including a computer program which, when executed on a computer, causes the computer to perform the method of any one of the first aspect.
In a fifth aspect, the present application provides a computer program product including a computer program which, when executed by a computer, performs the method of any one of the first aspect.
In a sixth aspect, the present application provides a chip including a processor and a memory, where the memory is configured to store a computer program, and the processor is configured to call and run the computer program stored in the memory to perform the method of any one of the first aspect.
Drawings
FIG. 1 is an exemplary block diagram of an electronic device of the present application;
FIG. 2 is a flow chart illustrating an exemplary image scaling method of the present application;
FIGS. 3a and 3b are exemplary diagrams of zoom of the present application;
FIG. 4 is a diagram of an exemplary preview screen of the present application;
FIG. 5 is a schematic view of a preview screen of the present application;
FIG. 6 is a schematic diagram of an exemplary structure of the image scaling apparatus of the present application.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description, claims, and drawings of this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or order. Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such as a list of steps or elements. A method, system, article, or apparatus is not necessarily limited to the steps or elements explicitly listed, but may include other steps or elements not explicitly listed or inherent to such method, system, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
The image zooming method is suitable for a scene that a user holds the electronic equipment for shooting, and the electronic equipment can comprise a mobile phone, a tablet computer, a notebook computer, an intelligent watch and other types of electronic equipment. The electronic device has a built-in camera that can support zoom functions, including optical zoom and digital zoom. The camera may be configured with a plurality of fixed-focus lenses to achieve zooming, or may be configured with a zoom lens (e.g., a periscopic lens) to achieve zooming, which is not specifically limited in this application.
Fig. 1 is an exemplary block diagram of an electronic device according to the present application, taking a mobile phone as an example. As shown in fig. 1, the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, an image sensor 180N, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The cache may hold instructions or data that the processor 110 has just used or will reuse; if the processor 110 needs that instruction or data again, it can be fetched directly from the cache. This avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the mobile phone 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the camera function of the mobile phone 100. The processor 110 and the display screen 194 communicate through a DSI interface to implement the display function of the mobile phone 100.
The GPIO interface may be configured by software. The GPIO interface may be configured to transmit control signals or data signals. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, or the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the mobile phone 100, to transmit data between the mobile phone 100 and peripheral devices, or to connect an earphone and play audio through the earphone. The interface may also be used to connect other devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a limitation on the structure of the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module 140 may also supply power to the mobile phone through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the mobile phone 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the mobile phone 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time division-synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The mobile phone 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The mobile phone 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like. To implement the image scaling method provided by the present application, the mobile phone 100 may include a plurality of fixed-focus cameras 193 to implement zooming, or may include one zoom camera 193 to implement zooming. The camera 193 transmits a preview image signal to the display screen 194, completing the camera viewfinding process.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the handset 100 is in frequency bin selection, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. Handset 100 may support one or more video codecs. Thus, the handset 100 can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent recognition of the mobile phone 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile phone 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 110 performs various functional applications and data processing of the mobile phone 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The cellular phone 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the cellular phone 100 receives a call or voice information, it is possible to receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The mobile phone 100 may be provided with at least one microphone 170C. In other embodiments, the mobile phone 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the mobile phone 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement a directional recording function. The microphone 170C may serve as an input module for receiving a voice signal of a user (operator), so that the user can input related operations or data by voice.
The headphone interface 170D is used to connect a wired headphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material.
The gyro sensor 180B may be used to determine the motion attitude of the cellular phone 100.
The air pressure sensor 180C is used to measure air pressure.
The magnetic sensor 180D includes a hall sensor.
The acceleration sensor 180E can detect the magnitude of acceleration of the cellular phone 100 in various directions (typically three axes).
A distance sensor 180F for measuring a distance.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode.
The ambient light sensor 180L is used to sense the ambient light level.
The fingerprint sensor 180H is used to collect a fingerprint.
The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100, different from the position of the display 194. The touch sensor 180K may serve as an input module for receiving an input from a user (operator) so that the user may input related operations or data by multi-touch on the screen.
Optionally, the input module in the present application may also obtain the input of the user in a non-touch manner, for example, obtain the motion or gesture of the user in a computer vision or other motion sensor manner, so that the user may input the relevant operation or data by performing the motion or waving the hand.
The bone conduction sensor 180M may acquire a vibration signal.
The image sensor 180N uses the photoelectric conversion function of a photoelectric device to convert the optical image on its photosensitive surface into an electrical signal proportional to the optical image. The photosensitive surface is divided into many small units, each corresponding to one pixel. For example, a Bayer sensor arranges RGB filters on the photosensitive surface to form a mosaic color filter array in which 50% of the filters are green for sensing green light, 25% are red for sensing red light, and 25% are blue for sensing blue light.
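As an illustrative sketch (not part of the patent text), the Bayer color filter array described above can be modeled as a repeating 2x2 RGGB tile; the function name and tile layout below are assumptions for illustration only.

```python
# Hypothetical model of the Bayer color filter array described above:
# each 2x2 tile holds one red, two green, and one blue filter, giving
# 50% green, 25% red, and 25% blue coverage of the photosensitive surface.

def bayer_mosaic(rows, cols):
    """Return the RGGB filter pattern for a sensor of rows x cols pixels."""
    tile = [["R", "G"], ["G", "B"]]  # one 2x2 RGGB tile
    return [[tile[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

mosaic = bayer_mosaic(4, 4)
flat = [f for row in mosaic for f in row]
print(flat.count("G") / len(flat))  # 0.5  -> 50% green
print(flat.count("R") / len(flat))  # 0.25 -> 25% red
print(flat.count("B") / len(flat))  # 0.25 -> 25% blue
```

Each pixel senses only one color component; the full-color image is later reconstructed by demosaicing in the ISP/DSP pipeline described above.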
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The cellular phone 100 may receive a key input, and generate a key signal input related to user setting and function control of the cellular phone 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the mobile phone 100 by being inserted into or pulled out of the SIM card interface 195. The mobile phone 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. The mobile phone 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the mobile phone 100 uses an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the mobile phone 100 and cannot be separated from the mobile phone 100.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the apparatus. In other embodiments of the present application, an apparatus may include more or fewer components than illustrated, or some components may be combined, some components may be separated, or a different arrangement of components may be provided. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
FIG. 2 is a flowchart illustrating an exemplary image scaling method of the present application. The process 200 may be performed by an electronic device (e.g., the mobile phone 100 described above). The process 200 is described as a series of steps or operations; it should be understood that the steps of process 200 may be performed in various orders and/or concurrently, and are not limited to the order of execution shown in fig. 2. As shown in fig. 2, the method includes:
step 201, determining a zoom central point.
Generally, when a user takes a picture with an electronic device (e.g., a mobile phone), the user opens a camera APP on the electronic device, which opens a default lens (e.g., a rear lens) and obtains a real picture at a default focal length (if the default lens is a fixed-focus lens, the default focal length is its fixed focal length; if it is a zoom lens, the default focal length may be a preset one of its multiple focal lengths). A preview image signal corresponding to the real picture is transmitted to the screen of the electronic device, and the preview picture corresponding to the preview image signal is displayed on the screen. Before pressing the shutter, the user can adjust the focal length in various ways. As the focal length increases, the viewing range of the lens shrinks, so the preview picture displayed on the screen becomes an enlarged picture of a local area of the original preview picture; as the focal length decreases, the viewing range of the lens expands, and the preview picture becomes a reduced picture that contains the original preview picture together with more surrounding content.
The above-mentioned ways of adjusting the focal length may include: (1) the user's two fingers slide apart on the screen (indicating that the focal length is to be increased and the viewing range reduced, so that the local area in the current preview picture is enlarged and its details become richer) or gather together (indicating that the focal length is to be decreased and the viewing range increased, so that the current preview picture is reduced and contains more content); (2) the user double-taps the screen with one finger (focal length increased, local area enlarged) or double-taps the screen with two fingers (focal length decreased, preview picture reduced); (3) the user taps a zoom-in control on the photo interface, such as "+" (focal length increased, local area enlarged), or a zoom-out control, such as "-" (focal length decreased, preview picture reduced). It should be noted that, besides these three manners, the user may also adjust the focal length in other ways, which is not specifically limited in this application. However, in order to determine the zoom center point, the present application needs to acquire the user's operation on the screen and the position touched by that operation, so the present application focuses mainly on manners (1) and (2).
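The gesture-to-zoom-direction mapping above can be sketched as a simple lookup; this is a minimal illustration, and the gesture names below are assumptions rather than identifiers from the patent.

```python
# Hypothetical mapping of the zoom gestures listed above to a zoom direction.
ZOOM_IN = "zoom_in"    # increase focal length, shrink viewing range
ZOOM_OUT = "zoom_out"  # decrease focal length, enlarge viewing range

GESTURE_TO_ZOOM = {
    "pinch_apart": ZOOM_IN,             # (1) two fingers slide apart
    "pinch_together": ZOOM_OUT,         # (1) two fingers gather together
    "one_finger_double_tap": ZOOM_IN,   # (2) one-finger double tap
    "two_finger_double_tap": ZOOM_OUT,  # (2) two-finger double tap
    "plus_control": ZOOM_IN,            # (3) "+" control on the photo interface
    "minus_control": ZOOM_OUT,          # (3) "-" control on the photo interface
}

def zoom_direction(gesture):
    """Classify a recognized gesture as a zoom-in or zoom-out request."""
    return GESTURE_TO_ZOOM[gesture]

print(zoom_direction("pinch_apart"))  # zoom_in
```

Only manners (1) and (2) also supply a touch position from which the zoom center point is derived; manner (3) does not.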
When adjusting the focal length, the user may want to zoom in or out on a specific object in the preview picture according to the desired shooting effect. For example, the user may want to zoom in on a person that appears small in the preview picture at the default focal length, and therefore slides two fingers apart on the screen to increase the focal length, reduce the viewing range, and enlarge the local area of the current preview picture. In the related art, the center point of the current preview picture is used as the zoom center point, i.e., the enlarged local area surrounds the center point of the current preview picture. However, if the person to be photographed is not in the middle of the current preview picture, part or all of the person may fall outside the viewing range after the focal length is adjusted, so the zoomed preview picture does not contain the complete person.
The electronic device can acquire the user's touch data through the touch sensor, analyze the touch data to determine which zoom operation the user performed (for example, two fingers sliding apart, two fingers gathering together, a one-finger double tap, or a two-finger double tap), and finally determine the zoom center point according to the zoom operation. The zoom center point in the present application is associated with the initial touch position of the user's zoom operation on the screen.
In one possible implementation, when the zoom operation is two fingers sliding apart or gathering together, the initial touch positions of the two fingers on the screen are acquired, and the midpoint of the line connecting those two initial touch positions is used as the zoom center point. For example, when the user wants to enlarge a person in the preview picture, the user places two fingers near the pixel area of the screen where the person is displayed and then slides the fingers apart, so the midpoint of the line between the two initial touch positions is likely to fall within the pixel area where the person is displayed. The enlargement is then performed with that midpoint as the zoom center point, so that the person remains at the center of the preview picture; this not only achieves accurate focusing during zooming but also makes it unlikely that the person moves out of the viewing range of the lens.
In one possible implementation, when the zoom operation is a one-finger double tap or a two-finger double tap, the touch position corresponding to the double tap is determined as the zoom center point. For example, when the user wants to enlarge a person in the preview picture, the user can double-tap a position in the pixel area of the screen where the person is displayed (e.g., with a knuckle of one finger), and the enlargement is performed with that position as the zoom center point, so that the person remains at the center of the preview picture; this not only achieves accurate focusing during zooming but also makes it unlikely that the person moves out of the viewing range of the lens.
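The two zoom-center-point rules above can be sketched as follows; this is a minimal illustration under assumed gesture names and a simple (x, y) touch representation, not the patent's implementation.

```python
# Hypothetical sketch of the two zoom-center-point rules described above.
def zoom_center(gesture, touches):
    """touches: list of (x, y) initial touch positions on the screen."""
    if gesture in ("pinch_apart", "pinch_together"):
        # Rule 1: midpoint of the line connecting the two fingers'
        # initial touch positions.
        (x1, y1), (x2, y2) = touches
        return ((x1 + x2) / 2, (y1 + y2) / 2)
    if gesture in ("one_finger_double_tap", "two_finger_double_tap"):
        # Rule 2: the double-tap position itself is the zoom center point.
        return touches[0]
    raise ValueError("not a zoom gesture: " + gesture)

print(zoom_center("pinch_apart", [(100, 200), (300, 400)]))  # (200.0, 300.0)
```

In either case, the resulting point is intended to fall inside the pixel area showing the target object, so that zooming keeps the object centered.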
Step 202, determining the target focal length according to the zoom central point.
The zoom center point determined in step 201 ensures that the target object stays at the center of the preview picture during zooming, which reduces the possibility of the target object moving out of the frame, unless the focal length is so large that the viewing range of the lens can no longer fully cover the target object. The target focal length can therefore be determined based on whether the target object falls out of the viewing range: the target focal length is the maximum (for a scene where the focal length increases) or the minimum (for a scene where the focal length decreases) focal length at which the viewing range of the camera of the electronic device can still cover the complete target object.
In a possible implementation, when the camera includes a plurality of fixed-focus lenses (with different focal lengths), the electronic device may acquire the viewing ranges of the fixed-focus lenses and determine, for each of them, whether its viewing range completely covers the target object when centered on the zoom center point. When the zoom operation is a magnify operation, the largest focal length among the fixed-focus lenses whose viewing ranges completely cover the target object is determined as the target focal length; when the zoom operation is a reduce operation, the smallest such focal length is determined as the target focal length.
For example, fig. 3a and 3b are schematic diagrams illustrating an exemplary zoom of the present application. As shown in fig. 3a and 3b, assume that the camera includes three fixed-focus lenses: the focal length of the lens 1 is 1-fold, the focal length of the lens 2 is 3-fold, and the focal length of the lens 3 is 5-fold; the default lens is the lens 1 and the default focal length is 1-fold. With the default lens open, the user slides two fingers apart on the screen. The electronic device acquires the zoom center point from this operation, determines from the image at the zoom center point that the target object is a rabbit, and then judges whether the viewing ranges of the three fixed-focus lenses can cover the complete rabbit. The electronic device determines the maximum focal length whose viewing range can cover the complete rabbit as the target focal length. In fig. 3b, the lens 2 with the 3-fold focal length can cover the complete rabbit, while the lens 3 with the 5-fold focal length cannot, so the 3-fold focal length is the target focal length and the lens 2 is the zoom result, i.e., the electronic device switches the shooting lens from the lens 1 to the lens 2.
In a possible implementation manner, when the camera includes a zoom lens (a zoom lens can be adjusted to multiple focal lengths), the electronic device may obtain the viewing ranges at the multiple focal lengths of the zoom lens and determine, for each of them, whether its viewing range completely covers the target object. When the zoom operation is a zoom-in operation, the largest focal length among the multiple focal lengths whose viewing range completely covers the target object is determined as the target focal length; or, when the zoom operation is a zoom-out operation, the smallest such focal length is determined as the target focal length.
For example, as also shown in fig. 3a and 3b, assume that the camera includes a zoom lens that can achieve three focal lengths, 1×, 3×, and 5×, with the default focal length being 1×. In the initial state the lens is at the default focal length, and the user slides two fingers apart on the screen. The electronic device then obtains the zoom center point from the two-finger spread operation, determines from the image displayed at the zoom center point that the target object is the rabbit, and judges whether the viewing range of the zoom lens at each of the three focal lengths can cover the complete rabbit. The electronic device determines the largest focal length whose viewing range can cover the complete rabbit as the target focal length. In fig. 3b, the complete rabbit can be covered at the 3× focal length but not at the 5× focal length, so the 3× focal length is the target focal length, and the electronic device switches the zoom lens from the 1× focal length to the 3× focal length.
Step 203, adjusting the shooting focal length of the camera from the first focal length to the second focal length.
When the zoom operation increases the focal length, the second focal length is greater than the first focal length and less than or equal to the target focal length. If the second focal length is smaller than the target focal length, the focal length may be increased step by step: each time the user performs an operation of increasing the focal length (for example, sliding two fingers apart or double-clicking with one finger), the shooting focal length of the camera is adjusted from the current focal length (the first focal length) to the focal length one step larger (the second focal length). If the second focal length equals the target focal length, the focal length may be increased in one step: as soon as the user performs an operation of increasing the focal length, the shooting focal length of the camera is adjusted from the current focal length (the first focal length) to the largest focal length at which the viewing range of the camera of the electronic device can still cover the complete target object (the second focal length, i.e., the target focal length).
Alternatively, when the zoom operation decreases the focal length, the second focal length is smaller than the first focal length and greater than or equal to the target focal length. If the second focal length is larger than the target focal length, the focal length may be decreased step by step: each time the user performs an operation of decreasing the focal length (for example, sliding two fingers together or double-clicking with two fingers), the shooting focal length of the camera is adjusted from the current focal length (the first focal length) to the focal length one step smaller (the second focal length). If the second focal length equals the target focal length, the focal length may be decreased in one step: as soon as the user performs an operation of decreasing the focal length, the shooting focal length of the camera is adjusted from the current focal length (the first focal length) to the smallest focal length at which the viewing range can still cover the complete target object (the second focal length, i.e., the target focal length).
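The step-by-step versus one-step behaviour described above can be illustrated with a small sketch; the method name and the focal-length table are illustrative assumptions, not taken from the patent.

```java
import java.util.Arrays;

public class FocalStep {
    /**
     * Return the focal length to switch to after one zoom gesture.
     * In stepwise mode the camera moves one entry of the (ascending)
     * focal-length table toward the target; in one-step mode it jumps
     * straight to the target focal length. The result never overshoots
     * the target.
     *
     * @param focals   ascending supported focal lengths, e.g. {1, 3, 5}
     * @param current  the first focal length (before the gesture)
     * @param target   the target focal length determined from the zoom center point
     * @param stepwise true for step-by-step zooming, false for one-step zooming
     */
    public static int nextFocal(int[] focals, int current, int target, boolean stepwise) {
        if (!stepwise || current == target) return target;
        int idx = Arrays.binarySearch(focals, current);
        if (idx < 0) return target;                  // current not in the table: jump
        int step = target > current ? 1 : -1;        // direction implied by the gesture
        int next = focals[Math.max(0, Math.min(focals.length - 1, idx + step))];
        return step > 0 ? Math.min(next, target) : Math.max(next, target);
    }
}
```

For a 1× → 5× zoom-in on a {1, 3, 5} camera, stepwise mode first yields 3×, while one-step mode yields 5× immediately.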
After the electronic device determines the target focal length, if the camera includes a plurality of fixed-focus lenses, the current lens is switched to the fixed-focus lens corresponding to the second focal length; if the camera includes a zoom lens, the current focal length of the zoom lens is switched to the second focal length. Camera zooming is thereby achieved.
Step 204, displaying, on the screen, a preview image acquired by the lens after the focal length is adjusted, wherein the preview image takes the zoom center point as its center point.
The camera can obtain a real picture within the viewing range at the second focal length and transmit a preview image signal corresponding to the real picture to the screen of the electronic device, where it is displayed; the preview picture takes the zoom center point as its center point. Fig. 4 is an exemplary schematic diagram of a preview screen according to the present application. As shown in fig. 4, it is assumed that the camera includes three fixed-focus lenses: the focal length of the lens 1 is 1×, the focal length of the lens 2 is 3×, and the focal length of the lens 3 is 5×; the default lens is the lens 1 and the default focal length is 1×. The target focal length determined by the electronic device is the focal length of the lens 3. The three solid-line boxes from the outside inward represent the viewing ranges at the respective focal lengths of the lens 1, the lens 2, and the lens 3, and the black dot represents the zoom center point. The thick-line box indicates the preview picture after zooming; it can be seen that the zoom center point is located at the center of the thick-line box, and the image within the thick-line box can be acquired from the preview image signal obtained by the lens 3 and the partial preview image signals obtained by the lens 2 and the lens 1, respectively.
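How a preview centered on the zoom center point might be cut out of a wider frame can be sketched as follows. This is a hypothetical helper, not the patent's cropping/stitching pipeline; the clamping is an assumption that keeps the crop window inside the frame when the zoom center point lies near an edge.

```java
public class PreviewCrop {
    /**
     * Compute the crop window {left, bottom, width, height} of the wide
     * frame that a zoomed preview shows: 1/zoom of the frame in each
     * dimension, centered on the zoom center point (cx, cy) and clamped
     * so the window stays inside the frame. Coordinates use the
     * lower-left origin of the running example.
     */
    public static int[] cropWindow(int frameW, int frameH, double zoom, int cx, int cy) {
        int w = (int) Math.round(frameW / zoom);
        int h = (int) Math.round(frameH / zoom);
        // center the window on (cx, cy), then clamp it into the frame
        int left = Math.max(0, Math.min(frameW - w, cx - w / 2));
        int bottom = Math.max(0, Math.min(frameH - h, cy - h / 2));
        return new int[]{left, bottom, w, h};
    }
}
```

On a 1440 × 3120 frame at 3× zoom centered mid-screen, the window is 480 × 1040 and remains centered; a zoom center point in a corner pushes the window flush against the frame edge instead of leaving the frame.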
In the present application, the zoom center point is determined based on the user's operation, the target focal length is determined based on the zoom center point, and the zoomed preview picture is displayed with the zoom center point as the center point of the preview picture. The preview picture shown to the user after zooming is thus zoomed around the position the user selected, which matches the zoom effect the user expects: the target object desired by the user is enlarged or reduced in a targeted manner and placed in the central area of the preview picture, improving the success rate of photographing.
The following describes the image scaling method in detail with a specific embodiment.
The target object A appears entirely within the viewing range at the default 1× focal length. The user slides two fingers apart in the area of the screen where the target object A is displayed, intending to enlarge the preview picture with the target object A as the center while always keeping the target object A within the viewing range.
1. The screen (touch screen) captures the user's touch gesture
As shown in fig. 3a, the target object A in the viewing range at the 1× focal length is a rabbit. The user wants to enlarge the current preview picture and makes a zoom-in touch gesture on the screen, such as sliding two fingers apart.
2. Judge whether the user's touch gesture is a two-finger spread
An operating system in the electronic device (e.g., the Android system) can recognize gestures with the GestureDetector class; for two-finger zoom gestures a corresponding ScaleGestureDetector class is provided, whose OnScaleGestureListener interface receives and processes scale gesture events. In the OnScaleGestureListener interface, when a two-finger zoom gesture starts, i.e., the two fingers press down on the screen, onScaleBegin() is triggered and returns a boolean value, which is mainly used to decide whether the detector should continue to recognize and process the gesture. If a two-finger zoom gesture is not to be recognized in a certain area of the screen, onScaleBegin() can return false and the electronic device ignores the subsequent series of events of the gesture; otherwise onScaleBegin() returns true and the electronic device proceeds with the subsequent steps.
3. Obtain the zoom center point from the initial positions where the two fingers press the screen
Each listening method of the ScaleGestureDetector class receives a ScaleGestureDetector parameter carrying information about the scale gesture event: float getFocusX() returns the X coordinate of the current gesture's focal point, and float getFocusY() returns its Y coordinate, where the focal point is the midpoint of the line connecting the two initial touch points. The zoom center point (X0, Y0) is this midpoint.
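In plain Java (mirroring, but not using, the Android ScaleGestureDetector API), the zoom center point and the gesture direction of steps 2–3 can be modeled as follows; the class and method names are illustrative.

```java
public class PinchGesture {
    /**
     * The zoom center point (X0, Y0) is the midpoint of the line segment
     * connecting the two fingers' initial touch positions — the value
     * that ScaleGestureDetector reports via getFocusX()/getFocusY().
     */
    public static float[] zoomCenter(float x1, float y1, float x2, float y2) {
        return new float[]{(x1 + x2) / 2f, (y1 + y2) / 2f};
    }

    /**
     * A gesture is a zoom-in (two fingers sliding apart) when the span
     * between the fingers grows; otherwise it is a zoom-out (pinch).
     */
    public static boolean isZoomIn(float initialSpan, float currentSpan) {
        return currentSpan > initialSpan;
    }
}
```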
4. Based on the zoom center point (X0, Y0), the electronic device judges at which focal lengths the rabbit can be completely captured, so as to obtain the target focal length
In making this judgment, the electronic device may use the relationship between the field of view (FOV) and the focal length. The relationship is as follows:
image height H = EFL × tan(FOV/2);
where EFL is the effective focal length and FOV is the field angle. Once the focal length EFL is determined, its FOV can be determined, and the corresponding preview picture can then be determined.
Assume that the camera includes 3 fixed-focus lenses, where the FOV of the lens 1 (1× focal length) is 108 degrees and its focal length EFL is 50 mm, so that FOV/2 = 54° and the image height H1 = EFL × tan(FOV/2) = 50 × tan 54° ≈ 68.8 mm. Similarly, the image heights H2 and H3 under the lens 2 (3× focal length) and the lens 3 (5× focal length) can be calculated, respectively.
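The image-height computation can be reproduced directly (units: millimetres and degrees, as in the running example):

```java
public class ImageHeightCalc {
    /**
     * H = EFL * tan(FOV/2). With the example's numbers — EFL = 50 mm,
     * FOV = 108° — this gives H1 = 50 * tan(54°) ≈ 68.8 mm.
     */
    public static double imageHeight(double eflMm, double fovDegrees) {
        return eflMm * Math.tan(Math.toRadians(fovDegrees / 2.0));
    }
}
```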
Through image recognition, the image height of the rabbit at the current focal length can be calculated, giving the relative relation between the rabbit's image height and the FOV at the current focal length; combined with the zoom center point (X0, Y0) obtained in step 3, the relative relation between the rabbit's image height and the FOV at the other focal lengths can also be derived. It can therefore be judged, for each focal length, whether the rabbit can be completely captured at that focal length.
Fig. 5 is a schematic diagram of a preview screen according to the present application. As shown in fig. 5, assume the resolution of the current mobile phone screen is 1440 × 3120 (the screen resolution equals the viewing resolution, or sensor resolution). With the lower left corner of the screen as the coordinate origin (0, 0), the center position (X1, Y1) of the rabbit can be obtained by image recognition. It is then judged whether Y1 + (H2/2) is larger than 3120; if not, the rabbit can still be completely captured within the viewing range after 3× magnification. It is likewise judged whether Y1 + (H3/2) is larger than 3120; if so, the rabbit would fall outside the 5× viewing range after 5× magnification and could not be completely captured, as shown in fig. 3b. Thus, the target focal length in this embodiment is the 3× focal length, and the corresponding target lens is the lens 2.
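The screen-space coverage test of this step can be sketched as follows. The bottom-edge check is an added symmetry not spelled out in the text above; the method name is illustrative.

```java
public class CoverageCheck {
    /**
     * Step 4's test in screen coordinates with a lower-left origin:
     * an object centered at height y1 still fits vertically at a given
     * focal length if its top edge y1 + H/2 stays within the screen
     * height (and, symmetrically, its bottom edge stays above 0).
     *
     * @param y1           Y coordinate of the object's center (e.g. the rabbit)
     * @param imageHeight  the object's image height H at that focal length
     * @param screenHeight vertical screen resolution, e.g. 3120
     */
    public static boolean fitsVertically(double y1, double imageHeight, int screenHeight) {
        return y1 + imageHeight / 2.0 <= screenHeight
            && y1 - imageHeight / 2.0 >= 0;
    }
}
```

With a 3120-pixel screen, an object centered at Y1 = 1500 with H = 1000 fits, while one centered at Y1 = 2900 with the same height crosses the top edge and does not.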
5. The electronic equipment switches the fixed-focus lens or the focal length of the zoom lens to complete the zooming process
Based on the above steps, the electronic device determines that the target lens is the lens 2. Through the OnScaleGestureListener interface, the image captured by the lens 2 is displayed on the screen with (X0, Y0) as the center point. As shown in fig. 4, the preview screen after zooming includes: the preview image signal obtained by the lens 3, a partial preview image signal obtained by the lens 2, and a partial preview image signal obtained by the lens 1.
Optionally, the touch gesture in step 1 may also be a double click, for example a one-finger double click as the zoom-in gesture. In this case, step 3 uses the position of the user's finger double click as the zoom center point.
Optionally, on the premise that the target focal length is not exceeded, the electronic device in step 5 switches the current focal length to the focal length one level larger after recognizing one zoom-in gesture of the user; for example, the electronic device switches the shooting focal length from the current 1× focal length to the 3× focal length even if the rabbit could be completely captured at the 5× focal length. The user then needs to make the zoom-in gesture again, and on recognizing the second zoom-in gesture the electronic device switches the focal length from 3× to 5×.
Optionally, on the premise that the target focal length is not exceeded, step 5 may instead switch the current focal length directly to the target focal length after a zoom-in gesture of the user is recognized; for example, if the rabbit can still be completely captured at the 5× focal length, the electronic device switches the shooting focal length from the current 1× focal length directly to the 5× focal length. This makes the operation more convenient.
Fig. 6 is a schematic structural diagram of an exemplary image scaling apparatus according to the present application. As shown in fig. 6, the apparatus of this embodiment may be the electronic device shown in fig. 1. The apparatus 600 comprises: a determination module 601, a focusing module 602, and a display module 603.
a determining module 601, configured to determine a zoom center point, where the zoom center point is located in a pixel region where a target object is displayed on a screen of an electronic device, and the zoom center point is associated with an initial touch position corresponding to a zoom operation performed by a user on the screen; determining a target focal length according to the zooming central point, wherein the target focal length is the maximum or minimum focal length which can cover the whole target object in the view finding range of a camera of the electronic equipment; a focusing module 602, configured to adjust a shooting focal length of the camera from a first focal length to a second focal length; wherein the second focal length is greater than the first focal length and the second focal length is less than or equal to the target focal length; or the second focal length is smaller than the first focal length and is greater than or equal to the target focal length; a display module 603, configured to display a preview image obtained through the lens after adjusting the focal length on the screen, where the preview image takes the zoom center point as a center point.
In a possible implementation manner, when the camera includes a plurality of fixed-focus lenses, the determining module 601 is specifically configured to obtain the viewing ranges of the plurality of fixed-focus lenses, the focal lengths of the plurality of fixed-focus lenses being different; judge, for each of them, whether its viewing range completely covers the target object; when the zoom operation is a zoom-in operation, determine, as the target focal length, the largest focal length among the plurality of fixed-focus lenses whose viewing range completely covers the target object; or, when the zoom operation is a zoom-out operation, determine, as the target focal length, the smallest focal length among the plurality of fixed-focus lenses whose viewing range completely covers the target object.
In a possible implementation manner, when the camera includes a zoom lens, the determining module 601 is specifically configured to obtain viewing ranges at a plurality of focal lengths of the zoom lens; judge, for each of the plurality of focal lengths, whether its viewing range completely covers the target object; when the zoom operation is a zoom-in operation, determine, as the target focal length, the largest focal length among the plurality of focal lengths whose viewing range completely covers the target object; or, when the zoom operation is a zoom-out operation, determine, as the target focal length, the smallest such focal length.
In a possible implementation manner, the determining module 601 is specifically configured to obtain touch data of a user through a touch sensor; obtaining the zooming operation of the user according to the touch data; and determining the zooming central point according to the zooming operation.
In a possible implementation manner, the determining module 601 is specifically configured to, when the zooming operation is sliding the two fingers apart or sliding the two fingers together, obtain initial touch positions of the two fingers on the screen; and taking the middle point of a connecting line between the initial touch positions of the two fingers on the screen as the zoom central point.
In a possible implementation manner, the determining module 601 is specifically configured to determine, when the zooming operation is a one-finger double click or a two-finger double click, a touch position corresponding to the one-finger double click or the two-finger double click as the zooming center point.
In a possible implementation manner, the focusing module 602 is specifically configured to adjust a shooting lens of the camera from a lens corresponding to the first focal length to a lens corresponding to the second focal length to implement the second focal length.
In a possible implementation manner, the focusing module 602 is specifically configured to adjust the focal length of the zoom lens from the first focal length to the second focal length.
The apparatus of this embodiment may be used to implement the technical solution of the method embodiment shown in fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.
In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in a processor or instructions in the form of software. The processor may be a general purpose processor, a Digital Signal Processor (DSP), an application-specific integrated circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in the embodiments of the present application may be directly implemented by a hardware encoding processor, or implemented by a combination of hardware and software modules in the encoding processor. The software module may be located in ram, flash memory, rom, prom, or eprom, registers, etc. storage media as is well known in the art. The storage medium is located in a memory, and a processor reads information in the memory and completes the steps of the method in combination with hardware of the processor.
The memory referred to in the various embodiments above may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. Volatile memory can be random access memory (RAM), which acts as external cache memory. By way of example, but not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (personal computer, server, network device, or the like) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (18)

1. An image scaling method, comprising:
determining a zoom central point, wherein the zoom central point is located in a pixel area of a screen of an electronic device, on which a target object is displayed, and the zoom central point is associated with an initial touch position corresponding to a zoom operation performed by a user on the screen;
determining a target focal length according to the zoom center point and the zoom operation, wherein the target focal length is the maximum or minimum focal length at which a viewing range of a camera of the electronic device can cover the complete target object;
adjusting a shooting focal length of the camera from a first focal length to a second focal length; wherein the second focal length is greater than the first focal length and the second focal length is less than or equal to the target focal length; or the second focal length is smaller than the first focal length and is greater than or equal to the target focal length;
displaying a preview image obtained by cutting and splicing a plurality of preview image signals on the screen, wherein the preview image takes the zoom central point as a central point, the preview image signals are preview image signals obtained under a lens with a plurality of focal lengths, and the plurality of focal lengths at least comprise the first focal length and the second focal length.
2. The method of claim 1, wherein when the camera includes a plurality of fixed focus lenses, the determining a target focal length from the zoom center point comprises:
acquiring the framing ranges of the plurality of fixed-focus lenses, wherein the focal lengths of the plurality of fixed-focus lenses are different;
respectively judging whether the viewing ranges of the plurality of fixed-focus lenses completely cover the target object;
when the zoom operation is a zoom-in operation, determining, as the target focal length, the largest focal length among the plurality of fixed-focus lenses whose viewing range completely covers the target object; or,
when the zoom operation is a zoom-out operation, determining, as the target focal length, the smallest focal length among the plurality of fixed-focus lenses whose viewing range completely covers the target object.
3. The method of claim 1, wherein when the camera comprises a zoom lens, said determining a target focal length from the zoom center point comprises:
acquiring viewing ranges at a plurality of focal lengths of the zoom lens;
respectively judging whether the viewing ranges of the plurality of focal lengths completely cover the target object;
when the zoom operation is a zoom-in operation, determining, as the target focal length, the largest focal length among the plurality of focal lengths whose viewing range completely covers the target object; or,
when the zoom operation is a zoom-out operation, determining, as the target focal length, the smallest focal length among the plurality of focal lengths whose viewing range completely covers the target object.
4. The method of any of claims 1-3, wherein the determining a zoom center point comprises:
acquiring touch data of a user through a touch sensor;
obtaining the zooming operation of the user according to the touch data;
and determining the zooming central point according to the zooming operation.
5. The method of claim 4, wherein determining the zoom center point according to the zoom operation comprises:
when the zooming operation is that the two fingers slide separately or the two fingers slide together, acquiring initial touch positions of the two fingers on a screen;
and taking the middle point of a connecting line between the initial touch positions of the two fingers on the screen as the zoom central point.
6. The method of claim 4, wherein determining the zoom center point according to the zoom operation comprises:
and when the zooming operation is one-finger double-click or two-finger double-click, determining the touch position corresponding to the one-finger double-click or the two-finger double-click as the zooming central point.
7. The method of claim 2, wherein adjusting the camera's capture focal length from a first focal length to a second focal length comprises:
and adjusting a shooting lens of the camera from a lens corresponding to the first focal length to a lens corresponding to the second focal length to realize the second focal length.
8. The method of claim 3, wherein adjusting the camera's capture focal length from a first focal length to a second focal length comprises:
adjusting the focal length of the zoom lens from the first focal length to the second focal length.
9. An image scaling apparatus, comprising:
a determining module, configured to determine a zoom center point, wherein the zoom center point is located in a pixel area of a screen of the electronic device in which a target object is displayed, and the zoom center point is associated with an initial touch position corresponding to a zoom operation performed by a user on the screen; and determine a target focal length according to the zoom center point and the zoom operation, wherein the target focal length is the maximum or minimum focal length at which a viewing range of a camera of the electronic device can cover the complete target object;
the focusing module is used for adjusting the shooting focal length of the camera from a first focal length to a second focal length; wherein the second focal length is greater than the first focal length and the second focal length is less than or equal to the target focal length; or the second focal length is smaller than the first focal length and is greater than or equal to the target focal length;
the display module is used for displaying a preview image acquired by cutting and splicing a plurality of preview image signals on the screen, wherein the preview image takes the zoom central point as a central point, the preview image signals are preview image signals acquired under a lens with a plurality of focal lengths, and the plurality of focal lengths at least comprise the first focal length and the second focal length.
10. The apparatus according to claim 9, wherein when the camera includes a plurality of fixed-focus lenses, the determining module is specifically configured to obtain the viewing ranges of the plurality of fixed-focus lenses, the focal lengths of the plurality of fixed-focus lenses being different; judge, for each of them, whether its viewing range completely covers the target object; when the zoom operation is a zoom-in operation, determine, as the target focal length, the largest focal length among the plurality of fixed-focus lenses whose viewing range completely covers the target object; or, when the zoom operation is a zoom-out operation, determine, as the target focal length, the smallest focal length among the plurality of fixed-focus lenses whose viewing range completely covers the target object.
11. The apparatus according to claim 9, wherein, when the camera comprises a zoom lens, the determining module is specifically configured to: obtain viewing ranges of the zoom lens at a plurality of focal lengths; determine, for each of the plurality of focal lengths, whether its viewing range completely covers the target object; when the zoom operation is a zoom-in operation, determine, among the focal lengths whose viewing ranges completely cover the target object, the largest focal length as the target focal length; or, when the zoom operation is a zoom-out operation, determine, among the focal lengths whose viewing ranges completely cover the target object, the smallest focal length as the target focal length.
12. The apparatus according to any one of claims 9-11, wherein the determining module is specifically configured to: obtain touch data of the user via a touch sensor; derive the zoom operation of the user from the touch data; and determine the zoom center point according to the zoom operation.
13. The apparatus according to claim 12, wherein the determining module is specifically configured to: when the zoom operation is a two-finger spread slide or a two-finger pinch slide, obtain the initial touch positions of the two fingers on the screen; and use the midpoint of the line connecting the initial touch positions of the two fingers on the screen as the zoom center point.
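The midpoint rule of claim 13 is a one-line computation; a minimal sketch, assuming the two initial touch positions arrive as (x, y) pixel coordinates:

```python
def zoom_center(p1, p2):
    """Midpoint of the line connecting the two fingers' initial touch
    positions; used as the zoom center point for a pinch gesture."""
    (x1, y1), (x2, y2) = p1, p2
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

# Two fingers land at (100, 200) and (300, 400); the zoom is centered
# between them rather than at the screen center.
print(zoom_center((100, 200), (300, 400)))
```

Using the initial (rather than current) touch positions keeps the center point stable while the fingers spread or pinch.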
14. The apparatus according to claim 12, wherein the determining module is specifically configured to: when the zoom operation is a single-finger double tap or a two-finger double tap, determine the touch position corresponding to the single-finger double tap or the two-finger double tap as the zoom center point.
15. The apparatus according to claim 10, wherein the focusing module is specifically configured to switch the shooting lens of the camera from the lens corresponding to the first focal length to the lens corresponding to the second focal length, so as to achieve the second focal length.
16. The apparatus of claim 11, wherein the focusing module is specifically configured to adjust the focal length of the zoom lens from the first focal length to the second focal length.
17. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-8.
18. A computer-readable storage medium, comprising a computer program which, when executed on a computer, causes the computer to perform the method of any one of claims 1-8.
CN202011340842.3A 2020-11-25 2020-11-25 Image scaling method and device Active CN112637481B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011340842.3A CN112637481B (en) 2020-11-25 2020-11-25 Image scaling method and device

Publications (2)

Publication Number Publication Date
CN112637481A CN112637481A (en) 2021-04-09
CN112637481B true CN112637481B (en) 2022-03-29

Family

ID=75303947

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011340842.3A Active CN112637481B (en) 2020-11-25 2020-11-25 Image scaling method and device

Country Status (1)

Country Link
CN (1) CN112637481B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113364976B (en) * 2021-05-10 2022-07-15 荣耀终端有限公司 Image display method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107809581A (en) * 2017-09-29 2018-03-16 天津远翥科技有限公司 Image processing method, device, terminal device and unmanned plane
CN108170350A (en) * 2017-12-28 2018-06-15 努比亚技术有限公司 Realize method, terminal and the computer readable storage medium of Digital Zoom
CN110958387A (en) * 2019-11-19 2020-04-03 维沃移动通信有限公司 Content updating method and electronic equipment
CN111182205A (en) * 2019-12-30 2020-05-19 维沃移动通信有限公司 Photographing method, electronic device, and medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3076657B1 (en) * 2015-04-02 2017-05-24 Axis AB Method for determination of focal length for a zoom lens


Also Published As

Publication number Publication date
CN112637481A (en) 2021-04-09

Similar Documents

Publication Publication Date Title
US11595566B2 (en) Camera switching method for terminal, and terminal
WO2020073959A1 (en) Image capturing method, and electronic device
CN113727016A (en) Shooting method and electronic equipment
CN112954218A (en) Multi-channel video recording method and equipment
US20220321797A1 (en) Photographing method in long-focus scenario and terminal
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN113810600B (en) Terminal image processing method and device and terminal equipment
WO2020078273A1 (en) Photographing method, and electronic device
EP4117272A1 (en) Image processing method and apparatus
US20220342516A1 (en) Display Method for Electronic Device, Electronic Device, and Computer-Readable Storage Medium
CN114489533A (en) Screen projection method and device, electronic equipment and computer readable storage medium
CN114880251A (en) Access method and access device of storage unit and terminal equipment
CN113497851B (en) Control display method and electronic equipment
CN112637481B (en) Image scaling method and device
CN115412678B (en) Exposure processing method and device and electronic equipment
US20230370718A1 (en) Shooting Method and Electronic Device
CN115706869A (en) Terminal image processing method and device and terminal equipment
CN115696067B (en) Image processing method for terminal, terminal device and computer readable storage medium
WO2022105670A1 (en) Display method and terminal
CN117714858A (en) Image processing method, electronic equipment and readable storage medium
CN116582743A (en) Shooting method, electronic equipment and medium
CN117319369A (en) File delivery method, electronic device and storage medium
CN117714849A (en) Image shooting method and related equipment
CN116069156A (en) Shooting parameter adjusting method, electronic equipment and storage medium
CN115712368A (en) Volume display method, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant