WO2022237287A1 - Image display method and electronic device - Google Patents

Image display method and electronic device

Info

Publication number
WO2022237287A1
WO2022237287A1 (PCT/CN2022/079142)
Authority
WO
WIPO (PCT)
Prior art keywords
image
electronic device
camera
target area
resolution
Prior art date
Application number
PCT/CN2022/079142
Other languages
English (en)
French (fr)
Inventor
肖斌
丁大钧
陆洋
王宇
朱聪超
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority to EP22777935.2A (published as EP4117276B1)
Priority to US17/919,579 (published as US20240214669A1)
Publication of WO2022237287A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635Region indicators; Field of view indicators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/69Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Definitions

  • the embodiments of the present application relate to the technical field of electronic equipment, and in particular, to an image display method and electronic equipment.
  • electronic devices such as mobile phones, tablet computers or smart watches, etc.
  • cameras can be installed in most electronic devices, so that the electronic devices have the function of capturing images.
  • multiple cameras can be installed in the mobile phone, such as a main camera, a telephoto camera, and a wide-angle camera.
  • the mobile phone can use different cameras to capture images in the same shooting scene to obtain images with different characteristics. For example, based on the feature of long focal length of the telephoto camera, the mobile phone can use the telephoto camera to capture a partially clear telephoto image. For another example, based on the characteristics of large light input and high resolution of the main camera, the mobile phone can use the main camera to capture relatively clear images as a whole. For another example, based on the characteristics of short focal length and large viewing angle of the wide-angle camera, the mobile phone can use the wide-angle camera to capture images with a large viewing angle. After that, users can view images with different characteristics through their mobile phones.
  • the electronic device needs to switch the displayed images so that the user can view images with different characteristics through the mobile phone.
  • the process of displaying images with different characteristics on electronic devices is relatively cumbersome, which affects user experience.
  • the present application provides an image display method and an electronic device, which can simplify the process of displaying images with different characteristics on the electronic device, and improve user experience.
  • the present application provides an image display method, which can be applied to an electronic device, where the electronic device includes a display screen, a first camera, and a second camera, and the field angle of the first camera is larger than the field angle of the second camera.
  • the electronic device may receive a first operation of the user on a first interface, where the first interface is a viewfinder interface for the electronic device to take pictures, and the first interface includes a preview image captured by the first camera.
  • the electronic device may save the first image captured by the first camera and the second image captured by the second camera; the viewfinder range of the first image captured by the first camera is the first viewfinder range, the viewfinder range of the second image is the second viewfinder range, and the first viewfinder range is larger than the second viewfinder range.
  • the electronic device may receive a second operation, and the second operation is used to trigger the electronic device to display the first image.
  • the electronic device may display the first image.
  • the electronic device may receive a third operation, the third operation is used to trigger the electronic device to display the first area image in the enlarged first image; the first area image includes the first target area image, the viewfinder range of the first target area image relative to the first camera is a third viewfinder range, the first viewfinder range includes the third viewfinder range, and the third viewfinder range overlaps with the second viewfinder range.
  • the electronic device may stitch and display the second image on the first target area image.
  • the electronic device may display the area in the first area image except the first target area image and the second image.
  • the electronic device can simultaneously have the characteristics of the first image and the second image in one image, thereby ensuring that the user can view the first image and the second image at the same time.
  • the technical solution of the present application can simplify the process of displaying images with different characteristics on electronic devices and improve user experience.
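The stitching behavior described above can be sketched as a toy model. This is an illustrative sketch, not the patent's implementation: images are modeled as 2D lists of labels, and the function name, coordinates, and sizes are all assumptions invented for illustration.

```python
# Sketch: when the user zooms into a region of the wide-FOV "first image" that
# overlaps the viewfinder range of the narrow-FOV "second image", paste the
# second image's pixels over that target area, so the displayed region combines
# the characteristics of both images.

def stitch_target_area(first_region, second_image, top, left):
    """Return a copy of first_region with second_image pasted at (top, left)."""
    out = [row[:] for row in first_region]      # copy the zoomed-in region
    for r, row in enumerate(second_image):      # overlay the detailed image
        for c, px in enumerate(row):
            out[top + r][left + c] = px
    return out

# 6x6 zoomed-in region of the first (wide) image.
first_region = [["wide"] * 6 for _ in range(6)]
# 2x2 second (tele) image whose viewfinder range maps to rows/cols 2..3.
second_image = [["tele"] * 2 for _ in range(2)]

stitched = stitch_target_area(first_region, second_image, top=2, left=2)
```

The area of the zoomed region outside the target area is left untouched, while the overlapping target area now carries the second image's detail.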
  • the method further includes: in response to the second operation, the electronic device may stitch and display the second image on the second target area image in the first image, where the viewfinder range of the second target area image relative to the first camera coincides with the second viewfinder range.
  • when the electronic device displays the first image, the second image can be directly displayed on the second target area image in the first image without receiving an operation (for example, a third operation) from the user. In this way, the electronic device can simultaneously display a part of the first image and the second image.
  • the technical solution of the present application can simplify the process of displaying images with different characteristics on electronic devices and improve user experience.
  • the method further includes: the electronic device may acquire the resolution of the first region image and the resolution of the display screen.
  • the electronic device may calculate a first ratio according to the resolution of the first area image and the resolution of the display screen, where the first ratio is the ratio of the resolution of the first area image to the resolution of the display screen. Afterwards, if the first ratio is greater than the first preset ratio, the electronic device stitches and displays the second image on the first target area image.
  • the electronic device can control the timing of splicing and displaying the second image on the first target area image. In this way, the electronic device can display the second image under a preset condition, which improves user experience.
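The threshold check described above can be sketched minimally as follows, assuming resolutions are expressed as total pixel counts and the threshold semantics are exactly as stated in the text; the function name and the preset value are assumptions, not taken from the patent.

```python
# Sketch of the trigger condition: the second image is stitched in only once
# the first ratio (region resolution over display resolution) exceeds a
# preset ratio.

def should_stitch(region_resolution, display_resolution, preset_ratio=1.0):
    """First ratio = resolution of the area image / resolution of the screen."""
    first_ratio = region_resolution / display_resolution
    return first_ratio > preset_ratio
```

For example, with a 4000x3000 region image (12,000,000 pixels) and a 2400x1080 display (2,592,000 pixels), the first ratio is about 4.6, which exceeds a preset ratio of 1.0.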
  • the method further includes: the electronic device may acquire a first magnification, where the first magnification is the zoom magnification after the third operation triggers the electronic device to enlarge the first image. If the first magnification is greater than the first preset zoom magnification, the electronic device stitches and displays the second image on the first target area image.
  • the electronic device can control to splice and display the second image on the first target area image. In this way, the electronic device can display the second image under a preset condition, which improves user experience.
  • the acquisition of the first magnification by the electronic device may include: the electronic device may acquire the zoom magnification of the first image, the resolution of the first image, and the resolution of the first region image. Afterwards, the electronic device calculates the first magnification according to the zoom magnification of the first image, the resolution of the first image, and the resolution of the first region image.
  • after the electronic device acquires the zoom magnification of the first image, the resolution of the first image, and the resolution of the first region image, it can calculate the first magnification and then control whether the second image is stitched and displayed on the first target area image. In this way, the electronic device can display the second image under a preset condition, which improves user experience.
  • the first magnification satisfies a formula in which:
  • M is used to represent the first magnification;
  • B is used to represent the resolution of the first image;
  • A is used to represent the resolution of the first region image;
  • Z is used to represent the zoom ratio of the first image.
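The formula itself is not reproduced in this text. Purely to illustrate how M could relate to Z, B, and A, the sketch below assumes resolutions are total pixel counts and that cropping a B-pixel image down to an A-pixel region enlarges it by sqrt(B/A) per linear dimension; this assumed relationship is not the patent's verbatim formula.

```python
import math

def first_magnification(Z, B, A):
    """ASSUMED form: M = Z * sqrt(B / A). Illustrative only; the patent's
    actual formula is not reproduced in this extraction."""
    return Z * math.sqrt(B / A)
```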
  • in response to the third operation, the electronic device may perform blurring processing on areas in the first area image other than the first target area image.
  • the electronic device blurs the areas in the first area image other than the first target area image, which can reduce visible artifacts at the seam between the second image and the first area image. In this way, the quality of the image displayed by the electronic device can be improved, improving user experience.
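An assumed sketch of this blurring step: a 3x3 mean filter stands in for whatever blur the device actually applies, and the image values and rectangle coordinates are invented for illustration.

```python
# Blur the part of the zoomed region that lies OUTSIDE the target area, so the
# sharp stitched image blends more smoothly with its lower-detail surroundings.

def blur_outside_target(img, top, left, h, w):
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]
    for r in range(rows):
        for c in range(cols):
            if top <= r < top + h and left <= c < left + w:
                continue                      # keep the target area sharp
            # 3x3 neighborhood mean, clamped at the image borders
            vals = [img[rr][cc]
                    for rr in range(max(0, r - 1), min(rows, r + 2))
                    for cc in range(max(0, c - 1), min(cols, c + 2))]
            out[r][c] = sum(vals) / len(vals)
    return out

img = [[0, 0, 0, 0],
       [0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
blurred = blur_outside_target(img, top=1, left=1, h=2, w=2)
```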
  • the electronic device further includes a third camera, and the field angle of the second camera is larger than the field angle of the third camera.
  • the method further includes: in response to the photographing operation, the electronic device may store a third image captured by the third camera, the viewing range of the third camera capturing the third image is a fourth viewing range, and the second viewing range is larger than the fourth viewing range.
  • the electronic device may receive a fourth operation, the fourth operation is used to trigger the electronic device to display a second area image in the enlarged second image, where the second area image includes the third target area image, the viewfinder range of the third target area image relative to the second camera is the fifth viewfinder range, the second viewfinder range includes the fifth viewfinder range, and the fifth viewfinder range overlaps with the fourth viewfinder range.
  • the electronic device can acquire the resolution of the image in the second area and the resolution of the display screen.
  • the electronic device may calculate a second ratio according to the resolution of the second area image and the resolution of the display screen, where the second ratio is the ratio of the resolution of the second area image to the resolution of the display screen. If the second ratio is greater than the first preset ratio, the electronic device stitches and displays the third image on the third target area image.
  • the electronic device may display the third image and the area in the second area image other than the third target area image.
  • the electronic device can simultaneously have the characteristics of the second image and the third image in one image, thereby ensuring that the user can view the second image and the third image at the same time.
  • the technical solution of the present application can simplify the process of displaying images with different characteristics on electronic devices and improve user experience.
  • the electronic device may receive a fourth operation, the fourth operation is used to trigger the electronic device to display a second area image in the enlarged second image, where the second area image includes the third target area image, the viewfinder range of the third target area image relative to the second camera is the fifth viewfinder range, the second viewfinder range includes the fifth viewfinder range, and the fifth viewfinder range overlaps with the fourth viewfinder range.
  • the electronic device may acquire a second magnification, and the second magnification is a zooming magnification after the fourth operation triggers the electronic device to enlarge the second image. If the second magnification is greater than the second preset zoom magnification, the electronic device displays the third image spliced on the third target area image.
  • the electronic device may display the third image and the area in the second area image other than the third target area image.
  • the electronic device can simultaneously have the characteristics of the second image and the third image in one image, thereby ensuring that the user can view the second image and the third image at the same time.
  • the technical solution of the present application can simplify the process of displaying images with different characteristics on electronic devices and improve user experience.
  • the present application provides an electronic device, which includes: a memory, a display screen, and one or more processors, where the memory and the display screen are coupled with the processor; the memory is used to store computer program code, and the computer program code includes computer instructions; when the computer instructions are executed by the one or more processors, the processor is used to receive a first operation of the user on a first interface, where the first interface is a viewfinder interface for the electronic device to take pictures and includes a preview image captured by the first camera.
  • the above-mentioned memory is used to store the first image captured by the first camera and the second image captured by the second camera in response to the first operation.
  • the viewfinder range in which the first camera collects the first image is the first viewfinder range, the viewfinder range in which the second camera collects the second image is the second viewfinder range, and the first viewfinder range is larger than the second viewfinder range.
  • the display screen is configured to display the first image in response to the second operation.
  • the processor is further configured to receive a third operation, the third operation is used to trigger the electronic device to display the first area image in the enlarged first image; the first area image includes the first target area image, the viewfinder range of the first target area image relative to the first camera is the third viewfinder range, the first viewfinder range includes the third viewfinder range, and the third viewfinder range overlaps with the second viewfinder range.
  • the above-mentioned processor is further configured to splice and display the second image on the first target area image in response to the third operation.
  • the processor is further configured to, in response to the second operation, stitch and display the second image on the second target area image in the first image, where the viewfinder range of the second target area image relative to the first camera coincides with the second viewfinder range.
  • the processor is further configured to acquire the resolution of the first area image and the resolution of the display screen.
  • the above-mentioned processor is further configured to calculate a first ratio according to the resolution of the image in the first area and the resolution of the display screen, where the first ratio is the ratio between the resolution of the image in the first area and the resolution of the display screen.
  • the above processor is further configured to splice the second image on the first target area image for display if the first ratio is greater than the first preset ratio.
  • when the computer instructions are executed by the one or more processors, the processor is also used to acquire the first magnification, where the first magnification is the zoom magnification after the third operation triggers the electronic device to enlarge the first image.
  • the above-mentioned processor is further configured to splice and display the second image on the first target area image if the first magnification is greater than the first preset zoom magnification.
  • the processor is further configured to acquire the zoom magnification of the first image, the resolution of the first image, and the resolution of the first region image.
  • the above-mentioned processor is further configured to calculate and obtain the first magnification according to the zoom magnification of the first image, the resolution of the first image, and the resolution of the first region image.
  • the first magnification satisfies a formula in which:
  • M is used to represent the first magnification;
  • B is used to represent the resolution of the first image;
  • A is used to represent the resolution of the first region image;
  • Z is used to represent the zoom ratio of the first image.
  • the above-mentioned processor is further configured to, in response to the third operation, perform blurring processing on areas in the first area image other than the first target area image.
  • the electronic device further includes a third camera, and the field angle of the second camera is larger than the field angle of the third camera.
  • the above-mentioned memory is also used to store the third image captured by the third camera in response to the photographing operation, and the viewfinder range of the third camera to collect the third image is the fourth viewfinder Range, the second viewfinder range is larger than the fourth viewfinder range.
  • the processor is further configured to receive a fourth operation, the fourth operation is used to trigger the display screen to display the second region image in the enlarged second image; the second region image includes the third target region image, the viewfinder range of the third target region image relative to the second camera is the fifth viewfinder range, the second viewfinder range includes the fifth viewfinder range, and the fifth viewfinder range overlaps with the fourth viewfinder range.
  • the above-mentioned processor is also used to acquire the resolution of the image in the second area and the resolution of the display screen.
  • the above-mentioned processor is further configured to calculate a second ratio according to the resolution of the second area image and the resolution of the display screen, and the second ratio is the ratio between the resolution of the second area image and the resolution of the display screen.
  • the above-mentioned processor is further configured to splice and display the third image on the third target area image if the second ratio is greater than the first preset ratio.
  • the processor is further configured to receive a fourth operation, the fourth operation is used to trigger the display screen to display the second region image in the enlarged second image; the second region image includes the third target region image, the viewfinder range of the third target region image relative to the second camera is the fifth viewfinder range, the second viewfinder range includes the fifth viewfinder range, and the fifth viewfinder range overlaps with the fourth viewfinder range.
  • the processor above is further configured to acquire a second magnification, where the second magnification is a zoom magnification after the fourth operation triggers the electronic device to enlarge the second image.
  • the above-mentioned processor is further configured to splice and display the third image on the third target area image if the second magnification is greater than the second preset zoom magnification.
  • the present application provides an electronic device, which includes: a memory, a display screen, and one or more processors, the above-mentioned memory, the display screen and the above-mentioned processor are coupled; the memory is used to store computer program codes, and the computer program The code includes computer instructions; when the computer instructions are executed by the above-mentioned one or more processors, the electronic device is made to execute the method described in the first aspect and any possible design manner thereof.
  • the present application provides a chip system, which is applied to an electronic device.
  • the system-on-a-chip includes one or more interface circuits and one or more processors.
  • the interface circuit and the processor are interconnected by wires.
  • the interface circuit is for receiving a signal from the memory of the electronic device and sending the signal to the processor, the signal including computer instructions stored in the memory.
  • the processor executes the computer instructions
  • the electronic device executes the method described in the first aspect and any possible design manner thereof.
  • the present application provides a computer-readable storage medium, the computer-readable storage medium includes computer instructions, and when the computer instructions are run on an electronic device, the electronic device is caused to execute the method described in the first aspect and any possible design manner thereof.
  • the present application provides a computer program product, which, when running on a computer, causes the computer to execute the method described in the first aspect and any possible design manner thereof.
  • for the beneficial effects achieved by the electronic device described in the second aspect and any possible design manner thereof, the electronic device described in the third aspect, the chip system described in the fourth aspect, the computer-readable storage medium described in the fifth aspect, and the computer program product described in the sixth aspect, reference may be made to the beneficial effects of the first aspect and any possible design manner thereof, which will not be repeated here.
  • FIG. 1 is a schematic diagram of an example of an image viewing interface provided by an embodiment of the present application
  • FIG. 2A is a schematic diagram of a hardware structure of an electronic device provided in an embodiment of the present application.
  • FIG. 2B is a schematic diagram of an example of a viewing range provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an example of an image preview interface provided by an embodiment of the present application.
  • FIG. 4A is a schematic diagram of an example of an interface for saving an image provided by an embodiment of the present application.
  • FIG. 4B is a schematic diagram of an example of another interface for saving an image provided by the embodiment of the present application.
  • FIG. 5 is a schematic diagram of an example of a display image provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an example of another image viewing interface provided by the embodiment of the present application.
  • FIG. 7 is a schematic diagram of an example of a viewing range of an image provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an example of another image viewing interface provided by the embodiment of the present application.
  • FIG. 9 is a schematic diagram of an example of another image preview interface provided by the embodiment of the present application.
  • FIG. 10 is a schematic diagram of an example of another image viewing interface provided by the embodiment of the present application.
  • FIG. 11 is a schematic diagram of an example of another image preview interface provided by the embodiment of the present application.
  • FIG. 12 is a schematic diagram of an example of another image viewing interface provided by the embodiment of the present application.
  • FIG. 13 is a schematic diagram of an example of another image preview interface provided by the embodiment of the present application.
  • FIG. 14 is a schematic diagram of an example of another image viewing interface provided by the embodiment of the present application.
  • FIG. 15 is a schematic diagram of the structural composition of a chip system provided by an embodiment of the present application.
  • A/B can be understood as A or B.
  • the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of these features. In the description of this embodiment, unless otherwise specified, "plurality" means two or more.
  • words such as "exemplary" or "for example" are used to present examples, instances, or illustrations. Any embodiment or design described herein as "exemplary" or "for example" is not to be construed as preferred or advantageous over other embodiments or designs. Rather, use of these words is intended to present concepts in a concrete manner.
  • Super-resolution reconstruction refers to using one or a group of low-quality, low-resolution images to generate a high-quality, high-resolution image.
  • the super-resolution reconstruction may include a reconstruction-based method or a learning-based method.
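As a toy illustration of the reconstruction-based idea only (not a production super-resolution algorithm): several aligned, noisy low-resolution observations of the same scene are averaged to suppress noise, then upsampled to the target resolution. A real reconstruction-based method would additionally register the frames, interpolate, and deconvolve.

```python
# Naive multi-frame "super-resolution": average the aligned low-res frames,
# then upsample by pixel replication. Frame contents here are invented.

def naive_super_resolution(frames, scale=2):
    rows, cols = len(frames[0]), len(frames[0][0])
    # average the aligned low-resolution frames to suppress noise
    avg = [[sum(f[r][c] for f in frames) / len(frames) for c in range(cols)]
           for r in range(rows)]
    # upsample by replication (a real method would interpolate and deconvolve)
    return [[avg[r // scale][c // scale] for c in range(cols * scale)]
            for r in range(rows * scale)]

# two noisy 1x2 observations of the same scene
frames = [[[0, 2]],
          [[2, 2]]]
hr = naive_super_resolution(frames, scale=2)
```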
  • the electronic device can transmit the original image to the ISP module.
  • the RAW format is an unprocessed and uncompressed format.
  • the ISP module can analyze the raw image to check the density gap between adjacent pixels in the image. Then, the ISP module can use a preset adjustment algorithm to properly adjust the original image, so as to improve the quality of the image captured by the camera.
  • a mobile phone As an example, multiple cameras can be installed in the mobile phone, such as a main camera, a telephoto camera, and a wide-angle camera.
  • the mobile phone can use different cameras to capture images in the same shooting scene to obtain images with different characteristics.
  • the user when a user takes different images of the same scene (such as a main image, a telephoto image and a wide-angle image) through an electronic device, the user needs to switch the shooting mode of the electronic device to obtain different images of the same scene.
  • the main image is an image collected by the electronic device through the main camera
  • the telephoto image is the image collected by the electronic device through the telephoto camera
  • the wide-angle image is the image collected by the electronic device through the wide-angle camera.
  • the electronic device needs to display images with different characteristics in response to a user's switching operation.
  • the electronic device displays an image display interface 101 , and the image display interface 101 includes a wide-angle image 102 .
  • the electronic device may switch the displayed wide-angle image 102 to the main image 103 as shown in (b) in FIG. 1 .
  • the electronic device may switch the displayed main image 103 to the telephoto image 104 as shown in (c) of FIG. 1 .
  • the electronic device needs to respond to multiple operations of the user before it can capture multiple images with different characteristics; moreover, the electronic device needs to switch the displayed images so that the user can view images with different characteristics through the mobile phone.
  • an embodiment of the present application provides an image display method.
  • the electronic device may, in response to a single photographing operation of the user, obtain the first image and the second image through different cameras, where the viewing range of the first image is larger than the viewing range of the second image.
  • the electronic device can obtain multiple images with different characteristics only in response to one operation of the user, which simplifies the shooting process of the electronic device.
  • when the electronic device displays the first image, if the electronic device receives an operation of enlarging the zoom ratio of the first image, the electronic device may display the second image on an upper layer of the first image. In this way, the electronic device can display the first image and the second image at the same time, so that the user can view images with different characteristics at the same time, which improves the user experience.
  • the image collected by the electronic device through the camera may be: an image after the ISP adjusts the original image collected by the camera.
  • the image collected by the electronic device through the main camera is: an image obtained by adjusting the original image collected by the main camera by the ISP;
  • the image collected by the electronic device through the telephoto camera is: an image obtained by adjusting the original image collected by the telephoto camera by the ISP;
  • the image collected by the electronic device through the wide-angle camera is: an image obtained by adjusting the original image collected by the wide-angle camera by the ISP.
  • the image captured by the electronic device through the camera in the embodiment of the present application may be an image in RAW format (that is, an original image), which is not limited in the embodiment of the present application.
  • the image in RAW format is an image that records the original information of the camera sensor, together with some metadata generated at capture time (ISO setting, shutter speed, aperture value, white balance, etc.), and that has not been processed by the ISP module.
  • ISO is the abbreviation of International Organization for Standardization.
  • the electronic device in the embodiment of the present application may be a tablet computer, a mobile phone, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a cellular telephone, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, a vehicle-mounted device or other equipment; the embodiment of the present application does not specially limit the specific form of the electronic device.
  • the image display method provided in the present application may be executed by an image display device, and the execution device may be the electronic device shown in FIG. 2A . Alternatively, the executing device may be a central processing unit (Central Processing Unit, CPU) of the electronic device, or a control module in the electronic device for fusing images.
  • the method for displaying an image provided by the embodiment of the present application is described by taking the method for displaying an image performed by an electronic device as an example.
  • this application takes the mobile phone 200 shown in FIG. 2A as an example to introduce the electronic device provided by this application.
  • the mobile phone 200 shown in FIG. 2A is only an example of an electronic device, and the mobile phone 200 may have more or fewer components than those shown in the figure, may combine two or more components, or may have a different arrangement of components.
  • the various components shown in FIG. 2A may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the mobile phone 200 may include: a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (universal serial bus, USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone jack 270D, a sensor module 280, a button 290, a motor 291, an indicator 292, a camera 293, a display screen 294, a subscriber identification module (subscriber identification module, SIM) card interface 295, etc.
  • the above-mentioned sensor module 280 may include sensors such as pressure sensor, gyroscope sensor, air pressure sensor, magnetic sensor, acceleration sensor, distance sensor, proximity light sensor, fingerprint sensor, temperature sensor, touch sensor, ambient light sensor and bone conduction sensor.
  • the processor 210 may include one or more processing units, for example: the processor 210 may include a memory, a video codec, a baseband processor, and/or a neural-network processing unit (NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the handset 200 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of instruction fetching and execution.
  • a memory may also be provided in the processor 210 for storing instructions and data.
  • the memory in processor 210 is a cache memory.
  • processor 210 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, etc.
  • the interface connection relationship between the modules shown in this embodiment is only a schematic illustration, and does not constitute a structural limitation of the mobile phone 200 .
  • the mobile phone 200 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 240 is configured to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger. While the charging management module 240 is charging the battery 242 , it can also supply power to the electronic device through the power management module 241 .
  • the power management module 241 is used for connecting the battery 242 , the charging management module 240 and the processor 210 .
  • the power management module 241 receives the input from the battery 242 and/or the charging management module 240 to provide power for the processor 210 , internal memory 221 , external memory, display screen 294 , camera 293 , and wireless communication module 260 .
  • the power management module 241 and the charging management module 240 can also be set in the same device.
  • the wireless communication function of the mobile phone 200 can be realized by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor and the baseband processor.
  • the antenna 1 of the mobile phone 200 is coupled to the mobile communication module 250, and the antenna 2 is coupled to the wireless communication module 260, so that the mobile phone 200 can communicate with the network and other devices through wireless communication technology.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in handset 200 can be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • the mobile communication module 250 can provide wireless communication solutions including 2G/3G/4G/5G applied on the mobile phone 200 .
  • the mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 250 can also amplify the signal modulated by the modem processor, convert it into electromagnetic wave and radiate it through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 250 may be set in the processor 210 .
  • the wireless communication module 260 can provide wireless communication solutions applied on the mobile phone 200, including wireless local area networks (wireless local area networks, WLAN) (such as wireless fidelity (Wi-Fi) networks), frequency modulation (frequency modulation, FM), infrared technology (infrared, IR), and the like.
  • the wireless communication module 260 may be one or more devices integrating at least one communication processing module.
  • the mobile phone 200 realizes the display function through the GPU, the display screen 294, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 294 and the application processor.
  • the display screen 294 is used to display images, videos and the like.
  • the display screen 294 includes a display panel.
  • the display screen 294 may be used to display a gallery interface, a shooting interface, and the like.
  • the mobile phone 200 can realize the shooting function through ISP, camera 293 , video codec, GPU, display screen 294 and application processor.
  • the ISP is used for processing the data fed back by the camera 293 .
  • Camera 293 is used to capture still images or video.
  • the mobile phone 200 may include 1 or N cameras 293, where N is a positive integer greater than 1.
  • the N cameras 293 may include: a main camera, a telephoto camera, and a wide-angle camera.
  • the N cameras 293 may also include: at least one camera such as an infrared camera, a depth camera, or a black and white camera. The following briefly introduces the characteristics (that is, advantages and disadvantages) and applicable scenarios of the above-mentioned cameras.
  • the main camera has the characteristics of large light input, high resolution, and moderate field of view.
  • the main camera is generally used as the default camera of an electronic device (such as a mobile phone). That is to say, the electronic device (such as a mobile phone) can start the main camera by default in response to the user's operation of starting the "camera” application, and display the image captured by the main camera on the preview interface.
  • the telephoto camera has a longer focal length, which is suitable for shooting objects that are far away from the mobile phone (ie, distant objects).
  • the amount of light entering the telephoto camera is small.
  • Using the telephoto camera to capture images in low-light scenes may affect the image quality due to insufficient light input.
  • the field of view of the telephoto camera is small, so it is not suitable for shooting images of larger scenes, that is, it is not suitable for shooting larger objects (such as buildings or landscapes, etc.).
  • the wide-angle camera has a wider field of view and is suitable for capturing larger subjects, such as landscapes.
  • the focal length of the wide-angle camera is relatively short, and when the wide-angle camera shoots an object at a short distance, the object in the captured wide-angle image is likely to be distorted (for example, the object in the image becomes wider and flatter than the original object).
  • the viewing angles in the embodiments of the present application include horizontal viewing angles and vertical viewing angles.
  • the mobile phone 200 may have multiple shooting modes.
  • the multiple shooting modes include: a wide-angle shooting mode, a normal shooting mode, and a telephoto shooting mode.
  • the mobile phone 200 adopts different shooting modes, and can capture images through different cameras.
  • the shooting mode of the mobile phone 200 is the wide-angle shooting mode
  • the mobile phone 200 may collect images through the wide-angle camera and the main camera (or collect images through the wide-angle camera, the main camera and the telephoto camera).
  • the shooting mode of the mobile phone 200 is the normal shooting mode
  • the mobile phone 200 may collect images through the main camera and the telephoto camera.
  • the shooting mode of the mobile phone 200 is the telephoto shooting mode
  • the mobile phone 200 may take images through the telephoto camera.
  • the external memory interface 220 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 200.
  • the external memory card communicates with the processor 210 through the external memory interface 220 to implement a data storage function, such as saving music, video and other files in the external memory card.
  • the internal memory 221 may be used to store computer-executable program codes including instructions.
  • the processor 210 executes various functional applications and data processing of the mobile phone 200 by executing instructions stored in the internal memory 221 .
  • the processor 210 may execute instructions stored in the internal memory 221, and the internal memory 221 may include a program storage area and a data storage area.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone 200 .
  • the internal memory 221 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the mobile phone 200 can realize the audio function, such as music playback and recording, through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the earphone interface 270D, and the application processor.
  • the keys 290 include a power key, a volume key and the like.
  • the key 290 may be a mechanical key, or may be a touch key.
  • the motor 291 can generate a vibration reminder. The motor 291 can be used for incoming-call vibration prompts, and can also be used for touch vibration feedback.
  • the indicator 292 may be an indicator light, which can be used to indicate the charging status and changes in battery capacity, and can also be used to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 295 is used for connecting a SIM card. The SIM card can be inserted into the SIM card interface 295 or pulled out from the SIM card interface 295 to realize contact and separation with the mobile phone 200 .
  • the mobile phone 200 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 295 can support a Nano SIM card, a Micro SIM card, a SIM card, etc.
  • the mobile phone 200 may also include a flashlight, a micro projection device, a near field communication (Near Field Communication, NFC) device, etc., which will not be repeated here.
  • the structure shown in this embodiment does not constitute a specific limitation on the mobile phone 200 .
  • the mobile phone 200 may include more or fewer components than shown, or combine some components, or separate some components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the electronic device may include a first camera and a second camera.
  • the field angle of the first camera is larger than the field angle of the second camera.
  • that the angle of view of the first camera is greater than the angle of view of the second camera means that the horizontal angle of view of the first camera is greater than the horizontal angle of view of the second camera, and the vertical angle of view of the first camera is greater than the vertical angle of view of the second camera.
  • the viewfinder range in which the first camera collects the first image is the first viewfinder range
  • the viewfinder range in which the second camera collects the second image is the second viewfinder range.
  • the first viewing range is larger than the second viewing range.
  • the viewing range of the image captured by the camera refers to the area range that the camera can capture.
  • the main camera may capture an image corresponding to the area 203 in the area 202 . That is to say, the areas in the area 202 except the area 203 are not within the viewing range of the image captured by the main camera.
  • the viewfinder range of the image corresponds to the viewfinder range of the image captured by the camera.
  • the viewfinder range of the first image may indicate the viewfinder range (that is, the first viewfinder range) in which the first camera captures the first image.
  • the viewfinder range of the second image may indicate the viewfinder range (that is, the second viewfinder range) in which the second camera captures the second image.
  • the first camera and the second camera may have three combinations: combination mode (1), combination mode (2) and combination mode (3).
  • combination mode (1) the first camera may be a wide-angle camera, and the second camera may be a main camera.
  • combination mode (2) the first camera may be a wide-angle camera, and the second camera may be a telephoto camera.
  • combination mode (3) the first camera may be a main camera, and the second camera may be a telephoto camera. The embodiments of the present application will be described below in conjunction with combination mode (1), combination mode (2) and combination mode (3).
  • the first camera and the second camera may be combined (1), that is, the first camera may be a wide-angle camera, and the second camera may be a main camera.
  • the electronic device can start the shooting application and display an image preview interface (namely the first interface).
  • the image preview interface is a viewfinder interface for the electronic device to take pictures.
  • the image preview interface includes the first image captured by the wide-angle camera.
  • the shooting mode of the electronic device is a wide-angle shooting mode.
  • the electronic device can display an image preview interface 301, and the image preview interface 301 includes a viewfinder frame 302, a camera conversion key 303, a shooting key 304, an album key 305, a preview image 306, a flash option 307, a "recording” option, a "photographing” option, "More” option etc.
  • the preview image 306 is an image collected by a wide-angle camera.
  • the preview image 306 may be an image after the electronic device performs image processing on the wide-angle image and the main image (for details, refer to the following description, which will not be repeated here). Afterwards, the electronic device may receive the user's photographing operation.
  • the electronic device can capture the first image (that is, the wide-angle image) through the wide-angle camera and the second image (that is, the main image) through the main camera at the same time.
  • the viewfinder range of the wide-angle image is larger than the viewfinder range of the main image.
  • the wide-angle image 308 shown in (b) in Fig. 3 is the image collected by the wide-angle camera
  • the main image 309 shown in (c) in Fig. 3 is the image collected by the main camera
  • that the electronic device collects the wide-angle image through the wide-angle camera and collects the main image through the main camera at the same time means: the moment at which the wide-angle camera collects the wide-angle image (such as the first moment) is the same as the moment at which the main camera collects the main image (such as the second moment); or, the time difference between the first moment and the second moment is small (for example, the time difference is less than 1 millisecond, 0.5 millisecond or 2 milliseconds).
  • the embodiment of the present application does not limit the order in which the wide-angle camera acquires the wide-angle image and the main camera acquires the main image.
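The "same time" rule above can be sketched as follows; the function name, the millisecond unit, and the default threshold are illustrative choices, not part of the original disclosure:

```python
def captured_simultaneously(t_wide_ms: float, t_main_ms: float,
                            max_diff_ms: float = 1.0) -> bool:
    """Two capture moments count as 'the same time' when they are equal
    or their difference is small (e.g. under 1 ms, 0.5 ms or 2 ms)."""
    return abs(t_wide_ms - t_main_ms) <= max_diff_ms
```

The check is symmetric in the two timestamps, which matches the statement that the order in which the wide-angle camera and the main camera acquire their images is not limited.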
  • the electronic device in response to the photographing operation (also referred to as the first operation), after the electronic device captures a wide-angle image through the wide-angle camera and a main image through the main camera, the electronic device may save the wide-angle image and the main image.
  • the electronic device can save the wide-angle image and the main image through the manner (a) and the manner (b).
  • manner (a) is that the electronic device saves the wide-angle image in a visible form, and saves the main image in an invisible form.
  • manner (b) is that the electronic device saves both the wide-angle image and the main image in a visible form.
  • the electronic device saves the wide-angle image and the main image in manner (a), that is, the electronic device saves the wide-angle image in a visible form, and saves the main image in an invisible form.
  • the electronic device may display a gallery interface 401 , the gallery interface 401 includes a wide-angle image 308 , and the gallery interface 401 does not display a main image.
  • the electronic device stores the wide-angle image and the main image in manner (b), that is, the electronic device stores the wide-angle image and the main image in a visible form.
  • the gallery interface 401 displayed by the electronic device includes a wide-angle image 308 and a main image 309 .
  • the embodiments of the present application are introduced by taking the electronic device storing the wide-angle image and the main image in the manner (a) as an example.
  • the electronic device may receive a second operation, and the second operation is used to trigger the electronic device to display the wide-angle image.
  • the electronic device displays a wide-angle image.
  • the electronic device may receive an operation (such as a click operation) performed by the user on the wide-angle image 308, and display an image viewing interface 501 as shown in (b) in FIG. 5, where the image viewing interface 501 includes the wide-angle image 308.
  • the electronic device may splice and display the main image on the second target area image in the wide-angle image, where the viewfinder range of the second target area image relative to the first camera coincides with the second viewfinder range. That is, in response to the second operation, the electronic device may display an image obtained by splicing the wide-angle image and the main image.
  • the electronic device may receive a third operation, where the third operation is used to trigger the electronic device to display the first region image in the enlarged wide-angle image.
  • the first area image includes the first target area image
  • the viewing range of the first target area image is the third viewing range relative to the wide-angle camera (that is, the first camera can obtain the first target area image by using the third viewing range).
  • the first viewfinder range includes the third viewfinder range
  • the third viewfinder range overlaps with the second viewfinder range.
  • the first viewing range is the viewing range of the wide-angle image 308 shown in (a) in FIG. 6
  • the third viewing range is the viewing range of the first target area image 604 shown in (a) in FIG. 6 .
  • the image viewing interface 601 displayed by the electronic device includes a wide-angle image 308 .
  • the electronic device may receive a third operation (such as an expansion operation) performed by the user on the image viewing interface 601, which triggers the electronic device to display an image 603 as shown in (b) in FIG. 6; the viewing range of the image 603 is the same as the viewing range of the area image 602 in the wide-angle image 308. That is to say, the image 603 is the enlarged area image 602.
  • the image 603 includes a first target area image 604 , and the viewing range of the first target area image 604 is the same as that of the main image (eg, the main image 309 shown in (c) of FIG. 3 ).
  • when the electronic device displays the first area image in the enlarged wide-angle image, the quality of the displayed image is poor, so the user cannot view a high-quality image.
  • the electronic device in response to the third operation, may splice the main image on the first target area image for display. That is, in response to the third operation, the electronic device may display an area of the first area image other than the first target area image and the main image. Exemplarily, as shown in (c) of FIG. 6 , the electronic device may splice the main image 309 on the first target area image 604 of the image 603 for display.
  • the electronic device may splice and display the main image on the first target area image. In this way, the electronic device can simultaneously display a part of the wide-angle image and the main image.
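A minimal sketch of this splicing step, assuming the main image has already been resized to the pixel size of the first target area and that images are represented as row-major lists of pixel rows (all names here are illustrative, not from the original disclosure):

```python
def splice_main_onto_target(enlarged_region, main_image, target_rect):
    """Paste the higher-quality main image over the first target area of
    the enlarged first region image; the rest of the region is kept.
    target_rect = (row, col, height, width) in pixel coordinates."""
    row, col, h, w = target_rect
    out = [r[:] for r in enlarged_region]   # copy so the input stays untouched
    for i in range(h):
        out[row + i][col:col + w] = main_image[i][:w]
    return out
```

The result shows the main image inside the target rectangle and the original first region image everywhere else, which is the "display an area of the first area image other than the first target area image and the main image" behavior described above.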
  • the technical solution of the present application can simplify the process of displaying images with different characteristics on electronic devices and improve user experience.
  • calibration information is stored in the electronic device, and the calibration information includes: the relationship between the field of view of the wide-angle camera and the field of view of the main camera, and the correspondence between the viewfinder range of the main image and the viewfinder range of the wide-angle image.
  • the electronic device may splice and display the main image on the first target area image according to the calibration information.
  • the electronic device may determine the first target area image in the following implementation manners.
  • the electronic device can save the two-dimensional coordinates of two opposite corners (such as the upper left corner and the lower right corner, or the upper right corner and the lower left corner) in the viewfinder range of the main image in the coordinate system of the viewfinder range of the wide-angle image.
  • the two-dimensional coordinates can reflect the corresponding relationship between the viewing range of the main image and the viewing range of the wide-angle image.
  • the coordinate origin of the coordinate system of the viewfinder range of the wide-angle image is any corner (such as the upper left corner or the lower left corner) in the viewfinder range of the wide-angle image
  • the x-axis and y-axis are two adjacent sides.
  • FIG. 7 shows an example of a coordinate system of a viewing range 710 of a wide-angle image.
  • the point o is the coordinate origin
  • the x-axis is the lower side of the field of view 710
  • the y-axis is the left side of the field of view 710 .
  • the electronic device can save the two-dimensional coordinates A1(x1, y1) and A2(x2, y2) of the upper left corner A1 and the lower right corner A2 of the viewing range 720 of the main image in the xoy coordinate system shown in FIG. 7 .
  • the two-dimensional coordinates A1 ( x1 , y1 ) and A2 ( x2 , y2 ) can reflect the corresponding relationship between the viewing range of the main image and the viewing range of the wide-angle image.
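From the saved corners A1 (upper left) and A2 (lower right) in the xoy coordinate system described above (origin at the lower-left corner of the wide-angle viewfinder range, y growing upward), the rectangle of the first target area can be recovered; the function name and return convention are illustrative:

```python
def target_area_from_corners(a1, a2):
    """a1 = (x1, y1) is the upper-left corner and a2 = (x2, y2) the
    lower-right corner of the main image's viewfinder range, in the
    wide-angle viewfinder's xoy coordinate system.
    Returns (left, bottom, width, height)."""
    x1, y1 = a1
    x2, y2 = a2
    return (x1, y2, x2 - x1, y1 - y2)   # y grows upward, so y1 > y2
```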
  • the electronic device may display the main image and the areas in the first area image other than the first target area image.
  • the electronic device can simultaneously have the characteristics of the wide-angle image and the characteristics of the main image in one image, thereby ensuring that the user can view the wide-angle image and the main image at the same time.
  • the image quality of the main image is higher than that of the first target area image, and the viewing range of the main image is the same as that of the first target area image. Therefore, the image quality viewed by the user can be improved, and the use experience of the user can be improved.
  • the electronic device may control the timing at which the electronic device splices and displays the main image on the first target area image. Specifically, the electronic device may control the timing at which the electronic device splices and displays the main image on the first target area image through methods (a) and (b).
  • the method (a) is that the electronic device controls the timing at which the electronic device splices the main image on the image of the first target area and displays it according to the resolution of the image and the resolution of the display screen.
  • the method (b) is: the electronic device can control the timing when the electronic device splices the main image on the image of the first target area and displays it according to the scaling factor of the image.
  • the electronic device may use the method (a), that is, the electronic device controls, through the resolution of the image and the resolution of the display screen, the timing at which it splices the main image on the first target area image for display. Specifically, the electronic device may acquire the resolution of the first area image in the enlarged wide-angle image and the resolution of the display screen. Afterwards, the electronic device may calculate a first ratio according to the resolution of the first area image and the resolution of the display screen, where the first ratio is the ratio of the resolution of the display screen to the resolution of the first area image.
  • the electronic device may obtain the first ratio through formula one.
  • N is used to represent the first ratio
  • C is used to represent the resolution of the display screen of the electronic device
  • A is used to represent the resolution of the first region image.
  • the first ratio is: N = C / A (formula one).
  • the first ratio is 0.8.
  • the electronic device stores the ratio of the area of the first region image to the area of the enlarged wide-angle image.
  • the electronic device may determine the resolution of the first region image according to the resolution of the wide-angle image and the ratio of the area of the first region image to the area of the enlarged wide-angle image.
  • the larger the zoom ratio of the wide-angle image is, the smaller the ratio of the area of the first region image to the area of the enlarged wide-angle image is.
  • the ratio of the area of the first area image to the area of the enlarged wide-angle image is 0.8
  • the resolution of the wide-angle image is 4000 × 3000, so the resolution of the first area image is 3200 × 2400.
  • the electronic device may determine whether to splice the main image on the first target area image for display according to the first ratio and the first preset ratio.
  • the first preset ratio is greater than 0.5, and the first preset ratio is smaller than 0.95.
  • the embodiment of the present application will be introduced by taking the first preset ratio of 0.8 as an example.
  • if the first ratio is greater than or equal to the first preset ratio, the electronic device may splice the main image on the first target area image for display. If the first ratio is smaller than the first preset ratio, the electronic device only displays the first region image, and does not splice the main image on the first target region image for display. Exemplarily, if the first ratio is 0.85, the electronic device may splice and display the main image on the first target area image (for example, (c) in FIG. 6 ). If the first ratio is 0.75, the electronic device only displays the first region image (for example, (b) in FIG. 6 ).
  • when the resolution of the display screen of the electronic device is constant, the smaller the resolution of the first area image is, the larger the first ratio is, and the higher the probability that the electronic device splices the main image on the first target area image for display. That is to say, when the zoom ratio of the wide-angle image is larger, the electronic device has a higher probability of displaying the main image spliced on the first target area image.
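Method (a) can be sketched as below; formula one is taken as N = C / A, which matches the stated monotonicity (a smaller first-area resolution A gives a larger N for a fixed display resolution C), and the function name and argument order are illustrative:

```python
def should_splice(display_resolution: int, first_area_resolution: int,
                  first_preset_ratio: float = 0.8) -> bool:
    """Formula one: N = C / A. Splice the main image onto the first
    target area only when the first ratio N is not smaller than the
    first preset ratio (between 0.5 and 0.95, e.g. 0.8)."""
    n = display_resolution / first_area_resolution
    return n >= first_preset_ratio
```

For example, a first ratio of 0.85 triggers splicing while a first ratio of 0.75 leaves only the first region image displayed, as in the examples above.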
  • the electronic device can control the timing of splicing and displaying the main image on the first target area image. In this way, the electronic device can display the main image under preset conditions, which improves user experience.
  • the electronic device can use method (b), that is, the electronic device can control, according to the zoom magnification of the image, the timing at which it splices the main image on the first target area image for display. Specifically, the electronic device may obtain the first magnification, which is the zoom magnification after the third operation triggers the electronic device to enlarge the wide-angle image (that is, the zoom magnification of the enlarged wide-angle image, such as the zoom magnification of image 603).
  • the electronic device can obtain the first magnification in the following manner. Specifically, the electronic device may obtain the zoom magnification of the wide-angle image (that is, the zoom magnification of the unmagnified wide-angle image, such as the zoom magnification of the wide-angle image 308), the resolution of the wide-angle image (such as the resolution of the wide-angle image 308), and the resolution of the first region image (such as the resolution of image 603). Afterwards, the electronic device may calculate the first magnification according to the zoom magnification of the wide-angle image, the resolution of the wide-angle image, and the resolution of the first region image. Exemplarily, the electronic device may obtain the first magnification through Formula 2.
  • M is used to represent the first magnification
  • B is used to represent the resolution of the wide-angle image
  • A is used to represent the resolution of the first region image
  • Z is used to represent the zoom ratio of the wide-angle image.
  • exemplarily, the first magnification obtained through Formula 2 is 0.54.
  • the electronic device may also calculate the first magnification according to the zoom magnification of the wide-angle image, the pixels in the first direction in the wide-angle image, and the pixels in the first direction in the first region image, where the first direction may be horizontal or vertical.
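Formula 2 itself is not reproduced in this excerpt, so the form below is a hypothesis consistent only with the inputs the text names (Z, B, A); the single-direction variant mentioned above is equivalent to it when the aspect ratio is preserved:

```python
import math

def first_magnification(zoom_z, wide_res, region_res):
    """Hypothetical reading of Formula 2: M = Z * sqrt(B / A),
    where B and A are total pixel counts. With a preserved aspect
    ratio this equals Z times the pixel count along either the
    horizontal or the vertical direction of the wide-angle image
    divided by that of the first region image.
    """
    b = wide_res[0] * wide_res[1]
    a = region_res[0] * region_res[1]
    return zoom_z * math.sqrt(b / a)

def should_stitch_by_zoom(m, preset=0.9):
    # Method (b): stitch when the first magnification reaches the
    # first preset zoom magnification (0.7X < preset < 0.95X).
    return m >= preset
```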
  • the electronic device may determine whether to splice and display the main image on the first target area image according to the first magnification and the first preset zoom magnification.
  • the first preset zoom ratio is greater than 0.7X
  • the first preset zoom ratio is smaller than 0.95X.
  • the embodiment of the present application will be introduced by taking the first preset magnification of 0.9X as an example.
  • if the first magnification is greater than or equal to the first preset zoom magnification, the electronic device may splice the main image on the first target area image for display. If the first magnification is smaller than the first preset zoom magnification, the electronic device only displays the first region image, and does not splice the main image on the first target region image for display. Exemplarily, if the first magnification is 0.95X, the electronic device may stitch and display the main image on the first target area image (for example, (c) in FIG. 6). If the first magnification is 0.75X, the electronic device only displays the first region image (for example, (b) in FIG. 6).
  • the electronic device can control the timing of splicing and displaying the main image on the first target area image. In this way, the electronic device can display the main image under preset conditions, which improves user experience.
  • the image obtained by stitching the main image on the first target area image may also be saved after shooting.
  • the electronic device in response to the third operation, may perform image fusion on the first region image and the second image (for example, the main image) to obtain the fourth image. Afterwards, the electronic device may display the fourth image.
  • the electronic device may control the timing of performing image fusion on the first region image and the main image.
  • image fusion can improve image quality.
  • the viewing range of the image in the first area is relatively large, and the image quality of the main image is relatively high.
  • the electronic device performs image fusion on the main image and the first area image, and can synthesize the characteristics of the main image and the first area image to obtain a fourth image with a larger viewfinder range and higher local image definition. In this way, the image quality viewed by the user can be improved, and the user experience is improved.
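The fusion step might look like the following sketch. The patent does not specify the fusion algorithm, so the placement offsets and the alpha blend are illustrative stand-ins:

```python
import numpy as np

def fuse_region_and_main(region_img, main_img, top, left, alpha=0.7):
    """Blend the higher-quality second image into the window of the
    first region image that shares its viewing range.

    `top`, `left`, and `alpha` are hypothetical parameters; a real
    pipeline would derive the window from calibration data.
    """
    fused = region_img.astype(np.float32)  # astype returns a copy
    h, w = main_img.shape[:2]
    window = fused[top:top + h, left:left + w]
    fused[top:top + h, left:left + w] = alpha * main_img + (1 - alpha) * window
    return fused.astype(region_img.dtype)
```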
  • the electronic device may perform blurring processing on areas in the first area image other than the first target area image.
  • the electronic device may add a coating to an upper layer of an area in the first area image other than the first target area image.
  • the coating layer 801 is displayed on top of areas in the first area image other than the first target area image.
  • the electronic device may perform blurring processing on areas in the first area image other than the first target area image by using a blurring processing algorithm.
  • the embodiment of the present application does not limit the blurring processing algorithm.
  • the blurring algorithm can be Gaussian blurring.
  • the blurring algorithm may be box blurring.
  • the electronic device blurs the areas in the first area image except for the first target area image, which can reduce the degree of abnormality at the junction of the main image and the first area image. In this way, the image quality displayed by the electronic device can be improved, and the user experience is improved.
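The blurring step above can be sketched as follows for a grayscale image; a small box filter stands in for the Gaussian or box blur mentioned in the text, and the rectangle coordinates are illustrative:

```python
import numpy as np

def blur_outside_target(img, top, left, h, w, k=3):
    """Box-blur every pixel of a 2-D grayscale image except the first
    target area rectangle, which stays sharp.
    """
    pad = k // 2
    padded = np.pad(img.astype(np.float32), pad, mode="edge")
    blurred = np.zeros(img.shape, dtype=np.float32)
    # Sum the k*k shifted copies, then divide: a separable-free box filter.
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    blurred /= k * k
    # Restore the target area so the stitched region keeps full detail.
    blurred[top:top + h, left:left + w] = img[top:top + h, left:left + w]
    return blurred.astype(img.dtype)
```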
  • the first camera and the second camera may be in combination (2), that is, the first camera may be a wide-angle camera, and the second camera may be a telephoto camera.
  • the electronic device can start the shooting application and display an image preview interface (namely the first interface).
  • the image preview interface is a viewfinder interface for the electronic device to take pictures.
  • the image preview interface includes the first image captured by the wide-angle camera.
  • the shooting mode of the electronic device is a wide-angle shooting mode.
  • the electronic device may display an image preview interface 901, where the image preview interface 901 includes a preview image 902, where the preview image 902 is an image collected by a wide-angle camera.
  • the preview image 902 may be an image after the electronic device performs image processing on the wide-angle image and the telephoto image (for details, refer to the following description, which will not be repeated here).
  • the electronic device may receive the user's photographing operation.
  • the electronic device can capture the first image (ie wide-angle image) through the wide-angle camera and the second image (ie telephoto image) through the telephoto camera at the same time.
  • the wide-angle image 903 shown in (b) in Fig. 9 is the image collected by the wide-angle camera
  • the telephoto image 904 shown in (c) in Fig. 9 is the image collected by the telephoto camera
  • the viewfinder range of the wide-angle image 903 is larger than the viewfinder range of the telephoto image 904.
  • the above-mentioned "the electronic device collects the wide-angle image through the wide-angle camera and collects the telephoto image through the telephoto camera at the same time" means: the moment when the wide-angle camera collects the wide-angle image (such as the first moment) is the same as the moment when the telephoto camera collects the telephoto image (such as the third moment); or, the time difference between the first moment and the third moment is small (for example, less than 1 millisecond).
  • the embodiment of the present application does not limit the order in which the wide-angle camera collects the wide-angle image and the telephoto camera collects the telephoto image.
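The "same time" rule above is just a bounded timestamp skew, order-independent:

```python
def captured_simultaneously(t_a_ms, t_b_ms, max_skew_ms=1.0):
    """'At the same time' per the text: equal capture moments, or a
    time difference below 1 millisecond, in either order."""
    return abs(t_a_ms - t_b_ms) < max_skew_ms

print(captured_simultaneously(1000.0, 1000.4))  # -> True
```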
  • the electronic device in response to the photographing operation, after the electronic device captures a wide-angle image with a wide-angle camera and a telephoto image with a telephoto camera, the electronic device may save the wide-angle image and the telephoto image.
  • the electronic device can store the wide-angle image in a visible form, and store the telephoto image in an invisible form.
  • the electronic device can save the wide-angle image and the telephoto image in a visible form.
  • the electronic device may receive a second operation, and the second operation is used to trigger the electronic device to display the wide-angle image.
  • the electronic device displays a wide-angle image.
  • the electronic device may receive a third operation, where the third operation is used to trigger the electronic device to display the first region image in the enlarged wide-angle image.
  • the first area image includes a first target area image
  • the first target area image is an area in the wide-angle image having the same viewing range as that of the telephoto image.
  • the image viewing interface 1001 displayed by the electronic device includes a wide-angle image 903 .
  • the electronic device may receive a third operation (such as an expansion operation) performed by the user on the image viewing interface 1001, which triggers the electronic device to display an image 1002 as shown in (b) in FIG. 10, where the viewing range of the image 1002 is the same as the viewing range of the area 1003 in the wide-angle image 903. That is, the image 1002 is the enlarged region 1003.
  • the image 1002 includes a first target area image 1004, and the viewing range of the first target area image 1004 is the same as that of the telephoto image (for example, the telephoto image 904 shown in (c) of FIG. 9 ).
  • when the electronic device displays the first region image in the enlarged wide-angle image, the quality of the image displayed by the electronic device is poor, so that the user cannot view a high-quality image.
  • the electronic device in response to the third operation, may stitch the telephoto image on the first target area image for display. That is, in response to the third operation, the electronic device may display a region in the first region image other than the first target region image and the telephoto image. Exemplarily, as shown in (c) in FIG. 10 , the electronic device may splice the telephoto image 904 on the first target area image 1004 of the image 1002 for display.
  • calibration information is stored in the electronic device, and the calibration information includes: the relationship between the field of view of the wide-angle camera and the field of view of the telephoto camera, and the correspondence between the viewfinder range of the telephoto image and the viewfinder range of the wide-angle image.
  • the electronic device may draw and display the telephoto image on the first target area image according to the calibration information.
  • the electronic device may display the area and the telephoto image in the first area image other than the first target area image.
  • the electronic device can simultaneously have the characteristics of the wide-angle image and the characteristics of the telephoto image in one image, thereby ensuring that the user can view the wide-angle image and the telephoto image at the same time.
  • the image quality of the telephoto image is higher than that of the first target area image, and the viewing range of the telephoto image is the same as that of the first target area image. Therefore, the image quality viewed by the user can be improved, and the use experience of the user can be improved.
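The splice itself amounts to a pixel-region replacement guided by the calibration information; the calibration structure below (a top/left offset into the first region image) is a hypothetical encoding of the stored field-of-view correspondence:

```python
import numpy as np

def stitch_telephoto(region_img, tele_img, calib):
    """Replace the first target area of the first region image with
    the telephoto pixels.

    `calib` is a hypothetical mapping, assumed here to resolve the
    telephoto viewfinder to pixel offsets in the region image.
    """
    out = region_img.copy()  # keep the stored region image untouched
    top, left = calib["top"], calib["left"]
    h, w = tele_img.shape[:2]
    out[top:top + h, left:left + w] = tele_img
    return out
```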
  • the electronic device may control the timing at which the electronic device stitches the telephoto image onto the first target area image for display.
  • for the process in which the electronic device controls the timing of stitching the telephoto image on the first target area image for display, reference may be made to the descriptions of method (a) and method (b) in the above embodiment, in which the electronic device controls the timing of stitching the main image on the first target area image for display, and details are not repeated here.
  • the first preset ratio is greater than 0.25, and the first preset ratio is smaller than 0.9025.
  • the first preset ratio may be 0.64.
  • the first preset zoom ratio is greater than 2.45X, and the first preset zoom ratio is smaller than 3.15X.
  • the first preset zoom ratio may be 3X.
  • the electronic device may control the timing of splicing and displaying the telephoto image on the image of the first target area. In this way, the electronic device can display a telephoto image under a preset condition, which improves user experience.
  • when the electronic device stitches the telephoto image on the first target area image for display, since the telephoto image is stitched on the first target area image, abnormal phenomena such as image distortion and non-smoothness may occur at the stitching place between the telephoto image and the first area image (for example, the stitching region 1005 shown in (c) in FIG. 10). In this way, the user experience may be poor.
  • the electronic device may perform blurring processing on areas in the first area image other than the first target area image. Specifically, for the manner in which the electronic device blurs the area in the first area image other than the first target area image, reference may be made to the blurring processing manner in the above embodiment, and details are not repeated here.
  • the electronic device performs blurring processing on areas in the first area image other than the first target area image, which can reduce the degree of abnormality at the joint of the telephoto image and the first area image. In this way, the image quality displayed by the electronic device can be improved, and the user experience is improved.
  • the first camera and the second camera may be in combination (3), that is, the first camera may be the main camera, and the second camera may be a telephoto camera.
  • the electronic device can start the shooting application and display an image preview interface (namely the first interface), the image preview interface is a viewfinder interface for the electronic device to take pictures, and the image preview interface includes the first image captured by the main camera.
  • the shooting mode of the electronic device is a normal shooting mode.
  • the electronic device may display an image preview interface 1101, and the image preview interface 1101 includes a preview image 1102, and the preview image 1102 is an image collected by a main camera.
  • the preview image 1102 may be an image after the electronic device performs image processing on the main image and the telephoto image (for details, please refer to the following description, which will not be repeated here).
  • the electronic device may receive the user's photographing operation.
  • the electronic device can capture the first image (ie, the main image) through the main camera and the second image (ie, the telephoto image) through the telephoto camera at the same time.
  • the viewfinder range of the main image is larger than the viewfinder range of the telephoto image.
  • the main image 1103 shown in (b) in Figure 11 is an image collected by the main camera
  • the telephoto image 1104 shown in (c) in Figure 11 is an image collected by the telephoto camera
  • the viewfinder range of the main image 1103 is larger than the viewfinder range of the telephoto image 1104.
  • the above-mentioned "the electronic device collects the main image through the main camera and collects the telephoto image through the telephoto camera at the same time" means: the moment when the main camera collects the main image (such as the second moment) is the same as the moment when the telephoto camera collects the telephoto image (such as the third moment); or, the time difference between the second moment and the third moment is small (for example, less than 1 millisecond).
  • the embodiment of the present application does not limit the order in which the main camera acquires the main image and the telephoto camera acquires the telephoto image.
  • the electronic device in response to the photographing operation, after the electronic device collects the main image through the main camera and the telephoto image through the telephoto camera, the electronic device may save the main image and the telephoto image.
  • the electronic device may store the main image in a visible form, and store the telephoto image in an invisible form.
  • the electronic device may save the main image and the telephoto image in visible form.
  • the electronic device may receive a second operation, and the second operation is used to trigger the electronic device to display the main image.
  • the electronic device displays the main image.
  • the electronic device may receive a third operation, where the third operation is used to trigger the electronic device to display the first region image in the enlarged main image.
  • the first area image includes a first target area image
  • the first target area image is an area in the main image that is the same as the viewing range of the telephoto image.
  • the image viewing interface 1201 displayed by the electronic device includes a main image 1103 .
  • the electronic device may receive a third operation (such as an expansion operation) performed by the user on the image viewing interface 1201, which triggers the electronic device to display an image 1202 as shown in (b) in FIG. 12, where the viewing range of the image 1202 is the same as the viewing range of the area 1203 in the main image 1103. That is, the image 1202 is the enlarged region 1203.
  • the image 1202 includes a first target area image 1204, and the viewing range of the first target area image 1204 is the same as that of the telephoto image (for example, the telephoto image 1104 shown in (c) of FIG. 11 ).
  • the electronic device may both stitch the main image onto the second target area image in the wide-angle image, and stitch the telephoto image onto the area in the main image that has the same viewing range as the telephoto image.
  • the electronic device in response to the third operation, may stitch the telephoto image on the first target area image for display. That is, in response to the third operation, the electronic device may display a region in the first region image other than the first target region image and the telephoto image. Exemplarily, as shown in (c) of FIG. 12 , the electronic device may splice the telephoto image 1104 on the first target area image 1204 of the image 1202 for display.
  • the electronic device may display the area and the telephoto image in the first area image other than the first target area image.
  • the electronic device can simultaneously have the characteristics of the main image and the characteristics of the telephoto image in one image, thereby ensuring that the user can view the main image and the telephoto image at the same time.
  • the image quality of the telephoto image is higher than that of the first target area image, and the viewing range of the telephoto image is the same as that of the first target area image. Therefore, the image quality viewed by the user can be improved, and the use experience of the user can be improved.
  • the electronic device may control the timing at which the electronic device stitches the telephoto image onto the first target area image for display.
  • for the process in which the electronic device controls the timing of stitching the telephoto image on the first target area image for display, reference may be made to the descriptions of method (a) and method (b) in the above embodiment, in which the electronic device controls the timing of stitching the main image on the first target area image for display, and details are not repeated here.
  • the first preset ratio is greater than 0.5, and the first preset ratio is smaller than 0.95.
  • the first preset ratio may be 0.8.
  • the first preset zoom ratio is greater than 2.45X, and the first preset zoom ratio is smaller than 3.15X.
  • the first preset zoom ratio may be 3X.
  • the electronic device may control the timing of splicing and displaying the telephoto image on the image of the first target area. In this way, the electronic device can display a telephoto image under a preset condition, which improves user experience.
  • the electronic device may perform blurring processing on areas in the first area image other than the first target area image. Specifically, for the manner in which the electronic device blurs the area in the first area image other than the first target area image, reference may be made to the blurring processing manner in the above embodiment, and details are not repeated here.
  • the electronic device performs blurring processing on areas in the first area image other than the first target area image, which can reduce the degree of abnormality at the joint of the telephoto image and the first area image. In this way, the image quality displayed by the electronic device can be improved, and the user experience is improved.
  • when the first camera and the second camera are in combination (1), that is, the first camera is a wide-angle camera and the second camera is a main camera, the electronic device may also include a telephoto camera (that is, the third camera). The field of view of the wide-angle camera is larger than that of the main camera, and the field of view of the main camera is larger than that of the telephoto camera.
  • the electronic device can start the shooting application and display an image preview interface (namely the first interface).
  • the image preview interface is a viewfinder interface for the electronic device to take pictures.
  • the image preview interface includes the first image captured by the wide-angle camera, exemplarily as shown in (a) of FIG. 13.
  • the shooting mode of the electronic device is a wide-angle shooting mode.
  • the electronic device can display an image preview interface 1301, and the image preview interface 1301 includes a preview image 1302, and the preview image 1302 is an image collected by a wide-angle camera.
  • the preview image 1302 may be an image after the electronic device performs image processing on the wide-angle image, the main image, and the telephoto image (for details, refer to the following description, which will not be repeated here).
  • the electronic device can receive the user's photographing operation.
  • the electronic device can capture the first image (ie, the wide-angle image) through the wide-angle camera, the second image (ie, the main image) through the main camera, and the third image (ie, the telephoto image) through the telephoto camera at the same time.
  • the viewfinder range in which the telephoto camera collects the telephoto image is the fourth viewfinder range (ie, the viewfinder range of the telephoto image).
  • the viewing range of the wide-angle image is larger than that of the telephoto image
  • the viewing range of the main image is larger than that of the telephoto image.
  • the wide-angle image 1303 shown in (b) in FIG. 13 is an image collected by a wide-angle camera
  • the main image 1304 shown in (c) in FIG. 13 is an image collected by a main camera
  • the telephoto image 1305 shown in (d) in FIG. 13 is an image collected by a telephoto camera
  • the viewing range of the wide-angle image 1303 is larger than that of the main image 1304
  • the viewing range of the main image 1304 is larger than that of the telephoto image 1305.
  • the aforementioned "the electronic device collects a wide-angle image through the wide-angle camera, collects a main image through the main camera, and collects a telephoto image through the telephoto camera at the same time" means: the moment when the wide-angle camera collects the wide-angle image (such as the first moment), the moment when the main camera collects the main image (such as the second moment), and the moment when the telephoto camera collects the telephoto image (such as the third moment) are the same; or, the time differences between the first moment and the second moment, between the first moment and the third moment, and between the second moment and the third moment are all small (for example, less than 1 millisecond).
  • the embodiment of the present application does not limit the order in which the wide-angle camera acquires the wide-angle image, the main camera acquires the main image, and the telephoto camera acquires the telephoto image.
  • in response to the photographing operation, after the electronic device collects a wide-angle image through the wide-angle camera, a main image through the main camera, and a telephoto image through the telephoto camera, the electronic device can save the wide-angle image, the main image, and the telephoto image.
  • the electronic device can store the wide-angle image in a visible form, and store the main image and the telephoto image in an invisible form.
  • the electronic device may save the wide-angle image, the main image and the telephoto image in a visible form.
  • the electronic device may receive a second operation, and the second operation is used to trigger the electronic device to display the wide-angle image.
  • the electronic device displays a wide-angle image.
  • the electronic device may receive a third operation, where the third operation is used to trigger the electronic device to display the first region image in the enlarged wide-angle image.
  • the electronic device may stitch and display the main image on the first target area image.
  • the electronic device may receive a fourth operation, where the fourth operation is used to trigger the electronic device to display the second area image in the enlarged main image. The second area image includes a third target area image; the viewing range of the third target area image relative to the second camera is a fifth viewing range, the second viewing range includes the fifth viewing range, and the fifth viewing range overlaps the fourth viewing range.
  • the fifth viewing range is the viewing range of the third target area image 1405 shown in (b) of FIG. 14 .
  • the electronic device can display an image viewing interface 1401, where the image viewing interface 1401 includes an image 1402 (that is, the image in which the main image 309 shown in (c) of FIG. 6 is spliced on the first target area image 604).
  • the electronic device may receive the user's fourth operation (such as an expansion operation) on the image viewing interface 1401, which triggers the electronic device to display an image 1404 as shown in (b) in FIG. 14, where the viewing range of the image 1404 is the same as the viewing range of the area image 1403 in the image 1402.
  • the image 1404 is the enlarged region image 1403 .
  • the image 1404 includes a third target area image 1405, and the viewing range of the third target area image 1405 is the same as that of the telephoto image (for example, the telephoto image 1305 shown in (d) of FIG. 13 ).
  • the electronic device in response to the fourth operation, may stitch the telephoto image on the third target area image for display. That is, in response to the fourth operation, the electronic device may display a region in the second region image other than the third target region image and the telephoto image. Exemplarily, as shown in (c) of FIG. 14 , the electronic device may splice the telephoto image 1305 on the third target area image 1405 of the image 1404 for display.
  • calibration information is stored in the electronic device, and the calibration information includes: the relationship between the field of view of the main camera and the field of view of the telephoto camera, and the correspondence between the viewfinder range of the main image and the viewfinder range of the telephoto image.
  • the electronic device may draw and display the telephoto image on the third target area image according to the calibration information.
  • the electronic device may control the timing at which the electronic device stitches the telephoto image on the image of the third target area for display.
  • the electronic device can control the timing at which the electronic device stitches the telephoto image on the image of the third target area for display through the resolution of the image and the resolution of the display screen.
  • the electronic device may control the timing at which the electronic device stitches the telephoto image on the image of the third target area for display according to the zoom ratio of the image.
  • the electronic device can acquire the resolution of the image in the second area and the resolution of the display screen. Afterwards, the electronic device may calculate a second ratio according to the resolution of the image in the second area and the resolution of the display screen, where the second ratio is the ratio between the resolution of the image in the second area and the resolution of the display screen.
  • the electronic device may determine whether to splice the telephoto image on the third target area image for display according to the second ratio and the first preset ratio.
  • the first preset ratio is greater than 0.5, and the first preset ratio is smaller than 0.95.
  • the embodiment of the present application will be introduced by taking the first preset ratio of 0.8 as an example.
  • if the second ratio is greater than or equal to the first preset ratio, the electronic device may stitch the telephoto image on the third target area image for display. If the second ratio is smaller than the first preset ratio, the electronic device only displays the second area image, and does not splice the telephoto image on the third target area image for display. Exemplarily, if the second ratio is 0.85, the electronic device may stitch and display the telephoto image on the third target area image (for example, (c) in FIG. 14). If the second ratio is 0.75, the electronic device only displays the second area image (for example, (b) in FIG. 14).
  • the electronic device can control the telephoto image to be spliced and displayed on the third target area image. In this way, the electronic device can display a telephoto image under a preset condition, which improves user experience.
  • the electronic device may acquire the second magnification, which is the zoom magnification after the fourth operation triggers the electronic device to enlarge the main image (that is, the zoom magnification of the enlarged main image, such as the zoom magnification of image 1404 ).
  • the electronic device may acquire the second magnification in the following manner. Specifically, the electronic device may acquire the zoom magnification of the main image (that is, the zoom magnification of the unmagnified main image, such as the zoom magnification of the main image 1304), the resolution of the main image (such as the resolution of the main image 1304), and the resolution of the second area image (such as the resolution of image 1404).
  • the electronic device may calculate the second magnification according to the zoom magnification of the main image, the resolution of the main image, and the resolution of the second area image. Specifically, for the method for calculating the second magnification by the electronic device, reference may be made to the above formula 2, which will not be repeated here.
  • the electronic device may determine whether to stitch the telephoto image on the third target area image for display according to the second magnification and the second preset zoom magnification.
  • the second preset zoom ratio is greater than 2.45X
  • the second preset zoom ratio is smaller than 3.15X.
  • the embodiment of the present application will be introduced by taking a second preset zoom magnification of 3X as an example.
  • the electronic device may stitch the telephoto image onto the third target area image for display. If the second magnification is smaller than the second preset zoom magnification, the electronic device displays only the second area image and does not stitch the telephoto image onto the third target area image. Exemplarily, if the second magnification is 3.5X, the electronic device may stitch the telephoto image onto the third target area image for display (for example, (c) in FIG. 14). If the second magnification is 2X, the electronic device displays only the second area image (for example, (b) in FIG. 14).
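The threshold check described in the bullets above can be sketched as follows. This is a minimal Python illustration rather than the patent's implementation; the function names are invented here, and the √(A/B) form of Formula 2 is an assumption carried over from the worked example for the first magnification later in the description (resolutions are total pixel counts):

```python
import math

def second_magnification(zoom_ratio, main_pixels, area_pixels):
    # Formula 2 (assumed form): scale the main image's zoom ratio by the
    # linear size ratio between the displayed area image and the full image.
    return zoom_ratio * math.sqrt(area_pixels / main_pixels)

def should_stitch_telephoto(zoom_ratio, main_pixels, area_pixels,
                            preset_zoom=3.0):
    # Stitch the telephoto image onto the third target area image only when
    # the second magnification exceeds the second preset zoom magnification.
    return second_magnification(zoom_ratio, main_pixels, area_pixels) > preset_zoom
```

With these assumptions, a computed second magnification above the 3X preset triggers stitching, while one below it leaves only the second area image displayed, matching the example above.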
  • the electronic device may control the timing of splicing and displaying the telephoto image on the image of the third target area. In this way, the electronic device can display a telephoto image under a preset condition, which improves user experience.
  • the electronic device may receive a fifth operation, where the fifth operation is used to trigger the electronic device to display the third area image in the enlarged fourth image, and the third area image includes the third target area image.
  • the electronic device may perform image fusion on the third area image and the third image (ie, the telephoto image) to obtain a fifth image. Afterwards, the electronic device may display the fifth image.
  • the electronic device can control the timing of performing image fusion on the third area image and the telephoto image.
  • image fusion can improve image quality.
  • the viewfinder range of the image in the third area is larger, and the image quality of the telephoto image is higher.
  • the electronic device performs image fusion on the telephoto image and the third area image, and can synthesize the characteristics of the telephoto image and the third area image to obtain a fifth image with a larger viewfinder range and higher local image definition. In this way, the image quality viewed by the user can be improved, and the user experience is improved.
  • the electronic device stitches the telephoto image on the image of the third target area for display
  • abnormal phenomena such as image distortion and non-smoothness may occur at the junction between the telephoto image and the second area image (for example, the stitching region 1406 shown in (c) of FIG. 14). As a result, the user experience may be poor.
  • the electronic device may perform blurring processing on the area of the second area image other than the third target area image. For the manner in which the electronic device blurs this area, reference may be made to the manner, described in the foregoing embodiment, in which the electronic device blurs the area of the first area image other than the first target area image; details are not repeated here.
  • the electronic device performs blurring processing on the areas in the second area image except the third target area image, which can reduce the degree of abnormality at the joint of the telephoto image and the second area image. In this way, the image quality displayed by the electronic device can be improved, and the user experience is improved.
  • the electronic device may display conventional images in a conventional display manner.
  • the electronic device may display a telephoto image, a main image, and the like on the screen.
  • the electronic device may also display the above stitched image (for example, the image shown in (c) in FIG. 6 and the image shown in (c) in FIG. 10 ) in a preset mode.
  • in order to facilitate the electronic device in determining the display mode of an image, when the electronic device saves the above-mentioned first image and second image (or third image), it may add a first identifier, where the first identifier is used to instruct the electronic device to display the stitched image in the preset mode.
  • the electronic device may determine whether to display the image in a preset mode according to the image information of the image. Exemplarily, after receiving the third operation of the user, the electronic device may detect whether the first identifier exists in the image information of the image. If the image information has the first identifier, the electronic device can display the spliced image according to a preset mode. If the image information does not have the first identifier, the electronic device may display the image in a conventional display manner.
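The identifier check described above can be sketched as follows. This is a minimal illustration, assuming the first identifier is stored as a key in the image's metadata; the key and value names are invented here, not taken from the patent:

```python
def choose_display_mode(image_info, first_identifier="stitched_display"):
    # If the saved image carries the first identifier, display the stitched
    # image in the preset mode; otherwise fall back to conventional display.
    if image_info.get("identifier") == first_identifier:
        return "preset_stitched_mode"
    return "conventional_mode"
```

A receiving-end device that cannot recognize the identifier would follow the same fallback branch and display only the first image conventionally.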
  • adding the first mark to the image by the electronic device can cause the electronic device to display the spliced image according to a preset mode. That is to say, the electronic device can simultaneously have the characteristics of the wide-angle image, the characteristics of the main image and the characteristics of the telephoto image in one image, thereby ensuring that the user can view the wide-angle image, the main image and the telephoto image at the same time. In this way, the image quality viewed by the user can be improved, and the user experience is improved.
  • the electronic device may share the first image, the second image and the third image with other electronic devices. Specifically, the electronic device may share the first image, the second image, and the third image with other electronic devices by transmitting a data packet. Alternatively, the electronic device may share the first image, the second image and the third image with other electronic devices by transmitting the first image, the second image and the third image respectively.
  • the electronic device may share the first image, the second image and the third image with other electronic devices by transmitting a data packet.
  • the electronic device may send a data packet to other electronic devices (which may be referred to as a receiving end device), where the data packet includes the first image, the second image, and the third image.
  • the receiver device can receive the data packet, and save the first image, the second image and the third image.
  • the receiver device may store the first image in a visible form, and store the second image and the third image in an invisible form (for example, FIG. 4A ). If the receiving end device can recognize the first identifier, the receiving end device can display the spliced image.
  • the receiving-end device may save the above-mentioned first image, second image, and third image in a visible form (for example, FIG. 4B). If the receiving-end device cannot recognize the first identifier, the receiving-end device displays only the first image.
  • the electronic device may share the first image, the second image and the third image with other electronic devices by transmitting the first image, the second image and the third image respectively.
  • the electronic device may send the first image, the second image, and the third image to the receiving end device.
  • the receiver device may save the first image, the second image and the third image in a visible form.
  • the receiving end device can display the spliced image.
  • the receiving-end device may save the above-mentioned first image in a visible form, and save the second image and the third image in an invisible form. If the receiving-end device cannot recognize the first identifier, the receiving-end device displays only the first image.
  • the receiving end device can recognize the first identifier, the receiving end device can also display the spliced image. That is to say, the receiver device can have the characteristics of the wide-angle image, the main image and the telephoto image in one image at the same time, thereby ensuring that the user can view the wide-angle image, the main image and the telephoto image at the same time. In this way, the image quality viewed by the user can be improved, and the user experience is improved.
  • the electronic device includes hardware structures and/or software modules corresponding to each function.
  • the steps of the image display method described in the embodiments disclosed in this application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or by software driving the hardware depends on the specific application and the design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementations should not be regarded as exceeding the scope of this application.
  • the image display device can be divided into functional modules or functional units according to the above method examples.
  • each functional module or functional unit can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, or in the form of software function modules or functional units.
  • the division of modules or units in the embodiment of the present application is schematic, and is only a logical function division, and there may be another division manner in actual implementation.
  • the electronic device may include memory and one or more processors.
  • the memory is coupled to the processor.
  • the electronic device may also include a camera. Alternatively, the electronic device can be connected with an external camera.
  • the memory is used to store computer program code comprising computer instructions.
  • the processor executes the computer instructions, the electronic device can execute various functions or steps performed by the mobile phone in the foregoing method embodiments.
  • the chip system includes at least one processor 1501 and at least one interface circuit 1502 .
  • the processor 1501 and the interface circuit 1502 can be interconnected through wires.
  • interface circuit 1502 may be used to receive signals from other devices, such as memory of an electronic device.
  • the interface circuit 1502 may be used to send signals to other devices (such as the processor 1501).
  • the interface circuit 1502 can read instructions stored in the memory, and send the instructions to the processor 1501 .
  • the electronic device such as the mobile phone 200 as shown in FIG. 2A
  • the chip system may also include other discrete devices, which is not specifically limited in this embodiment of the present application.
  • the embodiment of the present application also provides a computer-readable storage medium.
  • the computer-readable storage medium includes computer instructions.
  • the device executes various functions or steps executed by the mobile phone in the foregoing method embodiments.
  • the embodiment of the present application also provides a computer program product, which, when the computer program product is run on a computer, causes the computer to execute each function or step performed by the mobile phone in the method embodiment above.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • a unit described as a separate component may or may not be physically separate, and a component displayed as a unit may be one physical unit or multiple physical units; that is, it may be located in one place, or it may be distributed across multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • the integrated unit is realized in the form of a software function unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solutions of the embodiments of this application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the embodiments of this application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.


Abstract

An image display method and an electronic device, relating to the technical field of electronic devices, which can simplify the process by which an electronic device displays images having different characteristics. A specific solution includes: the electronic device may receive a first operation of a user on a first interface, where the first interface is a viewfinder interface for photographing by the electronic device and includes a preview image captured by a first camera. In response to the first operation, the electronic device may save a first image captured by the first camera and a second image captured by a second camera. In response to a second operation, the electronic device may display the first image. Afterwards, the electronic device may receive a third operation, where the third operation is used to trigger the electronic device to display a first area image in the enlarged first image, and the first area image includes a first target area image. In response to the third operation, the electronic device may stitch the second image onto the first target area image for display.

Description

An image display method and electronic device
This application claims priority to Chinese patent application No. 202110506826.5, entitled "An image display method and electronic device", filed with the China National Intellectual Property Administration on May 10, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of this application relate to the technical field of electronic devices, and in particular to an image display method and an electronic device.
Background
With the development of electronic technology, electronic devices (such as mobile phones, tablet computers, or smart watches) have more and more functions. For example, most electronic devices can be equipped with cameras, giving them the ability to capture images.
Taking a mobile phone as an example, multiple cameras may be installed in the phone, such as a main camera, a telephoto camera, and a wide-angle camera. Based on the characteristics of these cameras, the phone can capture images of the same scene with different cameras to obtain images with different characteristics. For example, based on the long focal length of the telephoto camera, the phone can use the telephoto camera to capture a telephoto image that is locally sharp. As another example, based on the large light intake and high resolution of the main camera, the phone can use the main camera to capture an image that is sharp overall. As another example, based on the short focal length and large field of view of the wide-angle camera, the phone can use the wide-angle camera to capture an image with a larger field of view. The user can then view these images with different characteristics on the phone.
However, in the above technical solutions, the electronic device needs to switch the displayed image before the user can view images with different characteristics on the phone. The process of displaying images with different characteristics is rather cumbersome, which degrades the user experience.
Summary
This application provides an image display method and an electronic device, which can simplify the process by which an electronic device displays images with different characteristics and improve the user experience.
第一方面,本申请提供一种图像的显示方法,该方法可以应用于电子设备,该电子设备包括显示屏、第一摄像头和第二摄像头,第一摄像头的视场角大于第二摄像头的视场角。
该方法中,电子设备可以接收用户在第一界面的第一操作,第一界面是电子设备拍照的取景界面,第一界面包括第一摄像头采集的预览图像。响应于第一操作,电子设备可以保存第一摄像头采集的第一图像,并保存第二摄像头采集的第二图像,第一摄像头采集第一图像的取景范围是第一取景范围,第二摄像头采集第二图像的取景范围是第二取景范围,第一取景范围大于第二取景范围。之后,电子设备可以接收第二操作,第二操作用于触发电子设备显示第一图像。响应于第二操作,电子设备可以显示第一图像。之后,电子设备可以接收第三操作,第三操作用于触发电子设备显示放大后的第一图像中的第一区域图像,第一区域图像包括第一目标区域图像,第一目标区域图像相对于第一摄像头的取景范围是第三取景范围,第一取景范围包括第三取景范围,第三取景范围与第二取景范围重合。响应于第三操作,电子设备可以将第二图 像拼接在第一目标区域图像上显示。
基于上述技术方案可知,电子设备在接收到第三操作之后,可以显示第一区域图像中除第一目标区域图像以外的区域和第二图像。如此,电子设备可以在一张图像中同时具备第一图像的特点和第二图像的特点,进而可以保障用户同时查看第一图像和第二图像。并且,相较于常规技术方案,本申请技术方案可以简化电子设备显示具备不同特点图像的过程,提高用户的使用体验。
结合第一方面,在一种可能的设计方式中,在电子设备接收第二操作之后,该方法还包括:响应于第二操作,电子设备可以将第二图像拼接在第一图像中第二目标区域图像上显示,第二目标区域图像相对于第一摄像头的取景范围与第二取景范围重合。
也就是说,电子设备显示第一图像时,无需接收用户的操作(例如第三操作),可以直接将第二图像拼接在第一图像中第二目标区域图像上显示。如此,电子设备可以同时显示第一图像中的部分区域和第二图像。相较于常规技术方案,本申请技术方案可以简化电子设备显示具备不同特点图像的过程,提高用户的使用体验。
结合第一方面,在另一种可能的设计方式中,在电子设备接收第三操作之后,方法还包括:电子设备可以获取第一区域图像的分辨率和显示屏的分辨率。电子设备可以根据第一区域图像的分辨率和显示屏的分辨率,计算得到第一比值,第一比值为第一区域图像的分辨率与显示屏的分辨率之间的比值。之后,若第一比值大于第一预设比值,电子设备则将第二图像拼接在第一目标区域图像上显示。
可以理解的是,电子设备通过比较第一比值和第一预设比值,可以控制将第二图像拼接在第一目标区域图像上显示的时机。如此,可以使电子设备在预设条件下显示第二图像,提高了用户的使用体验。
结合第一方面,在另一种可能的设计方式中,在电子设备接收第三操作之后,方法还包括:电子设备可以获取第一倍率,第一倍率为第三操作触发电子设备放大第一图像后的缩放倍率。若第一倍率大于第一预设缩放倍率,电子设备则将第二图像拼接在第一目标区域图像上显示。
可以理解的是,电子设备通过比较第一倍率和第一预设缩放倍率,可以控制将第二图像拼接在第一目标区域图像上显示。如此,可以使电子设备在预设条件下显示第二图像,提高了用户的使用体验。
结合第一方面,在另一种可能的设计方式中,电子设备获取第一倍率,可以包括:电子设备可以获取第一图像的缩放倍率、第一图像的分辨率和第一区域图像的分辨率。之后,电子设备根据第一图像的缩放倍率、第一图像的分辨率和第一区域图像的分辨率,计算得到第一倍率。
可以理解的是,电子设备获取第一图像的缩放倍率、第一图像的分辨率和第一区域图像的分辨率之后,可以计算得到第一倍率,进而控制将第二图像拼接在第一目标区域图像上显示。如此,可以使电子设备在预设条件下显示第二图像,提高了用户的使用体验。
结合第一方面,在另一种可能的设计方式中,第一倍率满足下述公式:
M = Z × √(A/B)
其中,M用于表示第一倍率,B用于表示第一图像的分辨率,A用于表示第一区域图像的分辨率,Z用于表示第一图像的缩放倍率。
结合第一方面,在另一种可能的设计方式中,响应于第三操作,电子设备可以对第一区域图像中除第一目标区域图像以外的区域进行模糊处理。
可以理解的是,电子设备对第一区域图像中除第一目标区域图像以外的区域进行模糊处理,可以减小第二图像与第一区域图像的拼接处的异常程度。如此,可以提高电子设备显示的图像质量,提高了用户的使用体验。
结合第一方面,在另一种可能的设计方式中,电子设备还包括第三摄像头,第二摄像头的视场角大于第三摄像头的视场角。该方法还包括:响应于拍照操作,电子设备可以保存通过第三摄像头采集的第三图像,第三摄像头采集第三图像的取景范围是第四取景范围,第二取景范围大于第四取景范围。
结合第一方面,在另一种可能的设计方式中,电子设备可以接收第四操作,第四操作用于触发电子设备显示放大后的第二图像中的第二区域图像,第二区域图像包括第三目标区域图像,第三目标区域图像相对于第二摄像头的取景范围是第五取景范围,第二取景范围包括第五取景范围,第五取景范围与第四取景范围重合。电子设备可以获取第二区域图像的分辨率和显示屏的分辨率。电子设备可以根据第二区域图像的分辨率和显示屏的分辨率,计算得到第二比值,第二比值为第二区域图像的分辨率与显示屏的分辨率之间的比值。若第二比值大于第一预设比值,电子设备则将第三图像拼接在第三目标区域图像上显示。
也就是说,电子设备在接收到第三操作之后,可以显示第二区域图像中除第三目标区域图像以外的区域和第三图像。如此,电子设备可以在一张图像中同时具备第二图像的特点和第三图像的特点,进而可以保障用户同时查看第二图像和第三图像。并且,相较于常规技术方案,本申请技术方案可以简化电子设备显示具备不同特点图像的过程,提高用户的使用体验。
结合第一方面,在另一种可能的设计方式中,电子设备可以接收第四操作,第四操作用于触发电子设备显示放大后的第二图像中的第二区域图像,第二区域图像包括第三目标区域图像,第三目标区域图像相对于第二摄像头的取景范围是第五取景范围,第二取景范围包括第五取景范围,第五取景范围与第四取景范围重合。电子设备可以获取第二倍率,第二倍率为第四操作触发电子设备放大第二图像后的缩放倍率。若第二倍率大于第二预设缩放倍率,电子设备则将第三图像拼接在第三目标区域图像上显示。
也就是说,电子设备在接收到第三操作之后,可以显示第二区域图像中除第三目标区域图像以外的区域和第三图像。如此,电子设备可以在一张图像中同时具备第二图像的特点和第三图像的特点,进而可以保障用户同时查看第二图像和第三图像。并且,相较于常规技术方案,本申请技术方案可以简化电子设备显示具备不同特点图像的过程,提高用户的使用体验。
第二方面,本申请提供一种电子设备,该电子设备包括:存储器、显示屏和一个或多个处理器,上述存储器、显示屏与上述处理器耦合;存储器用于存储计算机程序代码,计算机程序代码包括计算机指令;当计算机指令被上述一个或多个处理器执行 时,上述处理器,用于接收用户在第一界面的第一操作,第一界面是电子设备拍照的取景界面,第一界面包括第一摄像头采集的预览图像。上述存储器,用于响应于第一操作,保存第一摄像头采集的第一图像,并保存第二摄像头采集的第二图像,第一摄像头采集第一图像的取景范围是第一取景范围,第二摄像头采集第二图像的取景范围是第二取景范围,第一取景范围大于第二取景范围。上述显示屏,用于响应于第二操作,显示第一图像。上述处理器,还用于接收第三操作,第三操作用于触发电子设备显示放大后的第一图像中的第一区域图像,第一区域图像包括第一目标区域图像,第一目标区域图像相对于第一摄像头的取景范围是第三取景范围,第一取景范围包括第三取景范围,第三取景范围与第二取景范围重合。上述处理器,还用于响应于第三操作,将第二图像拼接在第一目标区域图像上显示。
结合第二方面,在一种可能的设计方式中,当计算机指令被上述一个或多个处理器执行时,上述处理器,还用于响应于第二操作,将第二图像拼接在第一图像中第二目标区域图像上显示,第二目标区域图像相对于第一摄像头的取景范围与第二取景范围重合。
结合第二方面,在另一种可能的设计方式中,当计算机指令被上述一个或多个处理器执行时,上述处理器,还用于获取第一区域图像的分辨率和显示屏的分辨率。上述处理器,还用于根据第一区域图像的分辨率和显示屏的分辨率,计算得到第一比值,第一比值为第一区域图像的分辨率与显示屏的分辨率之间的比值。上述处理器,还用于若第一比值大于第一预设比值,则将第二图像拼接在第一目标区域图像上显示。
结合第二方面,在另一种可能的设计方式中,当计算机指令被上述一个或多个处理器执行时,上述处理器,还用于获取第一倍率,第一倍率为第三操作触发电子设备放大第一图像后的缩放倍率。上述处理器,还用于若第一倍率大于第一预设缩放倍率,则将第二图像拼接在第一目标区域图像上显示。
结合第二方面,在另一种可能的设计方式中,当计算机指令被上述一个或多个处理器执行时,上述处理器,还用于获取第一图像的缩放倍率、第一图像的分辨率和第一区域图像的分辨率。上述处理器,还用于根据第一图像的缩放倍率、第一图像的分辨率和第一区域图像的分辨率,计算得到第一倍率。
结合第二方面,在另一种可能的设计方式中,第一倍率满足下述公式:
M = Z × √(A/B)
其中,M用于表示第一倍率,B用于表示第一图像的分辨率,A用于表示第一区域图像的分辨率,Z用于表示第一图像的缩放倍率。
结合第二方面,在另一种可能的设计方式中,计算机指令被上述一个或多个处理器执行时,上述处理器,还用于响应于第三操作,对第一区域图像中除第一目标区域图像以外的区域进行模糊处理。
结合第二方面,在另一种可能的设计方式中,电子设备还包括第三摄像头,第二摄像头的视场角大于第三摄像头的视场角。当计算机指令被上述一个或多个处理器执行时,上述存储器,还用于响应于拍照操作,保存通过第三摄像头采集的第三图像,第三摄像头采集第三图像的取景范围是第四取景范围,第二取景范围大于第四取景范 围。
结合第二方面,在另一种可能的设计方式中,当计算机指令被上述一个或多个处理器执行时,上述处理器,还用于接收第四操作,第四操作用于触发显示屏显示放大后的第二图像中的第二区域图像,第二区域图像包括第三目标区域图像,第三目标区域图像相对于第二摄像头的取景范围是第五取景范围,第二取景范围包括第五取景范围,第五取景范围与第四取景范围重合。上述处理器,还用于获取第二区域图像的分辨率和显示屏的分辨率。上述处理器,还用于根据第二区域图像的分辨率和显示屏的分辨率,计算得到第二比值,第二比值为第二区域图像的分辨率与显示屏的分辨率之间的比值。上述处理器,还用于若第二比值大于第一预设比值,则将第三图像拼接在第三目标区域图像上显示。
结合第二方面,在另一种可能的设计方式中,当计算机指令被上述一个或多个处理器执行时,上述处理器,还用于接收第四操作,第四操作用于触发显示屏显示放大后的第二图像中的第二区域图像,第二区域图像包括第三目标区域图像,第三目标区域图像相对于第二摄像头的取景范围是第五取景范围,第二取景范围包括第五取景范围,第五取景范围与第四取景范围重合。上述处理器,还用于获取第二倍率,第二倍率为第四操作触发电子设备放大第二图像后的缩放倍率。上述处理器,还用于若第二倍率大于第二预设缩放倍率,则将第三图像拼接在第三目标区域图像上显示。
第三方面,本申请提供一种电子设备,该电子设备包括:存储器、显示屏和一个或多个处理器,上述存储器、显示屏与上述处理器耦合;存储器用于存储计算机程序代码,计算机程序代码包括计算机指令;当计算机指令被上述一个或多个处理器执行时,使得电子设备执行如第一方面及其任一种可能的设计方式所述的方法。
第四方面,本申请提供一种芯片系统,该芯片系统应用于电子设备。该芯片系统包括一个或多个接口电路和一个或多个处理器。该接口电路和处理器通过线路互联。该接口电路用于从电子设备的存储器接收信号,并向处理器发送该信号,该信号包括存储器中存储的计算机指令。当处理器执行所述计算机指令时,电子设备执行如第一方面及其任一种可能的设计方式所述的方法。
第五方面,本申请提供一种计算机可读存储介质,该计算机可读存储介质包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如第一方面及其任一种可能的设计方式所述的方法。
第六方面,本申请提供一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如第一方面及其任一种可能的设计方式所述的方法。
可以理解地,上述提供的第二方面及其任一种可能的设计方式所述的电子设备,第三方面所述的电子设备,第四方面所述的芯片系统,第五方面所述的计算机可读存储介质,第六方面所述的计算机程序产品所能达到的有益效果,可参考如第一方面及其任一种可能的设计方式中的有益效果,此处不再赘述。
Brief Description of the Drawings
FIG. 1 is a schematic diagram of an example of an image viewing interface according to an embodiment of this application;
FIG. 2A is a schematic diagram of a hardware structure of an electronic device according to an embodiment of this application;
FIG. 2B is a schematic diagram of an example of a viewfinder range according to an embodiment of this application;
FIG. 3 is a schematic diagram of an example of an image preview interface according to an embodiment of this application;
FIG. 4A is a schematic diagram of an example of an image saving interface according to an embodiment of this application;
FIG. 4B is a schematic diagram of another example of an image saving interface according to an embodiment of this application;
FIG. 5 is a schematic diagram of an example of displaying an image according to an embodiment of this application;
FIG. 6 is a schematic diagram of another example of an image viewing interface according to an embodiment of this application;
FIG. 7 is a schematic diagram of an example of the viewfinder range of an image according to an embodiment of this application;
FIG. 8 is a schematic diagram of another example of an image viewing interface according to an embodiment of this application;
FIG. 9 is a schematic diagram of another example of an image preview interface according to an embodiment of this application;
FIG. 10 is a schematic diagram of another example of an image viewing interface according to an embodiment of this application;
FIG. 11 is a schematic diagram of another example of an image preview interface according to an embodiment of this application;
FIG. 12 is a schematic diagram of another example of an image viewing interface according to an embodiment of this application;
FIG. 13 is a schematic diagram of another example of an image preview interface according to an embodiment of this application;
FIG. 14 is a schematic diagram of another example of an image viewing interface according to an embodiment of this application;
FIG. 15 is a schematic diagram of the structural composition of a chip system according to an embodiment of this application.
Detailed Description
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其它实施例,都属于本申请保护的范围。
本申请中字符“/”,一般表示前后关联对象是一种“或者”的关系。例如,A/B可以理解为A或者B。
术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
此外,本申请的描述中所提到的术语“包括”和“具有”以及它们的任何变形,意图在于覆盖不排他的包含。例如包含了一系列步骤或模块的过程、方法、系统、产品或设备没有限定于已列出的步骤或模块,而是可选地还包括其他没有列出的步骤或模块,或可选地还包括对于这些过程、方法、产品或设备固有的其它步骤或模块。
另外,在本申请实施例中,“示例性的”、或者“例如”等词用于表示作例子、例证或说明。本申请中被描述为“示例性的”或“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”、或者“例如”等词旨在以具体方式呈现概念。
为了便于理解本申请的技术方案,在对本申请实施例的深度图像的获取方法进行详细介绍之前,先对本申请实施例中所提到的专业名词进行介绍。
1、超分辨率重建。
超分辨率重建指的是就是利用一幅或者一组低质量、低分辨率图像生成一幅高质量、高分辨率图像。其中超分辨率重建可以包括基于重建的方法或者基于学习的方法。
2、图像信号处理(Image Signal Processing,ISP)模块。
在摄像头采集到原始图像(即RAW格式的图像)之后,电子设备可以将原始图像传送到ISP模块中。其中,RAW格式是未经处理、也未经压缩的格式。之后,ISP模块可以对原始图像进行分析,检查图像中相邻像素之间的密度差距。接着,ISP模块可以使用ISP模块中的预设调节算法对该原始图像进行适当调整,以提高摄像头采集的图像质量。
对本申请实施例中所提到的专业名词进行介绍之后,下面对常规技术进行介绍。
随着电子技术的发展,电子设备(如手机、平板电脑或智能手表等)的功能越来越多。以手机为例,手机中可以安装多个摄像头,如主摄像头、长焦摄像头和广角摄像头等。手机可以在同一个拍摄场景下,采用不同的摄像头拍摄图像,以得到不同特点的图像。
常规技术中,当用户通过电子设备拍摄同一场景下的不同图像(例如主图像、长焦图像和广角图像)时,用户需要切换电子设备的拍摄模式,才可以得到同一场景下的不同图像。其中,主图像是电子设备通过主摄像头的图像,长焦图像是电子设备通过长焦摄像头采集的图像,广角图像是电子设备通过广角摄像头采集的图像。并且,在电子设备拍摄得到主图像、长焦图像和广角图像之后,电子设备需要响应于用户的切换操作,显示具备不同特点的图像。
示例性的,如图1中的(a)所示,电子设备显示图像显示界面101,该图像显示界面101包括广角图像102。响应于用户的切换操作(例如向左滑动的操作),电子设备可以将显示的广角图像102切换为如图1中的(b)所示的主图像103。之后,响应于用户的切换操作,电子设备可以将显示的主图像103切换为如图1中的(c)所示的长焦图像104。
然而,上述方案中,电子设备不仅需要响应于用户的多次操作,才可以拍摄得到多张具备不同特点的图像。并且,电子设备需要切换显示的图像,才可以使用户通过手机查看到不同特点的图像。
为此,本申请实施例提供一种图像的显示方法。该方法中,电子设备可以响应于用户的一次拍照操作,通过不同的摄像头拍摄得到第一图像和第二图像,第一图像的取景范围大于第二图像的取景范围。如此,电子设备只需响应于用户的一次操作,便可以得到多张具备不同特点的图像,简化了电子设备的拍摄过程。
并且,在电子设备显示第一图像时,若电子设备接收到放大第一图像的缩放倍率的操作,电子设备可以在第一图像的上层显示第二图像。如此,电子设备可以同时显示第一图像和第二图像,使用户可以同时查看到具备不同特点的图像,提高了用户的使用体验。
需要说明的是，本申请实施例中电子设备通过摄像头采集的图像可以为：ISP对摄像头采集到的原始图像进行调整后的图像。示例性的，电子设备通过主摄像头采集到的图像为：由ISP对主摄像头采集到的原始图像进行调整后的图像；电子设备通过长焦摄像头采集到的图像为：由ISP对长焦摄像头采集到的原始图像进行调整后的图像；电子设备通过广角摄像头采集到的图像为：由ISP对广角摄像头采集到的原始图像进行调整后的图像。可选的，本申请实施例中电子设备通过摄像头采集的图像可以为RAW格式的图像（即原始图像），本申请实施例对此不作限定。其中，RAW格式的图像是一种记录了摄像头传感器的原始信息，同时记录了由摄像头拍摄图像所产生的一些元数据（ISO的设置、快门速度、光圈值、白平衡等）的图像，且该图像未被ISP模块进行处理。其中，ISO是国际标准化组织（International Organization for Standardization）的缩写。
示例性的,本申请实施例中的电子设备可以是平板电脑、手机、桌面型、膝上型、手持计算机、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本,以及蜂窝电话、个人数字助理(personal digital assistant,PDA)、增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备、车载设备等设备,本申请实施例对该电子设备的具体形态不作特殊限制。
本申请提供的图像的显示方法的执行主体可以为图像的显示装置,该执行装置可以为图2A所示的电子设备。同时,该执行装置还可以为该电子设备的中央处理器(Central Processing Unit,CPU),或者该电子设备中的用于融合图像的控制模块。本申请实施例中以电子设备执行图像的显示方法为例,说明本申请实施例提供的图像的显示方法。
请参考图2A,本申请这里以电子设备为图2A所示的手机200为例,对本申请提供的电子设备进行介绍。其中,图2A所示的手机200仅仅是电子设备的一个范例,并且手机200可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图2A中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
如图2A所示,手机200可以包括:处理器210,外部存储器接口220,内部存储器221,通用串行总线(universal serial bus,USB)接口230,充电管理模块240,电源管理模块241,电池242,天线1,天线2,移动通信模块250,无线通信模块260,音频模块270,扬声器270A,受话器270B,麦克风270C,耳机接口270D,传感器模块280,按键290,马达291,指示器292,摄像头293,显示屏294,以及用户标识模块(subscriber identification module,SIM)卡接口295等。
其中,上述传感器模块280可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器和骨传导传感器等传感器。
处理器210可以包括一个或多个处理单元,例如:处理器210可以包括存储器,视频编解码器,基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以是手机200的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器210中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器210中的存储器为高速缓冲存储器。
在一些实施例中,处理器210可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
可以理解的是,本实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对手机200的结构限定。在另一些实施例中,手机200也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块240用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。充电管理模块240为电池242充电的同时,还可以通过电源管理模块241为电子设备供电。
电源管理模块241用于连接电池242,充电管理模块240与处理器210。电源管理模块241接收电池242和/或充电管理模块240的输入,为处理器210,内部存储器221,外部存储器,显示屏294,摄像头293,和无线通信模块260等供电。在一些实施例中,电源管理模块241和充电管理模块240也可以设置于同一个器件中。
手机200的无线通信功能可以通过天线1,天线2,移动通信模块250,无线通信模块260,调制解调处理器以及基带处理器等实现。在一些实施例中,手机200的天线1和移动通信模块250耦合,天线2和无线通信模块260耦合,使得手机200可以通过无线通信技术与网络以及其他设备通信。
天线1和天线2用于发射和接收电磁波信号。手机200中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。移动通信模块250可以提供应用在手机200上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块250可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块250还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块250的至少部分功能模块可以被设置于处理器210中。
无线通信模块260可以提供应用在手机200上的包括无线局域网(wireless local area networks,WLAN)(如(wireless fidelity,Wi-Fi)网络),调频(frequency modulation,FM),红外技术(infrared,IR)等无线通信的解决方案。例如,本申请实施例中,手机200可以通过无线通信模块260接入Wi-Fi网络。无线通信模块260可以是集成至少一个通信处理模块的一个或多个器件。
手机200通过GPU,显示屏294,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏294和应用处理器。
显示屏294用于显示图像,视频等。该显示屏294包括显示面板。例如,本申请实施例中,显示屏294可以用于显示图库界面和拍摄界面等。
手机200可以通过ISP,摄像头293,视频编解码器,GPU,显示屏294以及应用处理器等实现拍摄功能。ISP用于处理摄像头293反馈的数据。摄像头293用于捕获静态图像或视频。在一些实施例中,手机200可以包括1个或N个摄像头293,N为大于1的正整数。
在本申请实施例中,N个摄像头293可以包括:主摄像头、长焦摄像头和广角摄像头。可选的,N个摄像头293还可以包括:红外摄像头、深度摄像头或者黑白摄像头等至少一种摄像头。下面简单介绍上述各个摄像头的特点(即优势和劣势)以及适用场景。
(1)主摄像头。主摄像头具有进光量大、分辨率高,以及视场角适中的特点。主 摄像头一般作为电子设备(如手机)的默认摄像头。也就是说,电子设备(如手机)响应于用户启动“相机”应用的操作,可以默认启动主摄像头,在预览界面显示主摄像头采集的图像。
(2)长焦摄像头。长焦摄像头的焦距较长,可适用于拍摄距离手机较远的拍摄对象(即远处的物体)。但是,长焦摄像头的进光量较小。在暗光场景下使用长焦摄像头拍摄图像,可能会因为进光量不足而影响图像质量。并且,长焦摄像头的视场角较小,不适用于拍摄较大场景的图像,即不适用于拍摄较大的拍摄对象(如建筑或风景等)。
(3)广角摄像头。广角摄像头的视场角较大,可适用于拍摄较大的拍摄对象(例如风景)。但是,广角摄像头的焦距较短,在广角摄像头拍摄距离较近的物体时,拍摄得到的广角图像中物体容易产生畸变(例如图像中的物体相较于原物体变得宽扁)。
(4)黑白摄像头。由于黑白摄像头没有滤光片;因此,相比于彩色摄像头而言,黑白摄像头的进光量较大;并且,黑白摄像头的对焦速度比彩色摄像头的对焦速度快。但是,黑白摄像头采集到的图像只能呈现出不同等级的灰度,不能呈现出拍摄对象的真实色彩。需要说明的是,上述主摄像头、长焦摄像头等均为彩色摄像头。
需要说明的是,本申请实施例中的视场角包括水平视场角和垂直视场角。
需要说明的是,手机200可以具备多种拍摄模式。例如,该多种拍摄模式包括:广角拍摄模式、普通拍摄模式和长焦拍摄模式等。手机200采用不同的拍摄模式,可以通过不同的摄像头拍摄图像。例如,当手机200的拍摄模式为广角拍摄模式时,手机200可以通过广角摄像头和主摄像头采集图像(或者通过广角摄像头、主摄像头和长焦摄像头采集图像)。又例如,当手机200的拍摄模式为普通拍摄模式时,手机200可以通过主摄像头和长焦摄像头采集图像。又例如,当手机200的拍摄模式为长焦拍摄模式时,手机200可以通过长焦摄像头拍摄图像。
外部存储器接口220可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机200的存储能力。外部存储卡通过外部存储器接口220与处理器210通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器221可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器210通过运行存储在内部存储器221的指令,从而执行手机200的各种功能应用以及数据处理。例如,在本申请实施例中,处理器210可以通过执行存储在内部存储器221中的指令,内部存储器221可以包括存储程序区和存储数据区。
其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储手机200使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器221可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
手机200可以通过音频模块270,扬声器270A,受话器270B,麦克风270C,耳机接口270D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
按键290包括开机键,音量键等。按键290可以是机械按键。也可以是触摸式按键。马达291可以产生振动提示。马达291可以用于来电振动提示,也可以用于触摸 振动反馈。指示器292可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。SIM卡接口295用于连接SIM卡。SIM卡可以通过插入SIM卡接口295,或从SIM卡接口295拔出,实现和手机200的接触和分离。手机200可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口295可以支持Nano SIM卡,Micro SIM卡,SIM卡等。
尽管图2A未示出,手机200还可以闪光灯、微型投影装置、近场通信(Near Field Communication,NFC)装置等,在此不再赘述。
可以理解的是,本实施例示意的结构并不构成对手机200的具体限定。在另一些实施例中,手机200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
以下实施例中的方法均可以在具有上述硬件结构的电子设备中实现。以下实施例中以上述具有上述硬件结构的电子设备为例,对本申请实施例的方法进行说明。
在一些实施例中,电子设备中可以包括第一摄像头和第二摄像头。其中,第一摄像头的视场角大于第二摄像头的视场角。具体的,第一摄像头的视场角大于第二摄像头的视场角是指,第一摄像头的水平视场角大于第二摄像头的水平视场角,第一摄像头的垂直视场角大于第二摄像头的垂直视场角。
第一摄像头采集第一图像的取景范围是第一取景范围,第二摄像头采集第二图像的取景范围是第二取景范围。第一取景范围大于第二取景范围。
需要说明的是,本申请实施例中摄像头采集图像的取景范围是指,摄像头可以拍摄的区域范围。示例性的,如图2B所示,主摄像头可以采集到区域202中区域203对应的图像。也就是说,区域202中除区域203以外的区域不在主摄像头采集图像的取景范围内。并且,本申请实施例中,图像的取景范围与摄像头采集图像的取景范围相对应。例如,第一图像的取景范围可以指示第一摄像头采集第一图像的取景范围(即第一取景范围)。又例如,第二图像的取景范围可以指示第二摄像头采集第二图像的取景范围(即第二取景范围)。
在本申请实施例中,第一摄像头和第二摄像头可以有组合方式(1)、组合方式(2)和组合方式(3)等三种组合方式。组合方式(1)中,第一摄像头可以为广角摄像头,第二摄像头可以为主摄像头。组合方式(2)中,第一摄像头可以为广角摄像头,第二摄像头为长焦摄像头。组合方式(3)中,第一摄像头可以为主摄像头,第二摄像头可以为长焦摄像头。下面分别结合组合方式(1)、组合方式(2)和组合方式(3)对本申请实施例进行说明。
在一些实施例中,第一摄像头和第二摄像头可以为组合方式(1),即第一摄像头可以为广角摄像头,第二摄像头可以为主摄像头。电子设备可以启动拍摄应用,显示图像预览界面(即第一界面),该图像预览界面是电子设备拍照的取景界面,该图像预览界面包括广角摄像头采集的第一图像。示例性的,如图3中的(a)所示,在电子设备启动拍摄应用之后,电子设备的拍摄模式为广角拍摄模式。电子设备可以显示图像预览界面301,该图像预览界面301包括取景框302、摄像头转化键303、拍摄键304、相册键305、预览图像306、闪光灯选项307、“录像”选项、“拍照”选项、“更多” 选项等。其中,预览图像306为广角摄像头采集的图像。可选的,该预览图像306可以为电子设备对广角图像和主图像进行图像处理之后的图像(具体可以参考下述描述,此处不予赘述)。之后,电子设备可以接收用户的拍照操作。响应于用户在图像预览界面的拍照操作,电子设备可以在同一时刻通过广角摄像头采集第一图像(即广角图像),通过主摄像头采集第二图像(即主图像),广角图像的取景范围大于主图像的取景范围。示例性的,图3中的(b)所示的广角图像308为广角摄像头采集的图像,图3中的(c)所示的主图像309为主摄像头采集的图像,广角图像308的取景范围大于主图像309的取景范围。
需要说明的是，上述电子设备在同一时刻通过广角摄像头采集广角图像，通过主摄像头采集主图像是指：广角摄像头采集广角图像的时刻（如第一时刻）和主摄像头采集主图像的时刻（如第二时刻）相同；或者，第一时刻与第二时刻之间的时间差较小（例如时间差小于0.5毫秒、1毫秒或者2毫秒等）。在第一时刻与第二时刻之间存在时间差时，本申请实施例对广角摄像头采集广角图像、主摄像头采集主图像的顺序不作限定。
在本申请实施例中,响应于拍照操作(也可以称为第一操作),在电子设备通过广角摄像头采集广角图像,通过主摄像头采集主图像之后,电子设备可以保存该广角图像和主图像。其中,电子设备可以通过方式(a)和方式(b)保存广角图像和主图像。方式(a)为电子设备以可见形式保存广角图像,以不可见形式保存主图像。方式(b)为以可见形式保存广角图像和主图像。
一种可能的设计中,电子设备以方式(a)保存广角图像和主图像,即电子设备以可见形式保存广角图像,以不可见形式保存主图像。示例性的,如图4A所示,电子设备可以显示图库界面401,该图库界面401包括广角图像308,该图库界面401未显示主图像。
另一种可能的设计中，电子设备以方式（b）保存广角图像和主图像，即电子设备以可见形式保存广角图像和主图像。示例性的，如图4B所示，电子设备显示的图库界面401包括广角图像308和主图像309。以下实施例中，以电子设备以方式（a）保存广角图像和主图像为例，对本申请实施例进行介绍。
在本申请实施例中,电子设备可以接收第二操作,该第二操作用于触发电子设备显示广角图像。响应于第二操作,电子设备显示广角图像。示例性的,如图5中的(a)所示,电子设备可以接收用户作用于广角图像308的操作(例如点击操作),显示如图5中的(b)所示的图像查看界面501,该图像查看界面501包括广角图像308。可选的,响应于第二操作,电子设备可以将主图像拼接在广角图像中第二目标区域图像上显示,该第二目标区域图像相对于第一摄像头的取景范围与第二取景范围重合。也就是说,响应于第二操作,电子设备可以显示将广角图像与主图像进行拼接之后的图像。
之后,电子设备可以接收第三操作,该第三操作用于触发电子设备显示放大后的广角图像中的第一区域图像。其中,第一区域图像包括第一目标区域图像,第一目标区域图像相对于广角摄像头的取景范围是第三取景范围(即第一摄像头采用第三取景范围可以得到第一目标区域图像)。第一取景范围包括所述第三取景范围,第三取景 范围与第二取景范围重合。例如,第一取景范围为图6中的(a)所示的广角图像308的取景范围,第三取景范围为图6中的(a)所示的第一目标区域图像604的取景范围。以下实施例中,对于区域图像的取景范围的介绍,可以参考对于第一目标区域图像的取景范围的说明。
示例性的,如图6中的(a)所示,电子设备显示的图像查看界面601包括广角图像308。电子设备可以接收用户在图像查看界面601的第三操作(如扩张操作),触发电子设备显示如图6中的(b)所示的图像603,该图像603为放大后的广角图像308中的区域,图像603的取景范围与广角图像308中的区域图像602的取景范围相同。也就是说,图像603为放大后的区域图像602。图像603包括第一目标区域图像604,该第一目标区域图像604的取景范围与主图像(例如图3中的(c)所示的主图像309)的取景范围相同。
需要说明的是,在电子设备显示放大后的广角图像中的第一区域图像的情况下,由于第一区域图像的分辨率较低,电子设备显示的图像的质量较差,导致用户无法查看高质量的图像。
在一些实施例中,为了提高电子设备显示的图像质量,响应于第三操作,电子设备可以将主图像拼接在第一目标区域图像上显示。也就是说,响应于第三操作,电子设备可以显示第一区域图像中除第一目标区域图像以外的区域和主图像。示例性的,如图6中的(c)所示,电子设备可以将主图像309拼接在图像603的第一目标区域图像604上显示。
可以理解的是,电子设备在接收用户的第三操作之后,可以将主图像拼接在第一目标区域图像上显示。这样一来,电子设备可以同时显示广角图像中的部分区域和主图像。相较于常规技术方案,本申请技术方案可以简化电子设备显示具备不同特点图像的过程,提高用户的使用体验。
In one possible implementation, calibration information is stored in the electronic device. The calibration information includes the relationship between the field of view of the wide-angle camera and the field of view of the main camera, and the correspondence between the viewfinder range of the main image and the viewfinder range of the wide-angle image. In response to the third operation, the electronic device may draw and display the main image in the first target area image according to the calibration information.
Exemplarily, in this embodiment of the application, the electronic device may determine the first target area image in the following manner. The electronic device may store the two-dimensional coordinates, in the coordinate system of the wide-angle image's viewfinder range, of two diagonal corners of the viewfinder range of the main image (such as the upper-left and lower-right corners, or the upper-right and lower-left corners). These two-dimensional coordinates reflect the correspondence between the viewfinder range of the main image and that of the wide-angle image. The origin of the coordinate system of the wide-angle image's viewfinder range is any corner of the viewfinder range of the wide-angle image (such as the upper-left or lower-left corner), and the x-axis and y-axis are the two adjacent edges.
Please refer to FIG. 7, which shows an example coordinate system of the viewfinder range 710 of the wide-angle image. As shown in FIG. 7, point o is the origin, the x-axis is the lower edge of the viewfinder range 710, and the y-axis is the left edge of the viewfinder range 710. The electronic device may store the two-dimensional coordinates A1(x1, y1) and A2(x2, y2) of the upper-left corner A1 and the lower-right corner A2 of the viewfinder range 720 of the main image in the xoy coordinate system shown in FIG. 7. The two-dimensional coordinates A1(x1, y1) and A2(x2, y2) reflect the correspondence between the viewfinder range of the main image and the viewfinder range of the wide-angle image.
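A minimal sketch of how the stored diagonal coordinates might be mapped to a pixel rectangle in the wide-angle image. The function signature, the `extent` parameter, and the corner conventions are assumptions made for illustration, not details taken from the patent:

```python
def target_region_pixels(a1, a2, extent, width, height):
    """Map the stored diagonal corners A1 (upper-left) and A2 (lower-right)
    of the main image's viewfinder range, given in the wide-angle xoy
    coordinate system (origin at the lower-left corner, y pointing up),
    to a pixel rectangle in a wide-angle image of width x height pixels.
    `extent` is the (x_max, y_max) size of the full viewfinder range."""
    x_max, y_max = extent
    sx, sy = width / x_max, height / y_max
    (x1, y1), (x2, y2) = a1, a2
    left, right = round(x1 * sx), round(x2 * sx)
    # Image rows grow downward while the y-axis grows upward, so flip y.
    top, bottom = round((y_max - y1) * sy), round((y_max - y2) * sy)
    return left, top, right, bottom
```

The returned rectangle is where the main image would be drawn (stitched) inside the wide-angle image's coordinate frame.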
It can be understood that, after receiving the third operation, the electronic device can display the area of the first area image other than the first target area image together with the main image. In this way, the electronic device can have both the characteristics of the wide-angle image and the characteristics of the main image in one image, thereby ensuring that the user can view the wide-angle image and the main image at the same time. Moreover, the image quality of the main image is higher than that of the first target area image, and the viewfinder range of the main image is the same as that of the first target area image. Therefore, the image quality viewed by the user can be improved, and the user experience is improved.
需要说明的是,为了进一步提高用户的可操作性,电子设备可以控制电子设备将主图像拼接在第一目标区域图像上显示的时机。具体的,电子设备可以通过方法(a)和方法(b)控制电子设备将主图像拼接在第一目标区域图像上显示的时机。其中,方法(a)为电子设备通过图像的分辨率和显示屏的分辨率,控制电子设备将主图像拼接在第一目标区域图像上显示的时机。方法(b)为:电子设备可以根据图像的缩放倍率,控制电子设备将主图像拼接在第一目标区域图像上显示的时机。
在一些实施例中,电子设备可以通过方法(a),即电子设备通过图像的分辨率和显示屏的分辨率,控制电子设备将主图像拼接在第一目标区域图像上显示的时机。具体的,电子设备可以获取放大后的广角图像中第一区域图像的分辨率和显示屏的分辨率。之后,电子设备可以根据第一区域图像的分辨率和显示屏的分辨率,计算得到第一比值,该第一比值为第一区域图像的分辨率与显示屏的分辨率之间的比值。
示例性的,电子设备可以通过公式一计算得到第一比值。
Figure PCTCN2022079142-appb-000003
其中,N用于表示第一比值,C用于表示电子设备的显示屏的分辨率,A用于表示第一区域图像的分辨率。
示例性的,假如电子设备的显示屏的分辨率为2000×1000,第一区域图像的分辨率为2500×1000,则第一比值为:
Figure PCTCN2022079142-appb-000004
也就是说,第一比值为0.8。
需要说明的是,电子设备保存有第一区域图像的面积占放大后的广角图像的面积的比例。电子设备可以根据广角图像的分辨率和第一区域图像的面积占放大后的广角图像的面积的比例,确定第一区域图像的分辨率。其中,广角图像的缩放倍率越大,第一区域图像的面积占放大后的广角图像的面积的比例越小。示例性的,第一区域图像的面积占放大后的广角图像的面积的比例为0.8,广角图像的分辨率为4000×3000,则第一区域图像的分辨率为3200×2400。
电子设备可以根据第一比值和第一预设比值,确定是否将主图像拼接在第一目标区域图像上显示。其中,第一预设比值大于0.5,并且第一预设比值小于0.95。以下以第一预设比值为0.8为例,介绍本申请实施例。
一种可能的设计中,若第一比值大于第一预设比值,电子设备则可以将主图像拼接在第一目标区域图像上显示。若第一比值小于第一预设比值,电子设备则仅显示第一区域图像,不将主图像拼接在第一目标区域图像上显示。示例性的,若第一比值为0.85,电子设备则可以将主图像拼接在第一目标区域图像上显示(例如图6中的(c))。 若第一比值为0.75,电子设备则仅显示第一区域图像(例如图6中的(b))。
需要说明的是,广角图像的缩放倍率越大,第一区域图像的面积占放大后的广角图像的面积的比例越小。第一区域图像的面积占放大后的广角图像的面积的比例越小,第一区域图像的分辨率越小。并且,在电子设备的显示屏的分辨率不变的情况下,若第一区域图像的分辨率越小,则第一比值越大,电子设备将主图像拼接在第一目标区域图像上显示的概率越高。也就是说,在广角图像的缩放倍率较大的情况下,电子设备将主图像拼接在第一目标区域图像上显示的概率越高。
可以理解的是,电子设备通过比较第一比值和第一预设比值,可以控制将主图像拼接在第一目标区域图像上显示的时机。如此,可以使电子设备在预设条件下显示主图像,提高了用户的使用体验。
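方法（a）的判断逻辑可以用下面的Python草稿示意。函数名为示例假设；分辨率一律按总像素数计，第一比值按公式一的算例复原为“显示屏分辨率C除以第一区域图像分辨率A”。

```python
def first_ratio(region_res, screen_res):
    """公式一：N = C / A（C为显示屏分辨率，A为第一区域图像分辨率，均为总像素数）。"""
    return screen_res / region_res

def should_stitch_main(region_res, screen_res, preset_ratio=0.8):
    """第一比值大于第一预设比值（0.5～0.95之间，示例取0.8）时，
    将主图像拼接在第一目标区域图像上显示；否则仅显示第一区域图像。"""
    return first_ratio(region_res, screen_res) > preset_ratio
```

按上文算例：显示屏2000×1000、第一区域图像2500×1000时N=0.8；N大于0.8时拼接，小于0.8时仅显示第一区域图像。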
在一些实施例中,电子设备可以通过方法(b),即电子设备可以根据图像的缩放倍率,控制电子设备将主图像拼接在第一目标区域图像上显示的时机。具体的,电子设备可以获取第一倍率,该第一倍率为第三操作触发电子设备放大广角图像后的缩放倍率(即放大后的广角图像的缩放倍率,例如图像603的缩放倍率)。
电子设备可以通过以下方式获取第一倍率。具体的,电子设备可以获取广角图像的缩放倍率(即未放大的广角图像,例如广角图像308的缩放倍率)、广角图像的分辨率(例如广角图像308的分辨率)和第一区域图像的分辨率(例如图像603的分辨率)。之后,电子设备可以根据广角图像的缩放倍率、广角图像的分辨率和第一区域图像的分辨率,计算得到第一倍率。示例性的,电子设备可以通过公式二计算得到第一倍率。
M=Z×√(A/B)
其中,M用于表示第一倍率,B用于表示广角图像的分辨率,A用于表示第一区域图像的分辨率,Z用于表示广角图像的缩放倍率。
示例性的,假如广角图像的分辨率为4000×3000,第一区域图像的分辨率为3600×2700,广角图像的缩放倍率为0.6,则第一倍率为:
M=0.6×√((3600×2700)/(4000×3000))=0.6×√0.81=0.6×0.9=0.54
也就是说,第一倍率为0.54。
需要说明的是,电子设备也可以根据广角图像的缩放倍率、广角图像中第一方向上的像素和第一区域图像中第一方向上的像素,计算得到第一倍率,该第一方向可以为水平方向或者垂直方向。
电子设备可以根据第一倍率和第一预设缩放倍率，确定是否将主图像拼接在第一目标区域图像上显示。其中，第一预设缩放倍率大于0.7X，并且第一预设缩放倍率小于0.95X。以下以第一预设缩放倍率为0.9X为例，介绍本申请实施例。
一种可能的设计中，若第一倍率大于第一预设缩放倍率，电子设备则可以将主图像拼接在第一目标区域图像上显示。若第一倍率小于第一预设缩放倍率，电子设备则仅显示第一区域图像，不将主图像拼接在第一目标区域图像上显示。示例性的，若第一倍率为0.95X，电子设备则可以将主图像拼接在第一目标区域图像上显示（例如图6中的（c））。若第一倍率为0.75X，电子设备则仅显示第一区域图像（例如图6中的（b））。
可以理解的是,电子设备通过比较第一倍率和第一预设缩放倍率,可以控制将主图像拼接在第一目标区域图像上显示。如此,可以使电子设备在预设条件下显示主图像,提高了用户的使用体验。
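方法（b）的计算与判断可以用下面的Python草稿示意。公式二按上文算例复原为 M=Z×√(A/B)，它等价于按第一方向上的像素数之比计算；函数名为示例假设。

```python
import math

def first_magnification(wide_zoom, wide_res, region_res):
    """公式二（按上文算例复原）：M = Z * sqrt(A / B)。
    B为广角图像分辨率，A为第一区域图像分辨率，Z为广角图像的缩放倍率。"""
    return wide_zoom * math.sqrt(region_res / wide_res)

def first_magnification_1d(wide_zoom, wide_px, region_px):
    """等价形式：按第一方向（水平或垂直）上的像素数计算。"""
    return wide_zoom * region_px / wide_px

def should_stitch_by_zoom(m, preset_zoom=0.9):
    """第一倍率大于第一预设缩放倍率（0.7X～0.95X之间，示例取0.9X）时拼接主图像。"""
    return m > preset_zoom
```

按上文算例：B=4000×3000、A=3600×2700、Z=0.6时，M=0.54，小于预设的0.9X，因此仅显示第一区域图像。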
可选的，也可以在拍摄后将主图像拼接在第一目标区域图像上并保存。保存时可以仅保存拼接后的图像，也可以在保存拼接后的图像的同时保存主图像和广角图像。
在另一些实施例中,响应于第三操作,电子设备可以对第一区域图像和第二图像(例如主图像)进行图像融合,得到第四图像。之后,电子设备可以显示第四图像。
需要说明的是,电子设备可以控制对第一区域图像和主图像进行图像融合的时机。具体可以参考上述实施例中电子设备控制将主图像拼接在第一目标区域图像上显示的时机的说明,此处不予赘述。
可以理解的是,图像融合能够提高图像质量。并且,第一区域图像的取景范围较大,主图像的图像质量较高。电子设备对主图像和第一区域图像进行图像融合,可以综合主图像和第一区域图像的特点,得到取景范围较大和局部图像清晰度较高的第四图像。如此,可以提高用户查看的图像质量,提高了用户的使用体验。
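本申请并未限定具体的融合算法；作为一个刻意简化的假设实现，下面的Python草稿用目标区域内的加权混合来示意“综合主图像和第一区域图像的特点”这一步骤，函数名与参数均为示例假设。

```python
import numpy as np

def fuse_region_with_main(region_img, main_resized, rect, alpha=0.7):
    """极简的融合示意：在第一目标区域 rect=(left, top, right, bottom) 内
    按权重alpha混入质量更高的主图像（已缩放到目标区域尺寸），
    区域外保持第一区域图像不变。真实的融合算法会复杂得多。"""
    left, top, right, bottom = rect
    fused = region_img.astype(float).copy()
    fused[top:bottom, left:right] = (
        alpha * main_resized + (1.0 - alpha) * fused[top:bottom, left:right]
    )
    return fused
```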
需要说明的是,电子设备将主图像拼接在第一目标区域图像上显示时,由于主图像是拼接在第一目标区域图像上的,主图像与第一区域图像的拼接处可能会出现图像扭曲、不平滑等异常现象(例如图6中的(c)所示的拼接区域605)。如此,可能导致用户的使用体验较差。
在一些实施例中，为了避免主图像与第一区域图像的拼接处出现图像扭曲、不平滑等异常现象，电子设备可以对第一区域图像中除第一目标区域图像以外的区域进行模糊处理。示例性的，电子设备可以在第一区域图像中除第一目标区域图像以外的区域的上层添加涂层。例如，如图8所示，涂层801显示在第一区域图像中除第一目标区域图像以外的区域的上层。又例如，电子设备可以通过模糊处理算法对第一区域图像中除第一目标区域图像以外的区域进行模糊处理。本申请实施例对模糊处理算法不作限定。例如，该模糊处理算法可以为高斯模糊。又例如，该模糊处理算法可以为方框模糊。具体对于电子设备通过模糊处理算法对第一区域图像中除第一目标区域图像以外的区域进行模糊处理的方式，可以参考常规技术中电子设备通过模糊处理算法对图像进行模糊处理的方法，此处不予赘述。
可以理解的是,电子设备对第一区域图像中除第一目标区域图像以外的区域进行模糊处理,可以减小主图像与第一区域图像的拼接处的异常程度。如此,可以提高电子设备显示的图像质量,提高了用户的使用体验。
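以上文提到的方框模糊为例，“仅对第一目标区域以外的区域做模糊处理”可以用下面的Python草稿示意：先整体模糊，再把目标区域恢复为原图。仅处理二维灰度图，函数名与矩形表示均为示例假设。

```python
import numpy as np

def box_blur(img, k=3):
    """朴素的方框模糊：每个像素取 k×k 邻域（边缘复制填充）的均值，k为奇数。"""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def blur_outside_target(img, rect, k=3):
    """对第一目标区域 rect=(left, top, right, bottom) 以外的区域做模糊处理。"""
    left, top, right, bottom = rect
    out = box_blur(img, k)
    out[top:bottom, left:right] = img[top:bottom, left:right]  # 目标区域保持清晰
    return out
```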
在另一些实施例中，第一摄像头和第二摄像头可以为组合方式（2），即第一摄像头可以为广角摄像头，第二摄像头为长焦摄像头。电子设备可以启动拍摄应用，显示图像预览界面（即第一界面），该图像预览界面是电子设备拍照的取景界面，该图像预览界面包括广角摄像头采集的第一图像。示例性的，如图9中的（a）所示，在电子设备启动拍摄应用之后，电子设备的拍摄模式为广角拍摄模式。电子设备可以显示图像预览界面901，该图像预览界面901包括预览图像902，该预览图像902为广角摄像头采集的图像。可选的，该预览图像902可以为电子设备对广角图像和长焦图像进行图像处理之后的图像（具体可以参考下述描述，此处不予赘述）。之后，电子设备可以接收用户的拍照操作。响应于用户在图像预览界面的拍照操作，电子设备可以在同一时刻通过广角摄像头采集第一图像（即广角图像），通过长焦摄像头采集第二图像（即长焦图像），广角图像的取景范围大于长焦图像的取景范围。示例性的，图9中的（b）所示的广角图像903为广角摄像头采集的图像，图9中的（c）所示的长焦图像904为长焦摄像头采集的图像，广角图像903的取景范围大于长焦图像904的取景范围。
需要说明的是，上述电子设备在同一时刻通过广角摄像头采集广角图像，通过长焦摄像头采集长焦图像是指：广角摄像头采集广角图像的时刻（如第一时刻）和长焦摄像头采集长焦图像的时刻（如第三时刻）相同；或者，第一时刻与第三时刻之间的时间差较小（例如时间差小于1毫秒）。在第一时刻与第三时刻之间存在时间差时，本申请实施例对广角摄像头采集广角图像、长焦摄像头采集长焦图像的顺序不作限定。
在本申请实施例中,响应于拍照操作,在电子设备通过广角摄像头采集广角图像,通过长焦摄像头采集长焦图像之后,电子设备可以保存该广角图像和长焦图像。其中,电子设备可以以可见形式保存广角图像,以不可见形式保存长焦图像。或者,电子设备可以以可见形式保存广角图像和长焦图像。
需要说明的是,具体对于电子设备保存广角图像和长焦图像的说明,可以参考上述实施例中对电子设备保存广角图像和主图像的介绍,此处不予赘述。
在本申请实施例中,电子设备可以接收第二操作,该第二操作用于触发电子设备显示广角图像。响应于第二操作,电子设备显示广角图像。之后,电子设备可以接收第三操作,该第三操作用于触发电子设备显示放大后的广角图像中的第一区域图像。其中,第一区域图像包括第一目标区域图像,第一目标区域图像为广角图像中与长焦图像的取景范围相同的区域。示例性的,如图10中的(a)所示,电子设备显示的图像查看界面1001包括广角图像903。电子设备可以接收用户在图像查看界面1001的第三操作(如扩张操作),触发电子设备显示如图10中的(b)所示的图像1002,该图像1002为放大后的广角图像903中的区域,图像1002的取景范围与广角图像903中的区域1003的取景范围相同。也就是说,图像1002为放大后的区域1003。图像1002包括第一目标区域图像1004,该第一目标区域图像1004的取景范围与长焦图像(例如图9中的(c)所示的长焦图像904)的取景范围相同。
需要说明的是,在电子设备显示放大后的广角图像中的第一区域图像的情况下,由于第一区域图像的分辨率较低,电子设备显示的图像的质量较差,导致用户无法查看高质量的图像。
在本申请实施例中，为了提高电子设备显示的图像质量，响应于第三操作，电子设备可以将长焦图像拼接在第一目标区域图像上显示。也就是说，响应于第三操作，电子设备可以显示第一区域图像中除第一目标区域图像以外的区域和长焦图像。示例性的，如图10中的（c）所示，电子设备可以将长焦图像904拼接在图像1002的第一目标区域图像1004上显示。
一种可能的实现方式，电子设备中保存有校准信息，该校准信息包括：广角摄像头的视场角与长焦摄像头的视场角之间的关系、长焦图像的取景范围与广角图像取景范围之间的对应关系。响应于第三操作，电子设备可以根据校准信息，在第一目标区域图像上绘制并显示该长焦图像。
需要说明的是,具体对于电子设备根据校准信息在第一目标区域图像绘制并显示长焦图像过程的说明,可以参考上述实施例中对于电子设备根据校准信息在第一目标区域图像绘制并显示主图像的介绍,此处不予赘述。
可以理解的是,电子设备在接收到第三操作之后,可以显示第一区域图像中除第一目标区域图像以外的区域和长焦图像。如此,电子设备可以在一张图像中同时具备广角图像的特点和长焦图像的特点,进而可以保障用户同时查看广角图像和长焦图像。并且,长焦图像的图像质量高于第一目标区域图像的图像质量,长焦图像的取景范围与第一目标区域图像的取景范围相同。因此,可以提高用户查看的图像质量,提高了用户的使用体验。
需要说明的是,为了进一步提高用户的可操作性,电子设备可以控制电子设备将长焦图像拼接在第一目标区域图像上显示的时机。具体对于电子设备控制电子设备将长焦图像拼接在第一目标区域图像上显示时机过程的说明,可以参考上述实施例中电子设备通过方法(a)和方法(b)控制电子设备将主图像拼接在第一目标区域图像上显示时机的描述,此处不予赘述。
需要说明的是,当电子设备参考上述方法(a)控制将长焦图像拼接在第一目标区域图像上显示的时机时,第一预设比值大于0.25,并且第一预设比值小于0.9025。例如,第一预设比值可以为0.64。当电子设备参考上述方法(b)控制将长焦图像拼接在第一目标区域图像上显示的时机时,第一预设缩放倍率大于2.45X,并且第一预设缩放倍率小于3.15X。例如,该第一预设缩放倍率可以为3X。
可以理解的是,电子设备可以控制将长焦图像拼接在第一目标区域图像上显示的时机。如此,可以使电子设备在预设条件下显示长焦图像,提高了用户的使用体验。
需要说明的是,电子设备将长焦图像拼接在第一目标区域图像上显示时,由于长焦图像是拼接在第一目标区域图像上的,长焦图像与第一区域图像的拼接处可能会出现图像扭曲、不平滑等异常现象(例如图10中的(c)所示的拼接区域1005)。如此,可能导致用户的使用体验较差。
在一些实施例中，为了避免长焦图像与第一区域图像的拼接处出现图像扭曲、不平滑等异常现象，电子设备可以对第一区域图像中除第一目标区域图像以外的区域进行模糊处理。具体对于电子设备对第一区域图像中除第一目标区域图像以外的区域进行模糊处理的方式，可以参考上述实施例中电子设备对第一区域图像中除第一目标区域图像以外的区域进行模糊处理的方法，此处不予赘述。
可以理解的是,电子设备对第一区域图像中除第一目标区域图像以外的区域进行模糊处理,可以减小长焦图像与第一区域图像的拼接处的异常程度。如此,可以改善电子设备显示的图像质量,提高了用户的使用体验。
在一些实施例中，第一摄像头和第二摄像头可以为组合方式（3），即第一摄像头可以为主摄像头，第二摄像头可以为长焦摄像头。电子设备可以启动拍摄应用，显示图像预览界面（即第一界面），该图像预览界面是电子设备拍照的取景界面，该图像预览界面包括主摄像头采集的第一图像。示例性的，如图11中的（a）所示，在电子设备启动拍摄应用之后，电子设备的拍摄模式为普通拍摄模式。电子设备可以显示图像预览界面1101，该图像预览界面1101包括预览图像1102，该预览图像1102为主摄像头采集的图像。可选的，该预览图像1102可以为电子设备对主图像和长焦图像进行图像处理之后的图像（具体可以参考下述描述，此处不予赘述）。之后，电子设备可以接收用户的拍照操作。响应于用户在图像预览界面的拍照操作，电子设备可以在同一时刻通过主摄像头采集第一图像（即主图像），通过长焦摄像头采集第二图像（即长焦图像），主图像的取景范围大于长焦图像的取景范围。示例性的，图11中的（b）所示的主图像1103为主摄像头采集的图像，图11中的（c）所示的长焦图像1104为长焦摄像头采集的图像，主图像1103的取景范围大于长焦图像1104的取景范围。
需要说明的是，上述电子设备在同一时刻通过主摄像头采集主图像，通过长焦摄像头采集长焦图像是指：主摄像头采集主图像的时刻（如第二时刻）和长焦摄像头采集长焦图像的时刻（如第三时刻）相同；或者，第二时刻与第三时刻之间的时间差较小（例如时间差小于1毫秒）。在第二时刻与第三时刻之间存在时间差时，本申请实施例对主摄像头采集主图像、长焦摄像头采集长焦图像的顺序不作限定。
在本申请实施例中,响应于拍照操作,在电子设备通过主摄像头采集主图像,通过长焦摄像头采集长焦图像之后,电子设备可以保存该主图像和长焦图像。其中,电子设备可以以可见形式保存主图像,以不可见形式保存长焦图像。或者,电子设备可以以可见形式保存主图像和长焦图像。
需要说明的是,具体对于电子设备保存主图像和长焦图像的说明,可以参考上述实施例中对电子设备保存广角图像和主图像的介绍,此处不予赘述。
在本申请实施例中，电子设备可以接收第二操作，该第二操作用于触发电子设备显示主图像。响应于第二操作，电子设备显示主图像。之后，电子设备可以接收第三操作，该第三操作用于触发电子设备显示放大后的主图像中的第一区域图像。其中，第一区域图像包括第一目标区域图像，第一目标区域图像为主图像中与长焦图像的取景范围相同的区域。示例性的，如图12中的（a）所示，电子设备显示的图像查看界面1201包括主图像1103。电子设备可以接收用户在图像查看界面1201的第三操作（如扩张操作），触发电子设备显示如图12中的（b）所示的图像1202，该图像1202为放大后的主图像1103中的区域，图像1202的取景范围与主图像1103中的区域1203的取景范围相同。也就是说，图像1202为放大后的区域1203。图像1202包括第一目标区域图像1204，该第一目标区域图像1204的取景范围与长焦图像（例如图11中的（c）所示的长焦图像1104）的取景范围相同。可选的，响应于第二操作，电子设备可以将主图像拼接在广角图像中的第二目标区域图像上，并将长焦图像拼接在主图像中与长焦图像取景范围相同的区域上。
需要说明的是,在电子设备显示放大后的主图像中的第一区域图像的情况下,由于第一区域图像的分辨率较低,电子设备显示的图像的质量较差,导致用户无法查看高质量的图像。
在本申请实施例中,为了提高电子设备显示的图像质量,响应于第三操作,电子设备可以将长焦图像拼接在第一目标区域图像上显示。也就是说,响应于第三操作,电子设备可以显示第一区域图像中除第一目标区域图像以外的区域和长焦图像。示例性的,如图12中的(c)所示,电子设备可以将长焦图像1104拼接在图像1202的第一目标区域图像1204上显示。
可以理解的是,电子设备在接收到第三操作之后,可以显示第一区域图像中除第一目标区域图像以外的区域和长焦图像。如此,电子设备可以在一张图像中同时具备主图像的特点和长焦图像的特点,进而可以保障用户同时查看主图像和长焦图像。并且,长焦图像的图像质量高于第一目标区域图像的图像质量,长焦图像的取景范围与第一目标区域图像的取景范围相同。因此,可以提高用户查看的图像质量,提高了用户的使用体验。
需要说明的是,为了进一步提高用户的可操作性,电子设备可以控制电子设备将长焦图像拼接在第一目标区域图像上显示的时机。具体对于电子设备控制电子设备将长焦图像拼接在第一目标区域图像上显示时机过程的说明,可以参考上述实施例中电子设备通过方法(a)和方法(b)控制电子设备将主图像拼接在第一目标区域图像上显示时机的描述,此处不予赘述。
需要说明的是,当电子设备参考上述方法(a)控制将长焦图像拼接在第一目标区域图像上显示的时机时,第一预设比值大于0.5,并且第一预设比值小于0.95。例如,第一预设比值可以为0.8。当电子设备参考上述方法(b)控制将长焦图像拼接在第一目标区域图像上显示的时机时,第一预设缩放倍率大于2.45X,并且第一预设缩放倍率小于3.15X。例如,该第一预设缩放倍率可以为3X。
可以理解的是,电子设备可以控制将长焦图像拼接在第一目标区域图像上显示的时机。如此,可以使电子设备在预设条件下显示长焦图像,提高了用户的使用体验。
在一些实施例中，为了避免长焦图像与第一区域图像的拼接处出现图像扭曲、不平滑等异常现象（例如图12中的（c）所示的拼接区域1205），电子设备可以对第一区域图像中除第一目标区域图像以外的区域进行模糊处理。具体对于电子设备对第一区域图像中除第一目标区域图像以外的区域进行模糊处理的方式，可以参考上述实施例中电子设备对第一区域图像中除第一目标区域图像以外的区域进行模糊处理的方法，此处不予赘述。
可以理解的是,电子设备对第一区域图像中除第一目标区域图像以外的区域进行模糊处理,可以减小长焦图像与第一区域图像的拼接处的异常程度。如此,可以提高电子设备显示的图像质量,提高了用户的使用体验。
在一些实施例中，在第一摄像头和第二摄像头为组合方式（1）的情况下，即第一摄像头可以为广角摄像头，第二摄像头可以为主摄像头的情况下，电子设备还可以包括长焦摄像头（即第三摄像头）。其中，广角摄像头的视场角大于主摄像头的视场角，主摄像头的视场角大于长焦摄像头的视场角。电子设备可以启动拍摄应用，显示图像预览界面（即第一界面），该图像预览界面是电子设备拍照的取景界面，该图像预览界面包括广角摄像头采集的第一图像。示例性的，如图13中的（a）所示，在电子设备启动拍摄应用之后，电子设备的拍摄模式为广角拍摄模式。电子设备可以显示图像预览界面1301，该图像预览界面1301包括预览图像1302，该预览图像1302为广角摄像头采集的图像。可选的，该预览图像1302可以为电子设备对广角图像、主图像和长焦图像进行图像处理之后的图像（具体可以参考下述描述，此处不予赘述）。
电子设备可以接收用户的拍照操作。响应于用户在图像预览界面的拍照操作,电子设备可以在同一时刻通过广角摄像头采集第一图像(即广角图像),通过主摄像头采集第二图像(即主图像),通过长焦摄像头采集第三图像(即长焦图像),长焦摄像头采集长焦图像的取景范围为第四取景范围(即长焦图像的取景范围)。广角图像的取景范围大于长焦图像的取景范围,主图像的取景范围大于长焦图像的取景范围。示例性的,图13中的(b)所示的广角图像1303为广角摄像头采集的图像,图13中的(c)所示的主图像1304为主摄像头采集的图像,图13中的(d)所示的长焦图像1305为长焦摄像头采集的图像,广角图像1303的取景范围大于主图像1304的取景范围,主图像1304的取景范围大于长焦图像1305的取景范围。
需要说明的是,上述电子设备在同一时刻通过广角摄像头采集广角图像,通过主摄像头采集主图像,通过长焦摄像头采集长焦图像是指:广角摄像头采集广角图像的时刻(如第一时刻)、主摄像头采集主图像的时刻(如第二时刻)和长焦摄像头采集长焦图像的时刻(如第三时刻)相同;或者,第一时刻与第二时刻之间的时间差、第一时刻与第三时刻之间的时间差、第二时刻与第三时刻之间的时间差均较小(例如时间差均小于1毫秒)。在第一时刻、第二时刻与第三时刻之间存在时间差时,本申请实施例对广角摄像头采集广角图像、主摄像头采集主图像和长焦摄像头采集长焦图像的顺序不作限定。
在本申请实施例中,响应于拍照操作,在电子设备通过广角摄像头采集广角图像,通过主摄像头采集主图像,通过长焦摄像头采集长焦图像之后,电子设备可以保存该广角图像、主图像和长焦图像。其中,电子设备可以以可见形式保存广角图像,以不可见形式保存主图像和长焦图像。或者,电子设备可以以可见形式保存广角图像、主图像和长焦图像。
需要说明的是,具体对于电子设备保存广角图像、主图像和长焦图像的说明,可以参考上述实施例中对电子设备保存广角图像和主图像的介绍,此处不予赘述。
在本申请实施例中,电子设备可以接收第二操作,该第二操作用于触发电子设备显示广角图像。响应于第二操作,电子设备显示广角图像。之后,电子设备可以接收第三操作,该第三操作用于触发电子设备显示放大后的广角图像中的第一区域图像。响应于第三操作,电子设备可以将主图像拼接在第一目标区域图像上显示。
需要说明的是,具体对于响应于第三操作,电子设备将主图像拼接在第一目标区域图像上显示过程的说明,可以参考上述实施例,此处不予赘述。
在一些实施例中,在电子设备将主图像拼接在第一目标区域图像上显示之后,电子设备可以接收第四操作,该第四操作用于触发电子设备显示放大后的主图像中的第二区域图像,该第二区域图像包括第三目标区域图像,该第三目标区域图像相对于第二摄像头的取景范围是第五取景范围,第二取景范围包括第五取景范围,第五取景范围与第四取景范围重合。例如,该第五取景范围为图14中的(b)所示的第三目标区域图像1405的取景范围。
示例性的,结合图6中的(c)所示,电子设备将主图像309拼接在图像603的第一目标区域图像604上显示之后,如图14中的(a)所示,电子设备可以显示图像查看界面1401,该图像查看界面1401包括图像1402(即图6中的(c)所示的主图像309拼接在第一目标区域图像604后的图像)。电子设备可以接收用户在图像查看界面1401的第四操作(如扩张操作),触发电子设备显示如图14中的(b)所示的图像1404,该图像1404为放大后的图像1402中的区域,图像1404的取景范围与图像1402中的区域图像1403的取景范围相同。也就是说,图像1404为放大后的区域图像1403。图像1404包括第三目标区域图像1405,该第三目标区域图像1405的取景范围与长焦图像(例如图13中的(d)所示的长焦图像1305)的取景范围相同。
需要说明的是,在电子设备显示放大后的图像中的第二区域图像的情况下,由于第二区域图像的分辨率较低,电子设备显示的图像的质量较差,导致用户无法查看高质量的图像。
在本申请实施例中,为了提高电子设备显示的图像质量,响应于第四操作,电子设备可以将长焦图像拼接在第三目标区域图像上显示。也就是说,响应于第四操作,电子设备可以显示第二区域图像中除第三目标区域图像以外的区域和长焦图像。示例性的,如图14中的(c)所示,电子设备可以将长焦图像1305拼接在图像1404的第三目标区域图像1405上显示。
一种可能的实现方式，电子设备中保存有校准信息，该校准信息包括：主摄像头的视场角与长焦摄像头的视场角之间的关系、主图像的取景范围与长焦图像取景范围之间的对应关系。响应于第四操作，电子设备可以根据校准信息，在第三目标区域图像上绘制并显示该长焦图像。
需要说明的是,具体对于电子设备根据校准信息,在第三目标区域图像绘制并显示长焦图像过程的介绍,可以参考上述实施例,此处不予赘述。
需要说明的是,为了进一步提高用户的可操作性,电子设备可以控制电子设备将长焦图像拼接在第三目标区域图像上显示的时机。具体的,电子设备可以通过图像的分辨率和显示屏的分辨率,控制电子设备将长焦图像拼接在第三目标区域图像上显示的时机。或者,电子设备可以根据图像的缩放倍率,控制电子设备将长焦图像拼接在第三目标区域图像上显示的时机。
在一些实施例中,电子设备可以获取第二区域图像的分辨率和显示屏的分辨率。之后,电子设备可以根据第二区域图像的分辨率和显示屏的分辨率,计算得到第二比值,该第二比值为第二区域图像的分辨率与显示屏的分辨率之间的比值。具体对于电子设备计算得到第二比值过程的介绍,可以参考上述公式一,此处不予赘述。
电子设备可以根据第二比值和第一预设比值，确定是否将长焦图像拼接在第三目标区域图像上显示。其中，第一预设比值大于0.5，并且第一预设比值小于0.95。以下以第一预设比值为0.8为例，介绍本申请实施例。
一种可能的设计中，若第二比值大于第一预设比值，电子设备则可以将长焦图像拼接在第三目标区域图像上显示。若第二比值小于第一预设比值，电子设备则仅显示第二区域图像，不将长焦图像拼接在第三目标区域图像上显示。示例性的，若第二比值为0.85，电子设备则可以将长焦图像拼接在第三目标区域图像上显示（例如图14中的（c））。若第二比值为0.75，电子设备则仅显示第二区域图像（例如图14中的（b））。
可以理解的是,电子设备通过比较第二比值和第一预设比值,可以控制将长焦图像拼接在第三目标区域图像上显示。如此,可以使电子设备在预设条件下显示长焦图像,提高了用户的使用体验。
在另一些实施例中,电子设备可以获取第二倍率,该第二倍率为第四操作触发电子设备放大主图像后的缩放倍率(即放大后的主图像的缩放倍率,例如图像1404的缩放倍率)。示例性的,电子设备可以通过以下方式获取第二倍率。具体的,电子设备可以获取主图像的缩放倍率(即未放大的主图像,例如主图像1304的缩放倍率)、主图像的分辨率(例如主图像1304的分辨率)和第二区域图像的分辨率(例如图像1404的分辨率)。之后,电子设备可以根据主图像的缩放倍率、主图像的分辨率和第二区域图像的分辨率,计算得到第二倍率。具体对于电子设备计算得到第二倍率的方法,可以参考上述公式二,此处不予赘述。
电子设备可以根据第二倍率和第二预设缩放倍率，确定是否将长焦图像拼接在第三目标区域图像上显示。其中，第二预设缩放倍率大于2.45X，并且第二预设缩放倍率小于3.15X。以下以第二预设缩放倍率为3X为例，介绍本申请实施例。
一种可能的设计中，若第二倍率大于第二预设缩放倍率，电子设备则可以将长焦图像拼接在第三目标区域图像上显示。若第二倍率小于第二预设缩放倍率，电子设备则仅显示第二区域图像，不将长焦图像拼接在第三目标区域图像上显示。示例性的，若第二倍率为3.5X，电子设备则可以将长焦图像拼接在第三目标区域图像上显示（例如图14中的（c））。若第二倍率为2X，电子设备则仅显示第二区域图像（例如图14中的（b））。
可以理解的是,电子设备可以控制将长焦图像拼接在第三目标区域图像上显示的时机。如此,可以使电子设备在预设条件下显示长焦图像,提高了用户的使用体验。
在另一些实施例中,在电子设备得到第四图像之后,电子设备可以接收第五操作,该第五操作用于触发电子设备显示放大后的第四图像中的第三区域图像,该第三区域图像包括第三目标区域图像。响应于第五操作,电子设备可以对第三区域图像和第三图像(即长焦图像)进行图像融合,得到第五图像。之后,电子设备可以显示第五图像。
需要说明的是,电子设备可以控制对第三区域图像和长焦图像进行图像融合的时机。具体可以参考上述实施例中电子设备控制将长焦图像拼接在第一目标区域图像上显示的时机的说明,此处不予赘述。
可以理解的是,图像融合能够提高图像质量。并且,第三区域图像的取景范围较大,长焦图像的图像质量较高。电子设备对长焦图像和第三区域图像进行图像融合,可以综合长焦图像和第三区域图像的特点,得到取景范围较大和局部图像清晰度较高的第五图像。如此,可以提高用户查看的图像质量,提高了用户的使用体验。
需要说明的是，电子设备将长焦图像拼接在第三目标区域图像上显示时，由于长焦图像是拼接在第三目标区域图像上的，长焦图像与第二区域图像的拼接处可能会出现图像扭曲、不平滑等异常现象（例如图14中的（c）所示的拼接区域1406）。如此，可能导致用户的使用体验较差。
在一些实施例中，为了避免长焦图像与第二区域图像的拼接处出现图像扭曲、不平滑等异常现象，电子设备可以对第二区域图像中除第三目标区域图像以外的区域进行模糊处理。具体对于电子设备对第二区域图像中除第三目标区域图像以外的区域进行模糊处理的方式，可以参考上述实施例中电子设备对第一区域图像中除第一目标区域图像以外的区域进行模糊处理的方法，此处不予赘述。
可以理解的是,电子设备对第二区域图像中除第三目标区域图像以外的区域进行模糊处理,可以减小长焦图像与第二区域图像的拼接处的异常程度。如此,可以改善电子设备显示的图像质量,提高了用户的使用体验。
需要说明的是,电子设备中存储的图像类型较多。其中,电子设备可以通过常规的显示方式显示常规类型的图像。例如,电子设备可以在屏幕显示长焦图像、主图像等。电子设备也可以通过预设模式显示上述拼接后的图像(例如图6中的(c)所示的图像和图10中的(c)所示的图像)。
在一些实施例中,为了便于电子设备确定图像的显示方式,电子设备在保存上述第一图像、第二图像(或者第三图像)时,可以在第一图像、第二图像的图像信息中添加第一标识,该第一标识用于指示电子设备按照预设模式显示拼接后的图像。响应于第三操作(或者第四操作),电子设备可以根据图像的图像信息,确定是否以预设模式显示图像。示例性的,电子设备在接收到用户的第三操作之后,可以检测图像的图像信息是否存在第一标识。若图像信息存在第一标识,电子设备可以按照预设模式显示拼接后的图像。若图像信息不存在第一标识,电子设备可以按照常规的显示方式显示该图像。
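按第一标识决定显示方式的检查逻辑可以示意如下（Python草稿；其中键名 first_flag 与返回值均为示例假设，实际的图像信息格式由设备实现决定）：

```python
FIRST_FLAG = "first_flag"  # 假设的“第一标识”键名

def choose_display_mode(image_info):
    """图像信息中存在第一标识时按预设模式显示拼接后的图像，否则按常规方式显示。"""
    return "preset" if image_info.get(FIRST_FLAG) else "normal"
```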
可以理解的是,电子设备为图像添加第一标识,可以使电子设备按照预设模式显示拼接后的图像。也就是说,电子设备可以在一张图像中同时具备广角图像的特点、主图像的特点和长焦图像的特点,进而可以保障用户同时查看广角图像、主图像和长焦图像。如此,可以提高用户查看的图像质量,提高了用户的使用体验。
需要说明的是,在电子设备拍摄得到上述第一图像、第二图像和第三图像之后,电子设备可以向其他的电子设备分享该第一图像、第二图像和第三图像。具体的,电子设备可以通过传输一个数据包的形式,向其他的电子设备分享第一图像、第二图像和第三图像。或者,电子设备可以通过分别传输第一图像、第二图像和第三图像的形式,向其他的电子设备分享第一图像、第二图像和第三图像。
在一些实施例中,电子设备可以通过传输一个数据包的形式,向其他的电子设备分享第一图像、第二图像和第三图像。具体的,电子设备可以向其他的电子设备(可以称为接收端设备)发送数据包,该数据包包括第一图像、第二图像和第三图像。之后,接收端设备可以接收该数据包,并保存第一图像、第二图像和第三图像。示例性的,接收端设备可以以可见形式保存第一图像,以不可见形式保存第二图像和第三图像(例如图4A)。若该接收端设备可以识别第一标识,接收端设备则可以显示拼接后的图像。可选的,若该接收端设备可以识别第一标识,接收端设备则可以以可见形式保存上述第一图像、第二图像和第三图像(例如图4B)。若接收端设备不可以识别第一标识,接收端设备则仅显示第一图像。
在一些实施例中,电子设备可以通过分别传输第一图像、第二图像和第三图像的形式,向其他的电子设备分享第一图像、第二图像和第三图像。具体的,电子设备可以向接收端设备发送第一图像、第二图像和第三图像。接收端设备接收到第一图像、第二图像和第三图像之后,可以以可见形式保存第一图像、第二图像和第三图像。并且,若该接收端设备可以识别第一标识,接收端设备则可以显示拼接后的图像。可选的,若该接收端设备可以识别第一标识,接收端设备则可以以可见形式保存上述第一图像,以不可见形式保存第二图像和第三图像。若接收端设备不可以识别第一标识,接收端设备则仅显示第一图像。
可以理解的是,在接收端设备可以识别第一标识的情况下,接收端设备也可以显示拼接后的图像。也就是说,接收端设备可以在一张图像中同时具备广角图像的特点、主图像的特点和长焦图像的特点,进而可以保障用户同时查看广角图像、主图像和长焦图像。如此,可以提高用户查看的图像质量,提高了用户的使用体验。
上述主要从电子设备的角度对本申请实施例提供的方案进行了介绍。可以理解的是,电子设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本申请所公开的实施例描述的各示例的一种图像的显示方法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是电子设备软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据上述方法示例对图像的显示装置进行功能模块或者功能单元的划分,例如,可以对应各个功能划分各个功能模块或者功能单元,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块或者功能单元的形式实现。其中,本申请实施例中对模块或者单元的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
本申请另一些实施例提供了一种电子设备(如图2A所示的手机200),该电子设备中安装有多个预设应用。该电子设备可以包括:存储器和一个或多个处理器。该存储器和处理器耦合。该电子设备还可以包括摄像头。或者,该电子设备可以外接摄像头。该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令。当处理器执行计算机指令时,电子设备可执行上述方法实施例中手机执行的各个功能或者步骤。该电子设备的结构可以参考图2A所示的手机200的结构。
本申请实施例还提供一种芯片系统,如图15所示,该芯片系统包括至少一个处理器1501和至少一个接口电路1502。处理器1501和接口电路1502可通过线路互联。例如,接口电路1502可用于从其它装置(例如电子设备的存储器)接收信号。又例如,接口电路1502可用于向其它装置(例如处理器1501)发送信号。示例性的,接口电路1502可读取存储器中存储的指令,并将该指令发送给处理器1501。当所述指令被处理器1501执行时,可使得电子设备(如图2A所示的手机200)执行上述实施例中的各个步骤。当然,该芯片系统还可以包含其他分立器件,本申请实施例对此不作具体限定。
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质包括计算机指令,当所述计算机指令在上述电子设备(如图2A所示的手机200)上运行时,使得该电子设备执行上述方法实施例中手机执行的各个功能或者步骤。
本申请实施例还提供一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得所述计算机执行上述方法实施例中手机执行的各个功能或者步骤。
通过以上实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (13)

  1. 一种图像的显示方法,其特征在于,应用于电子设备,所述电子设备包括显示屏、第一摄像头和第二摄像头,所述第一摄像头的视场角大于所述第二摄像头的视场角;所述方法包括:
    所述电子设备接收用户在第一界面的第一操作,所述第一界面是所述电子设备拍照的取景界面,所述第一界面包括所述第一摄像头采集的预览图像;
    响应于所述第一操作,所述电子设备保存所述第一摄像头采集的第一图像,并保存所述第二摄像头采集的第二图像,所述第一摄像头采集所述第一图像的取景范围是第一取景范围,所述第二摄像头采集所述第二图像的取景范围是第二取景范围,所述第一取景范围大于所述第二取景范围;
    所述电子设备接收第二操作,所述第二操作用于触发所述电子设备显示所述第一图像;
    响应于所述第二操作,所述电子设备显示所述第一图像;
    所述电子设备接收第三操作,所述第三操作用于触发所述电子设备显示放大后的所述第一图像中的第一区域图像,所述第一区域图像包括第一目标区域图像,所述第一目标区域图像相对于所述第一摄像头的取景范围是第三取景范围,所述第一取景范围包括所述第三取景范围,所述第三取景范围与所述第二取景范围重合;
    响应于所述第三操作,所述电子设备将所述第二图像拼接在所述第一目标区域图像上显示。
  2. 根据权利要求1所述的方法,其特征在于,在所述电子设备接收第二操作之后,所述方法还包括:
    响应于所述第二操作,所述电子设备将所述第二图像拼接在所述第一图像中第二目标区域图像上显示,所述第二目标区域图像相对于所述第一摄像头的取景范围与所述第二取景范围重合。
  3. 根据权利要求1或2所述的方法,其特征在于,在所述电子设备接收第三操作之后,所述方法还包括:
    所述电子设备获取所述第一区域图像的分辨率和所述显示屏的分辨率;
    所述电子设备根据所述第一区域图像的分辨率和所述显示屏的分辨率,计算得到第一比值,所述第一比值为所述第一区域图像的分辨率与所述显示屏的分辨率之间的比值;
    所述电子设备将所述第二图像拼接在所述第一目标区域图像上显示,包括:
    若所述第一比值大于第一预设比值,所述电子设备则将所述第二图像拼接在所述第一目标区域图像上显示。
  4. 根据权利要求1-3中任一项所述的方法,其特征在于,在所述电子设备接收第三操作之后,所述方法还包括:
    所述电子设备获取第一倍率,所述第一倍率为所述第三操作触发所述电子设备放大所述第一图像后的缩放倍率;
    所述电子设备将所述第二图像拼接在所述第一目标区域图像上显示,包括:
    若所述第一倍率大于第一预设缩放倍率，所述电子设备则将所述第二图像拼接在所述第一目标区域图像上显示。
  5. 根据权利要求4所述的方法,其特征在于,所述电子设备获取第一倍率,包括:
    所述电子设备获取所述第一图像的缩放倍率、所述第一图像的分辨率和所述第一区域图像的分辨率;
    所述电子设备根据所述第一图像的缩放倍率、所述第一图像的分辨率和所述第一区域图像的分辨率,计算得到所述第一倍率。
  6. 根据权利要求5所述的方法,其特征在于,所述第一倍率满足下述公式:
    M=Z×√(A/B)
    其中,M用于表示所述第一倍率,B用于表示所述第一图像的分辨率,A用于表示所述第一区域图像的分辨率,Z用于表示所述第一图像的缩放倍率。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,所述方法还包括:
    响应于所述第三操作,所述电子设备对所述第一区域图像中除所述第一目标区域图像以外的区域进行模糊处理。
  8. 根据权利要求1-7中任一项所述的方法,其特征在于,所述电子设备还包括第三摄像头,所述第二摄像头的视场角大于所述第三摄像头的视场角,所述方法还包括:
    响应于所述拍照操作,所述电子设备保存通过所述第三摄像头采集的第三图像,所述第三摄像头采集所述第三图像的取景范围是第四取景范围,所述第二取景范围大于所述第四取景范围。
  9. 根据权利要求8所述的方法,其特征在于,所述方法还包括:
    所述电子设备接收第四操作,所述第四操作用于触发所述电子设备显示放大后的所述第二图像中的第二区域图像,所述第二区域图像包括第三目标区域图像,所述第三目标区域图像相对于所述第二摄像头的取景范围是第五取景范围,所述第二取景范围包括所述第五取景范围,所述第五取景范围与所述第四取景范围重合;
    所述电子设备获取所述第二区域图像的分辨率和所述显示屏的分辨率;
    所述电子设备根据所述第二区域图像的分辨率和所述显示屏的分辨率,计算得到第二比值,所述第二比值为所述第二区域图像的分辨率与所述显示屏的分辨率之间的比值;
    若所述第二比值大于第一预设比值,所述电子设备则将所述第三图像拼接在所述第三目标区域图像上显示。
  10. 根据权利要求8所述的方法,其特征在于,所述方法还包括:
    所述电子设备接收第四操作,所述第四操作用于触发所述电子设备显示放大后的所述第二图像中的第二区域图像,所述第二区域图像包括第三目标区域图像,所述第三目标区域图像相对于所述第二摄像头的取景范围是第五取景范围,所述第二取景范围包括所述第五取景范围,所述第五取景范围与所述第四取景范围重合;
    所述电子设备获取第二倍率,所述第二倍率为所述第四操作触发所述电子设备放大所述第二图像后的缩放倍率;
    若所述第二倍率大于第二预设缩放倍率,所述电子设备则将所述第三图像拼接在所述第三目标区域图像上显示。
  11. 一种电子设备,其特征在于,所述电子设备包括:存储器、显示屏和一个或多个处理器;所述存储器、所述显示屏与所述处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,当所述计算机指令被所述一个或多个处理器执行时,使得所述电子设备执行如权利要求1-10中任一项所述的方法。
  12. 一种计算机可读存储介质,其特征在于,包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求1-10中任一项所述的方法。
  13. 一种计算机程序产品,其特征在于,所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求1-10中任一项所述的方法。
PCT/CN2022/079142 2021-05-10 2022-03-03 一种图像的显示方法及电子设备 WO2022237287A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP22777935.2A EP4117276B1 (en) 2021-05-10 2022-03-03 Image display method and electronic device
US17/919,579 US20240214669A1 (en) 2021-05-10 2022-03-03 Image display method and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110506826.5A CN113364976B (zh) 2021-05-10 2021-05-10 一种图像的显示方法及电子设备
CN202110506826.5 2021-05-10

Publications (1)

Publication Number Publication Date
WO2022237287A1 true WO2022237287A1 (zh) 2022-11-17

Family

ID=77526204

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/079142 WO2022237287A1 (zh) 2021-05-10 2022-03-03 一种图像的显示方法及电子设备

Country Status (4)

Country Link
US (1) US20240214669A1 (zh)
EP (1) EP4117276B1 (zh)
CN (1) CN113364976B (zh)
WO (1) WO2022237287A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113364976B (zh) * 2021-05-10 2022-07-15 荣耀终端有限公司 一种图像的显示方法及电子设备
TWI812003B (zh) * 2022-02-10 2023-08-11 宏正自動科技股份有限公司 影像預覽方法及預覽系統
CN116051368B (zh) * 2022-06-29 2023-10-20 荣耀终端有限公司 图像处理方法及其相关设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180007281A1 (en) * 2014-01-17 2018-01-04 Samsung Electronics Co., Ltd. Method and apparatus for compositing image by using multiple focal lengths for zooming image
CN108965742A (zh) * 2018-08-14 2018-12-07 京东方科技集团股份有限公司 异形屏显示方法、装置、电子设备及计算机可读存储介质
CN110941375A (zh) * 2019-11-26 2020-03-31 腾讯科技(深圳)有限公司 对图像进行局部放大的方法、装置及存储介质
CN111294517A (zh) * 2020-03-03 2020-06-16 华为技术有限公司 一种图像处理的方法及移动终端
CN111541845A (zh) * 2020-04-30 2020-08-14 维沃移动通信(杭州)有限公司 图像处理方法、装置及电子设备
CN113364976A (zh) * 2021-05-10 2021-09-07 荣耀终端有限公司 一种图像的显示方法及电子设备

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009141951A1 (ja) * 2008-05-19 2009-11-26 パナソニック株式会社 映像撮影装置および映像符号化装置
CN106791400B (zh) * 2016-12-23 2019-08-20 维沃移动通信有限公司 一种图像显示方法及移动终端
KR102344104B1 (ko) * 2017-08-22 2021-12-28 삼성전자주식회사 이미지의 표시 효과를 제어할 수 있는 전자 장치 및 영상 표시 방법
KR102418852B1 (ko) * 2018-02-14 2022-07-11 삼성전자주식회사 이미지 표시를 제어하는 전자 장치 및 방법
CN114915726A (zh) * 2019-11-15 2022-08-16 华为技术有限公司 一种拍摄方法及电子设备
CN112598571B (zh) * 2019-11-27 2021-10-08 中兴通讯股份有限公司 一种图像缩放方法、装置、终端及存储介质
CN112637481B (zh) * 2020-11-25 2022-03-29 华为技术有限公司 图像缩放方法和装置


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4117276A4

Also Published As

Publication number Publication date
EP4117276A4 (en) 2023-10-25
EP4117276A1 (en) 2023-01-11
CN113364976B (zh) 2022-07-15
CN113364976A (zh) 2021-09-07
US20240214669A1 (en) 2024-06-27
EP4117276B1 (en) 2024-07-10


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 17919579

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2022777935

Country of ref document: EP

Effective date: 20221007

NENP Non-entry into the national phase

Ref country code: DE