WO2022252780A1 - Photographing method and electronic device - Google Patents

Photographing method and electronic device

Info

Publication number
WO2022252780A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
zoom
electronic device
image
fov
Prior art date
Application number
PCT/CN2022/083426
Other languages
English (en)
French (fr)
Inventor
Chen Wei (陈蔚)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority claimed by European application EP22814827.6A, published as EP4329287A1
Publication of WO2022252780A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72469: User interfaces for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M 1/72403: User interfaces with means for local support of applications that increase the functionality
    • H04M 1/7243: User interfaces with interactive means for internal management of messages
    • H04M 1/72439: User interfaces with interactive means for internal management of messages, for image or video messaging
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 52/00: Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W 52/02: Power saving arrangements
    • H04W 52/0209: Power saving arrangements in terminal devices
    • H04W 52/0261: Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level
    • H04W 52/0274: Power saving arrangements managing power supply demand by switching on or off the equipment or parts thereof
    • H04W 52/028: Power saving arrangements switching on or off only a part of the equipment circuit blocks
    • H04M 2250/00: Details of telephonic subscriber devices
    • H04M 2250/52: Details of telephonic subscriber devices including functional features of a camera

Definitions

  • the present application relates to the technical field of terminals, and in particular to a photographing method and electronic equipment.
  • the embodiment of the present application provides a shooting method and an electronic device, which can avoid FOV center jumps during zoom shooting, realize a smooth FOV transition, and improve user experience. Moreover, the corresponding cameras on the electronic device can be turned on as needed, saving power consumption of the electronic device.
  • an embodiment of the present application provides a shooting method, which is applied to an electronic device including a display screen and multiple cameras, and the multiple cameras include a first camera and a second camera.
  • the method includes: displaying a first preview image at the first zoom ratio, the first preview image being captured by the first camera; the electronic device detecting the user's click operation on the second zoom ratio option; the electronic device generating and displaying on the display screen at least one second preview image based on at least one third zoom ratio, where each third zoom ratio lies between the first zoom ratio and the second zoom ratio; and the electronic device displaying on the display screen a third preview image at the second zoom ratio, the third preview image being captured by the second camera.
  • in this way, the electronic device can implement the FOV center offset algorithm based on the images captured by the above two cameras, so that the FOV center of the image captured by the current camera gradually approaches the FOV center of the image captured by the target camera, realizing a smooth transition of the FOV center and avoiding FOV center jumps.
  • the click operation includes: the user's finger or a stylus first touching the second zoom ratio option and then leaving the second zoom ratio option.
  • the click operation mentioned in the embodiment of the present application can thus be distinguished from an operation in which the user's finger or a stylus merely touches the second zoom ratio option.
  • the electronic device generating and displaying at least one second preview image on the display screen based on at least one third zoom ratio specifically includes: when the electronic device detects the operation of the user's finger or a stylus touching the second zoom ratio option, it generates and displays at least one second preview image on the display screen based on at least one third zoom ratio.
  • the number N of the second preview images is determined as N = T/t, where:
  • T is the time interval between the electronic device detecting the user's click operation on the second zoom ratio option and the electronic device displaying the third preview image at the second zoom ratio on the display screen; and
  • t is the time interval between two adjacent preview frames of the current camera system.
  • the zoom value zoomValue corresponding to the n-th third zoom ratio is determined from the following quantities:
  • S, the distance on the display screen between the first zoom ratio option and the second zoom ratio option; and
  • n, a positive integer smaller than the number N of the second preview images.
  • from these quantities, the zoom value corresponding to each third zoom ratio can be calculated.
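As a rough illustration of the two determinations above, the sketch below computes N = T/t and then spaces the intermediate (third) zoom ratios evenly between the first and second zoom ratios. The even linear spacing is an assumption made purely for illustration: the patent derives each zoomValue from the on-screen distance S between the zoom ratio options, and its exact formula is not reproduced here. All function and variable names are hypothetical.

```python
def intermediate_zoom_values(first_zoom: float, second_zoom: float,
                             T: float, t: float) -> list[float]:
    """Return the third zoom ratios used for the N second preview images.

    T: interval between detecting the tap and showing the final preview (s)
    t: interval between two adjacent preview frames of the camera system (s)
    """
    N = int(T / t)  # number of second preview images, per N = T/t above
    step = (second_zoom - first_zoom) / N  # assumed linear spacing
    # n runs over the positive integers smaller than N
    return [first_zoom + step * n for n in range(1, N)]

# Example: tapping from 1X to 5X with T = 0.4 s and a 20 fps preview (t = 0.05 s)
values = intermediate_zoom_values(1.0, 5.0, T=0.4, t=0.05)
```

With these numbers N = 8, so seven intermediate ratios are shown before the second camera's 5X preview takes over.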
  • the FOV center of the first preview image is consistent with the FOV center of the first camera
  • the FOV center of the third preview image is consistent with the FOV center of the second camera.
  • before the electronic device detects the user's click operation on the second zoom ratio option, the method further includes: the electronic device powers on the second camera, and starts and runs the second camera.
  • in this way, the electronic device can run multiple cameras at the same time, saving the time needed to start a camera in scenes that require zooming.
  • the FOV center of the at least one second preview image gradually approaches the FOV center of the second camera.
  • the method further includes: the electronic device powers on the second camera, and starts and runs the second camera.
  • the camera can be turned on as needed to save power consumption of electronic equipment.
  • the FOV center of the at least one second preview image is consistent with the FOV center of the first camera; after the electronic device runs the second camera, the FOV center of the at least one second preview image begins to approach the FOV center of the second camera.
  • the embodiment of the present application provides an electronic device, including a display screen, multiple cameras with different focal lengths, a memory, a processor coupled to the memory, multiple application programs, and one or more programs, where the optical centers of the multiple cameras do not coincide.
  • the multiple cameras include a first camera and a second camera.
  • the first camera and the second camera are two cameras with adjacent focal lengths among the multiple cameras.
  • an embodiment of the present application provides a computer storage medium, where the computer storage medium stores a computer program, the computer program includes program instructions, and when the program instructions run on the electronic device, the electronic device is caused to perform the method in any possible implementation of any of the above aspects.
  • an embodiment of the present application provides a computer program product, which, when running on a computer, causes the computer to execute the method in any possible implementation manner of any one of the above aspects.
  • FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 2A-FIG. 2B are schematic diagrams of the appearance and structure of an electronic device provided by an embodiment of the present application.
  • FIG. 3A-FIG. 3B are schematic user interface diagrams of a typical shooting scene provided by an embodiment of the present application.
  • FIG. 4A-FIG. 4H are schematic diagrams of a user interface for increasing the zoom in a preview scene provided by an embodiment of the present application.
  • FIG. 5A-FIG. 5H are schematic diagrams of a user interface for reducing the zoom in a preview scene provided by an embodiment of the present application.
  • FIG. 6A-FIG. 6H are schematic diagrams of a user interface for zooming in a recording scene provided by an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of a method for realizing a smooth transition of the FOV center in a tap-to-zoom scene provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the specific execution process of the FOV center offset algorithm provided by an embodiment of the present application.
  • FIG. 9A-FIG. 9D are schematic diagrams of the on/off states of multiple cameras provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of the cooperation of some software and hardware of an electronic device when the zoom ratio is increased, according to an embodiment of the present application.
  • FIG. 11 is a schematic diagram of the cooperation of some software and hardware of an electronic device when the zoom ratio is reduced, according to an embodiment of the present application.
  • FIG. 12 is a schematic flowchart of a photographing method provided by an embodiment of the present application.
  • the embodiment of the present application provides a shooting method, which can avoid FOV center jump during zoom shooting, realize smooth transition of FOV, and improve user experience. Moreover, the corresponding camera on the electronic device can be turned on as required, so as to save power consumption of the electronic device.
  • the photographing method provided in the embodiment of the present application may be applied to an electronic device having multiple cameras with different focal lengths.
  • the plurality of cameras may be ordinary cameras, telephoto cameras, wide-angle cameras, and the like. Since the positions of the multiple cameras on the electronic device are different, the optical centers of the multiple cameras are not coincident, resulting in inconsistent FOV centers of the multiple cameras.
  • the optical zoom magnification of the ordinary camera is 1x (expressed as 1X)
  • the optical zoom magnification of the telephoto camera is 5x, expressed as 5X (subsequent X indicates the zoom magnification)
  • the optical zoom magnification of the wide-angle camera is 0.4X.
  • the shooting method provided by the embodiment of the present application can solve the FOV center jump problem that occurs during optical zooming, and allows the electronic device to open only a default camera in scenes that do not require zooming and to turn on the other corresponding cameras only in scenes that do require zooming, saving power consumption of the electronic device.
  • for example, the electronic device may by default turn on only a camera with a larger FOV (such as a normal camera), and when it detects that the user increases the zoom ratio, turn on a camera with a smaller FOV (such as a telephoto camera).
  • during the switch, the electronic device can use the FOV center offset algorithm to crop the image captured by the camera with the larger FOV, so that the FOV center of the cropped image gradually approaches the FOV center of the camera with the smaller FOV, achieving a smooth FOV transition and avoiding FOV center jumps.
  • the camera with a larger FOV and the camera with a smaller FOV refer to two cameras involved in one camera switching, for example, switching from a normal camera to a telephoto camera.
  • using the FOV center offset algorithm to eccentrically crop the image captured by the camera with the larger FOV mainly means that, during zooming, as the zoom ratio keeps approaching the switching ratio, the image captured by the larger-FOV camera is cropped several times in succession, so that the FOV center of each cropped image (for example, the images cropped at 3.1X, 3.2X, 3.3X, ... 4.9X) gradually approaches the FOV center of the smaller-FOV camera; the image cropped at 4.9X can even coincide with the FOV center of the telephoto camera, thereby avoiding FOV center jumps.
  • the implementation of eccentric cropping of images captured by a camera with a larger FOV based on the FOV center offset algorithm will be described in detail in the following content, which will not be expanded here.
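The eccentric-cropping idea described above can be sketched as follows: as the zoom ratio moves from the starting ratio toward the switching ratio, the crop window taken from the larger-FOV camera's frame shrinks (digital zoom) while its center is interpolated from that camera's own FOV center toward the smaller-FOV camera's FOV center. This is a minimal sketch under an assumed linear interpolation; the patent does not commit to a specific interpolation curve, and all names and parameters are illustrative only.

```python
import numpy as np

def eccentric_crop(frame: np.ndarray,
                   src_center: tuple[float, float],
                   dst_center: tuple[float, float],
                   zoom: float, start_zoom: float, switch_zoom: float):
    """Crop `frame` so its center drifts from src_center toward dst_center."""
    h, w = frame.shape[:2]
    # Progress of the transition: 0 at start_zoom, 1 at switch_zoom (assumed linear).
    p = (zoom - start_zoom) / (switch_zoom - start_zoom)
    cx = src_center[0] + p * (dst_center[0] - src_center[0])
    cy = src_center[1] + p * (dst_center[1] - src_center[1])
    # Crop size shrinks inversely with the zoom ratio (digital zoom).
    cw, ch = int(w * start_zoom / zoom), int(h * start_zoom / zoom)
    x0 = int(np.clip(cx - cw / 2, 0, w - cw))
    y0 = int(np.clip(cy - ch / 2, 0, h - ch))
    return frame[y0:y0 + ch, x0:x0 + cw]

# At 4.9X (just before the 5X switch) the crop center nearly coincides with
# the telephoto camera's assumed FOV center (1000, 560) in the 1080p frame.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
crop = eccentric_crop(frame, (960, 540), (1000, 560), 4.9, 1.0, 5.0)
```

The cropped region would then be upscaled back to the preview resolution, as described for digital zoom later in the document.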
  • the shooting method provided by the embodiment of the present application makes it easy for the user to "lock" a target scene during zooming, avoids "losing" the target scene due to camera switching, and spares the user from repeatedly framing and focusing, improving the efficiency and convenience of zoom shooting.
  • other corresponding cameras on the electronic device are turned on in a scene requiring zooming, which saves power consumption of the electronic device.
  • the shooting method provided by the embodiment of the present application is also applicable to the usage scenario of reducing the zoom ratio (for example, switching from a normal camera to a wide-angle camera).
  • the aforementioned electronic devices may be mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) equipment, notebook computers, ultra-mobile personal computers (ultra-mobile personal computers, UMPC), netbook, personal digital assistant (personal digital assistant, PDA) or special camera (such as single-lens reflex camera, card camera), etc., this application does not make any limitation to the concrete type of above-mentioned electronic equipment.
  • when the zoom ratio is being increased, for example, the camera switch occurs in the 4.9X-5X range and specifically refers to switching from the normal camera to the telephoto camera; that is, the zoom ratio switching point from the normal camera to the telephoto camera is 5X.
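Under the example ratios used throughout the document (wide-angle 0.4X, normal 1X, telephoto 5X, with the normal-to-telephoto switching point at 5X), the choice of active camera can be sketched as a simple threshold lookup. The function name and string labels are hypothetical; a real implementation would also handle the hybrid-zoom handover around each switching point.

```python
def camera_for_zoom(zoom: float) -> str:
    """Pick the camera for a given zoom ratio (illustrative thresholds only)."""
    if zoom < 1.0:
        return "wide-angle"   # 0.4X <= zoom < 1X: wide-angle camera + crop
    if zoom < 5.0:
        return "normal"       # 1X <= zoom < 5X: normal camera + digital zoom
    return "telephoto"        # from 5X on, the telephoto camera takes over

assert camera_for_zoom(4.9) == "normal"     # still the normal camera at 4.9X
assert camera_for_zoom(5.0) == "telephoto"  # switching point is 5X
```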
  • FIG. 1 exemplarily shows the structure of an electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may have multiple cameras 193 , such as a normal camera, a wide-angle camera, an ultra-wide-angle camera, a telephoto camera, and the like.
  • the focal length of an ultra-wide-angle camera is generally about 12-24 millimeters (mm), and the viewing angle of an ultra-wide-angle camera is generally 84°-120°;
  • the focal length of a wide-angle camera is generally about 24mm-35mm, and the viewing angle of a wide-angle camera is generally 63°-84°;
  • the focal length of an ordinary camera is generally about 50mm, and the viewing angle of an ordinary camera is generally about 46°;
  • the focal length of a telephoto camera is generally about 135mm-500mm, and the viewing angle of a telephoto camera is generally 5°-18°;
  • the focal length of the ultra-telephoto camera generally exceeds 500mm, and the viewing angle of the ultra-telephoto camera is generally 0°-5°.
  • in terms of viewing angle, these cameras compare as follows: the ultra-wide-angle camera is larger than the wide-angle camera, the wide-angle camera is larger than the normal camera, the normal camera is larger than the telephoto camera, and the telephoto camera is larger than the ultra-telephoto camera.
  • the optical centers of the multiple cameras are not coincident, resulting in inconsistent FOV centers of the multiple cameras.
  • the optical zoom ratio of the ordinary camera is 1X
  • the optical zoom ratio of the telephoto camera is 5X
  • the optical zoom ratio of the wide-angle camera is 0.4X.
  • an optical zoom process involving camera switching will cause the FOV center to jump; the resulting change in the position of the framed content affects what the user is focusing on.
  • the electronic device 100 may also include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated access, reduces the waiting time of the processor 110, and thus improves system efficiency.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100.
  • the wireless communication module 160 can provide wireless communication solutions including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150
  • the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology mentioned above may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an environmental Light sensor 180L, bone conduction sensor 180M, etc.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100, such as the angular velocity of the electronic device 100 around three axes (ie, x, y and z axes).
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes).
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the temperature sensor 180J is used to detect temperature.
  • the touch sensor 180K is also known as a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the keys 190 include a power key, a volume key and the like.
  • the motor 191 can generate a vibrating reminder.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • SIM card interface 195 is used for connecting SIM card.
  • the electronic device 100 can realize the shooting function through an image processor (Image Signal Processor, ISP), a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • light is transmitted through the lens to the photosensitive element of the camera, where the optical signal is converted into an electrical signal; the photosensitive element of the camera then transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • the ISP can also optimize the exposure, color temperature, and other parameters of the shooting scene. The ISP is not limited to being integrated in the processor 110; it may also be disposed in the camera 193.
  • the camera 193 includes a lens and a photosensitive element (also called an image sensor) for capturing still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP for conversion into a digital image signal, such as standard RGB, YUV and other image signals.
  • camera 193 may be used to collect depth data.
  • the camera 193 may have a time-of-flight (TOF) 3D sensing module or a structured-light 3D sensing module for acquiring depth information.
  • the camera used to collect depth data may be a front camera or a rear camera.
  • Video codecs are used to compress or decompress digital images.
  • the electronic device 100 may support one or more image codecs. In this way, the electronic device 100 can open or save pictures or videos in various encoding formats.
  • the electronic device 100 may implement a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used for displaying images, videos, etc., such as images collected by the camera 193 .
  • the display screen 194 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • electronic device 100 may include one or more display screens 194 .
  • the electronic device 100 shown in FIG. 1 is merely an example; the electronic device 100 may have more or fewer components than those shown in FIG. 1, may combine two or more components, or may adopt a different component configuration.
  • the various components shown in FIG. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • FIG. 2A and FIG. 2B exemplarily show the appearance structure of an electronic device 100 provided by the embodiment of the present application.
  • FIG. 2A shows a side where the display screen 194 of the electronic device 100 is located.
  • FIG. 2B shows the side of the electronic device 100 where the rear cover is located.
  • the electronic device 100 may have multiple cameras 193 .
  • the electronic device 100 may include multiple front cameras.
  • the front camera 193-1 and the front camera 193-2 can be arranged on the top of the electronic device 100, such as at the "notch" position of the electronic device 100 (i.e., the area AA shown in FIG. 2A).
  • the area AA may also include a speaker 170A and the like.
  • the electronic device 100 may include multiple rear cameras, such as a rear camera 193-3, a rear camera 193-4, and a rear camera 193-5.
  • the rear camera 193-3, the rear camera 193-4 and the rear camera 193-5 can be a common camera, a wide-angle camera and a telephoto camera respectively.
  • a flashlight 196 and the like may also be arranged near the camera 193 .
  • the camera 193 can change the viewing angle of the preview image in the preview frame by digital zooming, by optical zooming, or by a combination of optical zooming and digital zooming (also known as hybrid zooming). That is, zooming may include digital zooming, optical zooming, or hybrid zooming. The following takes hybrid zooming as an example.
  • the electronic device 100 can change the size of the preview viewing angle presented by a series of images displayed in the preview frame by changing the cameras used for shooting among the plurality of cameras 193 and combining with digital zoom.
  • the aforementioned camera used for shooting may refer to a camera whose collected images are displayed in a preview frame.
  • the above digital zoom enlarges the area occupied by each pixel in the image captured by the camera 193, so that the electronic device 100 achieves the effect of changing the focal length. This is equivalent to the electronic device 100 cropping an image captured by a camera and then enlarging the cropped image, that is, adjusting the resolution of the cropped image to be the same as the resolution of the image before cropping.
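The crop-and-enlarge behavior described above can be sketched as follows. This is an illustrative Python sketch, not part of the claimed method; the list-of-lists image representation and the nearest-neighbour upscaling are simplifying assumptions.

```python
def digital_zoom(image, zoom_ratio):
    """Digitally zoom by center-cropping, then upscaling back to the
    original resolution (so the result matches the pre-crop resolution).

    image: 2D list of pixel values; zoom_ratio: e.g. 2 for 2X.
    """
    h, w = len(image), len(image[0])
    crop_h, crop_w = int(h / zoom_ratio), int(w / zoom_ratio)
    top, left = (h - crop_h) // 2, (w - crop_w) // 2
    # Center crop: keep only the middle crop_h x crop_w region.
    cropped = [row[left:left + crop_w] for row in image[top:top + crop_h]]
    # Nearest-neighbour upscale back to h x w.
    return [[cropped[int(r * crop_h / h)][int(c * crop_w / w)]
             for c in range(w)] for r in range(h)]
```

Each output pixel thus covers a larger area of the original scene, which is the "enlarging the area of each pixel" effect described in the text.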
  • the electronic device 100 will start the telephoto camera 193-5, and zoom between 1X and 5X according to a certain zoom step (for example, 0.1X), for example, from 1X to 1.1X, 1.2X, ..., 4.9X, until 5X.
  • the electronic device 100 may first crop the image captured by the common camera 193-3, and the cropping at this stage may be center cropping.
  • the electronic device 100 can use the FOV center offset algorithm to eccentrically crop the image captured by the ordinary camera 193-3, so that the FOV center of the cropped image gradually approaches the FOV center of the telephoto camera 193-5 and changes more smoothly, instead of suddenly jumping to the FOV center of the telephoto camera 193-5. That is, the electronic device 100 digitally zooms multiple times on the image captured by the ordinary camera 193-3.
  • when the zoom ratio increases to the optical zoom ratio of the telephoto camera 193-5 (for example, 5X), the electronic device 100 switches to the telephoto camera 193-5, and the image displayed in the preview frame becomes the image captured by the telephoto camera 193-5; that is, the electronic device 100 performs optical zooming. In other words, when the zoom ratio is continuously increased, the image display may include two successive stages: a stage of using images collected by the ordinary camera 193-3, and a stage of using images collected by the telephoto camera 193-5. In the stage of using the images collected by the ordinary camera 193-3, the zoom ratio is gradually increased from 1X, for example, from 1X to 3X, and from 3X to 4.9X. When the zoom ratio increases to 5X, the display switches to the stage of using images collected by the telephoto camera 193-5.
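The two successive stages above can be sketched as a simple selection rule. The 1X ordinary-camera magnification and the 5X telephoto switch point follow the example in the text; the function name and return structure are illustrative assumptions, not part of the embodiment.

```python
WIDE_OPTICAL = 1.0  # optical zoom magnification of the ordinary camera
TELE_OPTICAL = 5.0  # optical zoom magnification of the telephoto camera

def source_camera(zoom):
    """Return (camera providing the preview, digital crop factor applied)."""
    if zoom < TELE_OPTICAL:
        # Below 5X the preview comes from the ordinary camera,
        # digitally cropped to the requested ratio.
        return "ordinary", zoom / WIDE_OPTICAL
    # At 5X and above the preview switches to the telephoto camera.
    return "telephoto", zoom / TELE_OPTICAL
```

At 4.9X the preview is still a heavy digital crop of the ordinary camera's image; at 5X the crop factor drops back to 1.0 because the magnification is now provided optically.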
  • the FOV of the image from the camera displayed in the preview frame may generally be smaller than the FOV of the camera.
  • the image in the preview frame is cropped from the image captured by the camera.
  • the zoom ratio is changed to a specific ratio, the image displayed in the preview box can present a FOV as large as the FOV of this one camera.
  • the FOV of the image displayed in the preview frame is generally smaller than that of the common camera 193-3.
  • even when the zoom magnification is 1X, the image displayed in the preview frame may present a FOV smaller than that of the ordinary camera 193-3.
  • the electronic device 100 can detect the user's operation of opening the "camera” application program, for example, the operation of clicking the camera icon 215D in the main interface (Home screen) shown in FIG. 3A.
  • the electronic device 100 may display the user interface 301 exemplarily shown in FIG. 3B , that is, a user interface of the "camera" application program.
  • the main interface shown in FIG. 3A may include a status bar 211, a tray 215 with a list of frequently used applications, a calendar indicator 212, a weather indicator 213, a navigation bar 216, and other application icons 214, among others.
  • “Camera” is an application program for capturing images on electronic devices 100 such as smartphones and tablet computers, and this application does not limit the name of the application program. Not limited to what is shown in FIGS. 3A-3B , the user can also open the user interface 301 in other application programs.
  • the user interface 301 may be the user interface of the default camera mode of the "camera" application.
  • the default photographing mode may be the default photographing mode of the rear ordinary camera, or other modes, which are not limited here.
  • the user interface 301 may include: a setting control 310, a flash control 309, a zoom bar 308, a zoom ratio 307, a preview frame 306, a camera flip control 305, a gallery shortcut control 304, a shutter control 303, and a camera mode option 302. Specifically:
  • the setting control 310 can be used to adjust the parameters of taking pictures (such as resolution, filter, etc.) and to turn on or off some ways for taking pictures (such as timing pictures, smiling snapshots, voice-activated pictures, etc.).
  • the setting control 310 can be used to set more other shooting functions, which is not limited in this embodiment of the present application.
  • the flash control 309 can be used to turn on or off the flash.
  • there are multiple zoom points on the zoom bar 308, and different zoom points indicate different zoom ratios.
  • Zoom factor 307 may be used to indicate the current zoom factor. Wherein, the larger the zoom ratio 307 is, the smaller the FOV of the image displayed in the preview frame 306 is. Conversely, the smaller the zoom factor 307 is, the larger the FOV of the image displayed in the preview frame 306 is. As shown in FIG. 3B, 1X may be the default zoom ratio of the camera application. The default zoom ratio may also be other values, and the embodiment of the present application does not limit the default zoom ratio.
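The inverse relation stated above (a larger zoom ratio 307 means a smaller displayed FOV) can be illustrated with a simple pinhole-style model. The model and the base FOV value are assumptions for illustration only, not part of the embodiment.

```python
import math

def preview_fov(base_fov_deg, zoom):
    """Approximate FOV angle shown in the preview frame at a given zoom,
    assuming a pinhole model where the linear extent scales as 1/zoom."""
    half = math.radians(base_fov_deg / 2.0)
    return math.degrees(2.0 * math.atan(math.tan(half) / zoom))
```

For a hypothetical 80-degree camera, doubling the zoom ratio roughly halves the tangent of the half-angle, so the displayed FOV shrinks monotonically as the zoom ratio grows.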
  • the preview frame 306 can be used to display images captured by the camera 193 in real time.
  • the electronic device 100 can refresh the display content therein in real time, so that the user can preview the image currently collected by the camera 193 .
  • the camera switching control 305 can be used to monitor a user operation that triggers switching of the camera, and in response to the operation, the electronic device 100 can switch the camera, for example, switch the rear camera to the front camera.
  • the gallery shortcut key 304 can be used to open the gallery application.
  • the electronic device 100 may launch the gallery application.
  • the gallery application program is an application program for picture management on electronic devices such as smart phones and tablet computers, and may also be called "album".
  • the name of the application program is not limited in this embodiment.
  • the gallery application program can support users to perform various operations on pictures stored on the electronic device 100, such as browsing, editing, deleting, selecting and other operations.
  • the electronic device 100 may also display the thumbnails of the saved images in the gallery shortcut key 304 .
  • the shutter control 303 can be used to monitor the user's operation triggering to take pictures.
  • the electronic device 100 may save the image in the preview frame 306 as a picture in the gallery application.
  • One or more shooting mode options may be displayed in the camera mode option 302 .
  • the one or more shooting mode options may include: a large aperture mode option 302A, a video recording mode option 302B, a photographing mode option 302C, a portrait mode option 302D, and more options 302E.
  • the electronic device 100 may enable the shooting mode selected by the user.
  • the electronic device 100 may further display more other shooting mode options, such as slow-motion shooting mode options, etc., which may present richer camera functions to the user.
  • the more option 302E may not be displayed in the camera mode option 302 , and the user can browse other shooting mode options by sliding left/right in the camera mode option 302 .
  • Figures 4A-4H exemplarily show the user interface for zoom increase in the preview scene. Assume that the optical zoom magnification of the ordinary camera is 1X, the optical zoom magnification of the telephoto camera is 5X, and the default camera of the electronic device 100 is the ordinary camera. After detecting that the user presses the 5X zoom point on the zoom bar 308, the electronic device 100 turns on the telephoto camera.
  • FIG. 4A exemplarily shows a preview scene: the image displayed in the preview frame 306 (which may be called a preview image) is from a common camera, and the zoom factor 307 is 1X.
  • the electronic device 100 can detect the user's operation of increasing the zoom ratio (for example, the user's click operation on the 5X zoom point on the zoom bar 308), and in response to this operation, the electronic device 100 can reduce the FOV presented by the preview image.
  • the zoom factor 307 displayed in the preview frame 306 will gradually increase.
  • the electronic device 100 may display the image captured by the telephoto camera in the preview frame 306; that is, the preview image is switched to come from the telephoto camera, which means the electronic device 100 has performed optical zooming. Assuming that the minimum change unit of the zoom ratio is 0.1X, 4.9X-5X is the optical zoom stage of switching from the ordinary camera to the telephoto camera.
  • in the process of increasing the zoom ratio 307 from 1X to 3X, the ordinary camera is always in the running state, and the telephoto camera is in the startup state and has not yet entered the running state. Therefore, in this process, only the ordinary camera collects images, and the electronic device 100 gradually reduces the FOV presented by the preview image (that is, performs a center crop on the image M shown in FIG. 4E), and the FOV center presented by the preview image coincides with the center of the FOV of the ordinary camera.
  • when the zoom ratio 307 increases to 3.1X, the ordinary camera is still in the running state, and the telephoto camera enters the running state; that is, the ordinary camera and the telephoto camera start to collect images at the same time. During the process of increasing the zoom ratio 307 from 3.1X to 4.9X, the ordinary camera and the telephoto camera are both in the running state and collect images at the same time, until the zoom ratio 307 increases to 5X, at which point the ordinary camera is turned off while the telephoto camera remains in the running state; at this time, only the telephoto camera collects images.
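The camera running states in the example above can be summarized in a small sketch. The 3.1X and 5X thresholds are the example values from the text, not values fixed by the method.

```python
def running_cameras(zoom):
    """Cameras in the running (image-collecting) state at a given zoom,
    per the example: telephoto enters the running state at 3.1X and the
    ordinary camera is turned off at 5X."""
    if zoom < 3.1:
        return {"ordinary"}  # telephoto may be starting, not yet running
    if zoom < 5.0:
        return {"ordinary", "telephoto"}  # both collect images
    return {"telephoto"}
```

Running both cameras over the 3.1X-4.9X span is what allows the switch at 5X to happen without a startup delay.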
  • the electronic device 100 can also move the FOV center of the preview image toward the FOV center of the telephoto camera, instead of jumping directly from the FOV center of the ordinary camera to the FOV center of the telephoto camera.
  • when the zoom magnification continues to increase to 5X, the optical zoom magnification of the telephoto camera, the FOV center of the preview image gradually approaches, and eventually coincides with, the FOV center of the telephoto camera, avoiding a jump of the FOV center.
  • Fig. 4A, Fig. 4B, Fig. 4C, and Fig. 4D respectively show preview images at 1X, 3X, 3.1X, and 5X, whose FOV centers are located at positions O1, O2, O3, and O4, respectively. It can be seen that O3 is closer to O4 than O2. In this way, when the zoom ratio increases from 1X to 5X, the FOV center of the preview image does not suddenly change from O1 to O4, but transitions more smoothly, without a jump.
  • the image M is an image collected by a normal camera
  • FOV1 is the FOV of the normal camera
  • FOV2 is the FOV of the telephoto camera.
  • the FOV of the ordinary camera covers the FOV of the telephoto camera, and there is a relatively long distance between the FOV center O4 of the telephoto camera and the FOV center O1 of the ordinary camera. This distance is caused by the misalignment of the optical centers of the telephoto camera and the ordinary camera.
  • the electronic device 100 can gradually move the cropping center of the cropping area closer to the FOV center of the telephoto camera, that is, execute the FOV center offset algorithm to achieve off-center cropping.
  • image 1 is a preview image at 1X
  • image 1 can be obtained by cropping image M
  • the center of FOV of image 1 coincides with the center O1 of FOV1.
  • image 2 is a preview image at 3X
  • image 2 can be obtained by cropping image M.
  • the FOV center of image 2 coincides with the center O1 of FOV1.
  • image 3 is a preview image at 3.1X, and image 3 can be obtained by cropping image M.
  • the FOV center of image 3 no longer coincides with the center O1 of FOV1, but deviates from O1 and is closer to the FOV center O4 of the telephoto camera.
  • image T is a preview image at 5X
  • image T is an image collected by a telephoto camera.
  • the FOV center of the image T coincides with the FOV center O4 of the telephoto camera.
  • Figures 4E-4H only use the example of executing the FOV center offset algorithm from the time the telephoto camera starts to operate to illustrate one implementation of the FOV center offset algorithm; that is, the electronic device executes the FOV center offset algorithm throughout the entire digital zoom process of increasing from 3X to 3.1X, 3.2X, ..., up to 4.9X.
  • the electronic device 100 may also execute the FOV center offset algorithm a period of time after the telephoto camera starts to operate (for example, after the electronic device 100 zooms to 4X); for example, the electronic device 100 may execute the FOV center offset algorithm throughout the entire digital zoom process of increasing from 4X to 4.1X, 4.2X, ..., up to 4.9X.
  • the embodiment of the present application does not limit the speed at which the cropping center approaches the FOV center of the telephoto camera when cropping with the FOV center offset algorithm; for example, the cropping center may move toward the FOV center of the telephoto camera once for every 0.1X change of the zoom ratio, or once for every 0.2X change.
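One way to realize the gradual movement of the cropping center is linear interpolation between the two FOV centers over the digital zoom span; the patent does not prescribe a particular interpolation, so the linear form, the function name, and the 3.1X/5X span below are illustrative assumptions taken from the example values in the text.

```python
def crop_center(zoom, o1, o4, start_zoom=3.1, end_zoom=5.0):
    """Cropping center at a given zoom: stays at the ordinary camera's FOV
    center o1 up to start_zoom, then moves linearly toward the telephoto
    camera's FOV center o4, reaching it at end_zoom (the switch point)."""
    if zoom <= start_zoom:
        return o1
    if zoom >= end_zoom:
        return o4
    f = (zoom - start_zoom) / (end_zoom - start_zoom)
    return (o1[0] + f * (o4[0] - o1[0]),
            o1[1] + f * (o4[1] - o1[1]))
```

Because the center reaches o4 exactly at the switch point, the preview's FOV center coincides with the telephoto camera's FOV center when the camera switch happens, so no jump is visible.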
  • the FOV of the image 1 may also be FOV1, that is, the image M may be a preview image at 1X.
  • the method of increasing the zoom ratio described in the embodiment of Figures 4A-4H is also applicable to the scene where the wide-angle camera is switched to a normal camera, the scene where the ultra-wide-angle camera is switched to a wide-angle camera, and the scene where the telephoto camera is switched to an ultra-telephoto camera.
  • the embodiment does not limit this.
  • the embodiment of the present application only takes starting to operate the telephoto camera after zooming to 3X as an example to illustrate one implementation of the FOV center offset algorithm. In some embodiments, the telephoto camera may also start to operate after zooming to other magnifications (for example, 2X, 3.5X, 4X, etc.), which is not limited in this embodiment of the present application.
  • Figure 5A- Figure 5H exemplarily shows the user interface for zoom reduction in the preview scene.
  • the optical zoom magnification of the ordinary camera is 1X
  • the optical zoom magnification of the wide-angle camera is 0.4X (wide)
  • the default camera of the electronic device 100 is a normal camera. After detecting that the user presses the 0.4X zoom point on the zoom bar 308 , the electronic device 100 turns on the wide-angle camera.
  • FIG. 5A exemplarily shows a preview scene: the image displayed in the preview frame 306 (which may be called a preview image) is from a common camera, and the zoom factor 307 is 1X.
  • the electronic device 100 can detect the user's operation of reducing the zoom ratio (for example, the user's click operation on the 0.4X zoom point on the zoom bar 308), and in response to this operation, the electronic device 100 can enlarge the FOV presented by the preview image.
  • the zoom factor 307 displayed in the preview frame 306 will gradually decrease.
  • the electronic device 100 can display the image captured by the wide-angle camera in the preview frame 306; that is, the preview image is switched to come from the wide-angle camera, which means the electronic device 100 has performed optical zooming. Assuming that the minimum change unit of the zoom ratio is 0.1X, 0.5X-0.4X is the optical zoom stage of switching from the ordinary camera to the wide-angle camera.
  • in the process of reducing the zoom ratio 307 from 1X to 0.9X, the ordinary camera is always in the running state, and the wide-angle camera is in the startup state and has not yet entered the running state. Therefore, in this process, only the ordinary camera collects images, and the electronic device 100 gradually enlarges the FOV presented by the preview image (that is, performs a center crop on the image M' shown in FIG. 5E), and the FOV center presented by the preview image coincides with the center of the FOV of the ordinary camera.
  • the zoom ratio 307 when the zoom ratio 307 is reduced to 0.8X, the normal camera is in the running state, and the wide-angle camera starts to enter the running state, that is, the normal camera and the wide-angle camera start to capture images at the same time.
  • in the process of reducing the zoom ratio 307 from 0.8X to 0.4X, the ordinary camera and the wide-angle camera are both in the running state and collect images at the same time, until the zoom magnification 307 is reduced to 0.4X, at which point the ordinary camera is turned off while the wide-angle camera remains in the running state; at this time, only the wide-angle camera collects images.
  • the electronic device 100 can also move the FOV center of the preview image toward the FOV center of the wide-angle camera, instead of jumping directly from the FOV center of the ordinary camera to the FOV center of the wide-angle camera.
  • when the zoom magnification decreases to 0.4X, the optical zoom magnification of the wide-angle camera, the FOV center of the preview image gradually approaches, and eventually coincides with, the FOV center of the wide-angle camera, avoiding a jump of the FOV center.
  • Fig. 5A, Fig. 5B, Fig. 5C, and Fig. 5D respectively show preview images at 1X, 0.9X, 0.8X, and 0.4X, whose FOV centers are respectively at positions O1', O2', O3', and O4'. It can be seen that O3' is closer to O4' than O2', so that when the zoom ratio is reduced from 1X to 0.4X, the FOV center of the preview image does not suddenly change from O1' to O4', but transitions more smoothly, without a jump.
  • the image M' is the image collected by the common camera
  • FOV1' is the FOV of the common camera
  • FOV2' is the FOV of the wide-angle camera.
  • the FOV of the wide-angle camera covers the FOV of the ordinary camera, and there is a relatively long distance between the FOV center O4' of the wide-angle camera and the FOV center O1' of the ordinary camera. This distance is caused by the misalignment of the optical centers of the wide-angle camera and the ordinary camera.
  • as shown in FIGS. 5E-5H, the electronic device 100 can gradually move the cropping center of the cropping area closer to the FOV center of the wide-angle camera, that is, execute the FOV center offset algorithm to achieve off-center cropping.
  • image 1' is a preview image at 1X
  • image 1' can be obtained by cutting image M'
  • the FOV center of image 1' coincides with the center O1' of FOV1'
  • image 2' is a preview image at 0.9X
  • image 2' can be obtained by cropping image M'.
  • the FOV center of image 2' coincides with the center O1' of FOV1'.
  • the image 3' is a preview image at 0.8X, and the image 3' can be obtained by cutting the image M'.
  • image T' is a preview image at 0.4X
  • image T' is an image collected by a wide-angle camera.
  • the FOV center of the image T' coincides with the FOV center O4' of the wide-angle camera.
  • Figures 5E-5H only use the example of executing the FOV center offset algorithm from the time the wide-angle camera starts to operate to illustrate one implementation of the FOV center offset algorithm; that is, the electronic device executes the FOV center offset algorithm throughout the entire digital zoom process of decreasing from 0.9X to 0.8X, 0.7X, ..., down to 0.4X.
  • the electronic device 100 may also execute the FOV center offset algorithm a period of time after the wide-angle camera starts to operate (for example, after the electronic device 100 zooms to 0.6X); for example, the electronic device 100 may execute the FOV center offset algorithm throughout the entire digital zoom process of decreasing from 0.6X to 0.5X and 0.4X.
  • the embodiment of the present application does not limit the speed at which the cropping center approaches the FOV center of the wide-angle camera when cropping with the FOV center offset algorithm; for example, the cropping center may move toward the FOV center of the wide-angle camera once for every 0.1X change of the zoom ratio, or once for every 0.2X change.
  • the FOV of image 1' needs to be smaller than FOV1'. In this way, it can be ensured that in the process of switching from a normal camera to a wide-angle camera, a smooth transition can be achieved by cutting the image M' captured by a normal camera to avoid FOV center jump phenomenon.
  • for example, when the zoom ratio 307 displays 1X, the preview image may actually correspond to the image at a zoom ratio of 2X; it is easy to understand that the actual magnification is mapped to the zoom magnification displayed in the zoom ratio 307, so that the FOV of the image 1' is smaller than FOV1'.
  • the method of reducing the zoom ratio described in the embodiment of Figures 5A-5H is also applicable to the scene where the wide-angle camera is switched to the ultra-wide-angle camera, the scene where the telephoto camera is switched to a normal camera, and the scene where the ultra-telephoto camera is switched to a telephoto camera.
  • the embodiment of the application does not limit this.
  • the embodiment of the present application only takes starting to operate the wide-angle camera after zooming to 0.9X as an example to illustrate one implementation of the FOV center offset algorithm. In some embodiments, the wide-angle camera may also start to operate after zooming to other magnifications (for example, 0.8X, 0.7X, etc.), which is not limited in this embodiment of the present application.
  • the zoom preview user interface exemplarily shown in FIGS. 4A-4D and FIGS. 5A-5D may also be used for the user to take pictures during the zooming process.
  • the electronic device 100 may save the images in the preview frames under these different zoom ratios as photos, and then the FOV centers presented by these photos are in a smooth transition.
  • FIGS. 6A-6H exemplarily show user interfaces for zooming in a video recording scene.
  • FIG. 6A-FIG. 6D exemplarily show a user interface for performing zoom increase in a video recording scene.
  • the electronic device 100 can detect the user's operation of increasing the zoom ratio (for example, the user clicks on the 5X zoom point on the zoom bar 308), in response to this operation, during the process of increasing the zoom ratio 307 from 1X to 3X, the electronic device 100 can gradually reduce the FOV presented by the preview image.
  • the electronic device 100 can also move the FOV center of the preview image toward the FOV center of the telephoto camera, instead of jumping directly from the FOV center of the ordinary camera to the FOV center of the telephoto camera.
  • when the zoom magnification continues to increase to 5X, the optical zoom magnification of the telephoto camera, the FOV center of the preview image gradually approaches, and eventually coincides with, the FOV center of the telephoto camera, avoiding a jump of the FOV center.
  • the image in the preview box in FIGS. 6A-6D can be saved as a video.
  • FIG. 6E-FIG. 6H exemplarily show a user interface for zoom reduction in a recording scene.
  • the optical zoom magnification of the ordinary camera is 1X
  • the optical zoom magnification of the wide-angle camera is 0.4X (wide)
  • the default camera of the electronic device 100 is the ordinary camera. After detecting that the user presses the 0.4X zoom point on the zoom bar 308, the electronic device 100 turns on the wide-angle camera.
  • the electronic device 100 can detect the user's operation of reducing the zoom ratio (for example, the user clicks on the 0.4X zoom point on the zoom bar 308), in response to this operation, during the process of reducing the zoom ratio 307 from 1X to 0.9X, the electronic device 100 gradually expands the FOV presented by the preview image.
  • the electronic device 100 can also move the FOV center of the preview image toward the FOV center of the wide-angle camera, instead of jumping directly from the FOV center of the ordinary camera to the FOV center of the wide-angle camera.
  • when the zoom magnification decreases to 0.4X, the optical zoom magnification of the wide-angle camera, the FOV center of the preview image gradually approaches, and eventually coincides with, the FOV center of the wide-angle camera, avoiding a jump of the FOV center.
  • the difference from the embodiment shown in FIGS. 5A-5D is that the image in the preview frame in FIGS. 6E-6H , that is, the recorded image, can be saved as a video.
  • click-to-zoom means that the user switches from the current zoom point (also referred to as the first zoom factor option) to a target zoom point by clicking a zoom point on the zoom bar 308 other than the current zoom point. For example, in FIG. 4A, the user switches the current 1X zoom point to the target 5X zoom point by clicking the 5X zoom point on the zoom bar 308; in FIG. 5A, the user switches the current 1X zoom point to the target 0.4X zoom point by clicking the 0.4X zoom point on the zoom bar 308.
  • FIG. 7 exemplarily shows a flow of a method for the electronic device 100 to realize a smooth transition of the FOV center in a point-and-click zoom scene.
  • the following takes the current zoom point as the 1X zoom point, the target zoom point as the 5X zoom point, and the user clicking and zooming from the 1X zoom point to the 5X zoom point as an example to describe the specific steps of the method in detail:
  • the electronic device 100 detects an operation of pressing a display screen with a user's finger.
  • the electronic device 100 determines whether the position where the user's finger presses the display screen is on a target zoom point, where the target zoom point is different from the current zoom point.
  • the electronic device 100 can determine whether the position where the user's finger presses the display screen is within the hot zone of the target zoom point; if so, it can indicate that the position where the user's finger presses the display screen is on the above-mentioned target zoom point.
  • the hot spot of the above-mentioned target zoom point refers to a certain area on the display screen that contains the location of the above-mentioned target zoom point.
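The hot-zone check described above might be sketched as follows; the rectangular shape of the zone and the half-width/half-height values are purely illustrative assumptions, since the text only requires "a certain area on the display screen that contains the location of the zoom point".

```python
def in_hot_zone(press_xy, zoom_point_xy, half_w=20, half_h=20):
    """True if the press position falls within a rectangular hot zone
    centered on the zoom point's on-screen position (pixel coordinates)."""
    px, py = press_xy
    zx, zy = zoom_point_xy
    return abs(px - zx) <= half_w and abs(py - zy) <= half_h
```

Making the hot zone larger than the drawn zoom point tolerates imprecise finger presses while still mapping each press to a unique zoom point.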
  • the camera application program of the electronic device 100 sends the zoom value corresponding to the target zoom point and the zoom value corresponding to the zoom sequence to a hardware abstraction layer (Hardware Abstract Layer, HAL).
  • the above-mentioned zoom sequence refers to a set composed of multiple zoom ratios between the target zoom point (eg, 5X zoom point) and the current zoom point (eg, 1X zoom point).
  • if the electronic device 100 determines that the position where the user's finger presses the display screen is within the hot zone of the 5X zoom point, it means that the position where the user's finger presses the display screen is on the 5X zoom point.
  • the camera application program of the electronic device 100 may send the zoom value corresponding to the 5X zoom point and the zoom value corresponding to the zoom sequence to the hardware abstraction layer.
  • this can avoid a situation in which, because there is a certain delay in the camera application program sending the zoom value corresponding to the 5X zoom point to the hardware abstraction layer, the hardware abstraction layer does not have enough time to perform the subsequent related steps.
  • the zoom value corresponding to the 5X zoom point is the corresponding zoom value when the zoom magnification is 5X
  • the zoom value corresponding to the above zoom sequence is the zoom value corresponding to each zoom magnification in the zoom sequence.
  • S is the distance on the display screen between the current zoom point and the target zoom point on the zoom bar
  • T is the time required for a smooth transition from the current zoom point to the target zoom point (that is, the time from when the user clicks the target zoom point to when the display screen displays the preview image corresponding to the target zoom point), which can be obtained from the hardware abstraction layer
  • t is the time interval between two adjacent preview frames of the current camera system, which can be obtained from the hardware abstraction layer
  • the number of zoom magnifications included in the above zoom sequence is N, which may be determined from T and t (for example, N = T/t).
  • the current zoom point is 1X
  • the target zoom point is 5X
  • the minimum change unit of the zoom ratio is 0.1X.
  • the zoom sequence can be [1.1X, 1.2X, 1.3X, ..., 4.9X, 5X]
  • the value range of the serial number n of the zoom sequence is (0, 40]; for example, when n = 1, the corresponding zoom ratio is 1.1X, and so on; when n = 40, the corresponding zoom ratio is 5X.
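Building the zoom sequence from the current zoom point, the target zoom point, and the minimum change unit can be sketched as follows; the function name and the two-decimal rounding (used to avoid floating-point drift) are implementation assumptions.

```python
def zoom_sequence(current, target, step=0.1):
    """Zoom ratios strictly between current (exclusive) and target
    (inclusive), spaced by the minimum change unit `step`."""
    n_steps = round((target - current) / step)
    return [round(current + k * step, 2) for k in range(1, n_steps + 1)]
```

For the 1X-to-5X example in the text this produces the 40-element sequence [1.1, 1.2, ..., 4.9, 5.0], matching the serial-number range (0, 40].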
  • since the time interval at which the camera application sends the zoom value corresponding to each zoom ratio in the zoom sequence to the hardware abstraction layer is determined by the preview frame time interval of the current camera system, the time needed for the camera application to finish sending the zoom values corresponding to the above zoom sequence one by one in order is the above-mentioned time T required for a smooth transition from the current zoom point to the target zoom point.
  • alternatively, the camera application can send all the zoom values corresponding to the above zoom sequence to the hardware abstraction layer at one time, and the hardware abstraction layer can, according to the preview frame time interval of the current camera system, correspond each zoom value one by one to an image preview frame, thus completing the smooth zooming process.
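The timing relation implied above (one zoom value applied per preview frame) can be written out numerically. The frame interval below is an assumed value for illustration (roughly 30 fps); it is not a value given in the text, which says t is obtained from the hardware abstraction layer.

```python
# One zoom value is consumed per preview frame, so the total smooth-zoom
# time T is the number of zoom steps N times the frame interval t
# (equivalently, N = T / t).
t_ms = 33          # assumed preview frame interval (~30 fps)
N = 40             # zoom steps from 1.1X up to 5.0X at 0.1X per step
T_ms = N * t_ms    # total time of the smooth transition
```

With these illustrative numbers the 1X-to-5X transition would take about 1.3 seconds, which is why the text notes that a typical 100 ms-300 ms press-and-release does not exceed the total smooth-zoom time.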
The hardware abstraction layer of the electronic device 100 then starts to execute the first processing flow. Specifically, it may do so based on the received zoom value corresponding to the target zoom point and the zoom values corresponding to the zoom sequence sent by the camera application. The specific process of the first processing flow will be introduced in detail later and is not expanded here.
Next, the electronic device 100 detects that the user's finger leaves the display screen, and determines whether the position where the finger leaves the display screen is on the zoom point. Specifically, the electronic device 100 can determine whether that position is within the hot zone of the zoom point; if so, it can conclude that the user's finger left the display screen above the zoom point. The hot zone of a zoom point refers to a certain area on the display screen that includes the location of that zoom point. Usually, the time from pressing the display screen to leaving it is about 100 ms to 300 ms, which will not exceed the total time of the smooth zooming process. In addition to a finger, the user can also use a handwriting device (such as a stylus) to press or leave the display screen.
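The hot-zone check can be sketched as a simple containment test. The hot-zone dimensions below are illustrative assumptions, since the embodiment only specifies that the hot zone is an area containing the zoom point:

```python
def in_hot_zone(release_pos, zoom_point_pos, half_w=40, half_h=40):
    """True if the position where the finger (or stylus) leaves the
    screen falls inside the hot zone of a zoom point; the +/-40 px
    extent is an assumed example value."""
    rx, ry = release_pos
    zx, zy = zoom_point_pos
    return abs(rx - zx) <= half_w and abs(ry - zy) <= half_h
```

A release at (505, 812) would confirm a zoom point drawn at (500, 800), while a release at (600, 800) would not.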
If the electronic device 100 determines that the position where the user's finger leaves the display screen is within the hot zone of the 5X zoom point, it means the finger left the display screen on the 5X zoom point. In this case, the camera application of the electronic device 100 sends a confirm-zoom instruction to the hardware abstraction layer, where the confirm-zoom instruction is used to instruct the hardware abstraction layer to execute the second processing flow. The hardware abstraction layer of the electronic device 100 then executes the second processing flow based on the received zoom value corresponding to the zoom point and the zoom values corresponding to the zoom sequence sent by the camera application. The specific process of the second processing flow will be introduced in detail later and is not expanded here.
The display screen of the electronic device 100 then displays the zoomed images; it may display multiple zoomed images. For example, when the zoom sequence is [1.1X, 1.2X, 1.3X, ..., 4.9X, 5X], the display screen can display the images at 1.1X, 1.2X, 1.3X, ..., 4.9X, and 5X in sequence.
If the zoom is cancelled, the camera application of the electronic device 100 sends a cancel-zoom instruction to the hardware abstraction layer, where the cancel-zoom instruction is used to instruct the hardware abstraction layer to stop executing the first processing flow and the second processing flow. On receiving this instruction, the hardware abstraction layer of the electronic device 100 stops executing both flows. The specific processes of the first processing flow and the second processing flow will be introduced in detail later and are not expanded here.
It should be noted that the above method for the electronic device 100 to achieve a smooth transition of the FOV center in the tap-to-zoom scene is described only by taking the switch from the ordinary camera to the telephoto camera as an example. The method is also applicable to switching between other cameras, for example from the ordinary camera to the wide-angle camera, which is not limited in this embodiment of the present application.
  • the first processing flow and the second processing flow may be collectively referred to as a smooth zoom processing process, and the specific process is as follows:
Take the case where the current zoom point is the 1X zoom point and the target zoom point is the 5X zoom point, that is, the user taps to zoom from 1X to 5X, switching from the ordinary camera to the telephoto camera. The camera application sends the zoom value corresponding to the 5X zoom point to the hardware abstraction layer. After receiving this zoom value, the hardware abstraction layer can confirm that the camera ID (identifier) corresponding to it is the telephoto camera, and begins starting the telephoto camera. During this time the ordinary camera is still running, and the image displayed on the display screen (such as the preview image shown in FIG. 4A) still comes from the ordinary camera. While sending the zoom value corresponding to the 5X zoom point, the camera application also sends the zoom value corresponding to each zoom ratio in the zoom sequence to the hardware abstraction layer in turn. Taking the zoom sequence [1.1X, 1.2X, 1.3X, ..., 4.9X, 5X] as an example, during the sending of the zoom values from 1.1X to 3X the ordinary camera is still running while the telephoto camera is still starting up and has not yet begun running; in this stage the image collected by the ordinary camera can be center-cropped, and the image displayed on the display screen (such as the preview image shown in FIG. 4B) still comes from the ordinary camera.
During the sending of the subsequent zoom values (from 3.1X onward), the ordinary camera is still running and the telephoto camera starts to run, that is, the telephoto camera also begins to collect images. Based on the images collected by the ordinary camera and the telephoto camera, the hardware abstraction layer starts to execute the FOV center offset algorithm, gradually moving the FOV center of the image captured by the ordinary camera toward the FOV center of the telephoto camera; at this time, the image displayed on the display screen (such as the preview image) still comes from the ordinary camera.
When the camera application sends the 5X zoom value to the hardware abstraction layer, the ordinary camera stops running (that is, shuts down) while the telephoto camera keeps running, and the image displayed on the display screen (such as the preview image shown in FIG. 4D) comes from the telephoto camera. At this point, the camera application has sent all the zoom values corresponding to the zoom ratios in the zoom sequence. Through the above process, the electronic device 100 can smoothly transition the FOV center of the image displayed on the display screen from the FOV center of the ordinary camera to the FOV center of the telephoto camera, achieving smooth zooming.
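The staged handoff described above — center-cropping while the telephoto camera starts, FOV center offset once both cameras run, then switching the preview source at the target ratio — can be sketched as a small dispatcher. The 3.1X threshold is taken from the example in this description; a real implementation would key off an actual camera-ready callback rather than a fixed zoom value:

```python
def preview_stage(zoom, switch_start=3.1, target_zoom=5.0):
    """Which camera feeds the preview, and what processing applies,
    at a given step of the illustrative 1X -> 5X tap-to-zoom flow."""
    if zoom < switch_start:
        # telephoto still starting up: center-crop the ordinary camera image
        return ("ordinary", "center_crop")
    if zoom < target_zoom:
        # both cameras running: shift the crop center frame by frame
        return ("ordinary", "fov_center_offset")
    # final step: ordinary camera shuts down, preview comes from telephoto
    return ("telephoto", "none")
```

At 1.5X the preview is a center crop of the ordinary camera; at 3.5X the crop center is being offset toward the telephoto FOV center; at 5X the telephoto camera takes over.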
Now take the case where the current zoom point is the 1X zoom point and the target zoom point is the 0.4X zoom point, that is, the user taps to zoom from 1X to 0.4X, switching from the ordinary camera to the wide-angle camera. The camera application sends the zoom value corresponding to the 0.4X zoom point to the hardware abstraction layer. After receiving it, the hardware abstraction layer can confirm, based on the zoom value, that the corresponding camera ID (identifier) is the wide-angle camera, and begins starting the wide-angle camera; at this time the image displayed on the display screen still comes from the ordinary camera. While sending the zoom value corresponding to the 0.4X zoom point, the camera application also sends the zoom value corresponding to each zoom ratio in the zoom sequence to the hardware abstraction layer in turn. Taking the zoom sequence [0.9X, 0.8X, ..., 0.5X, 0.4X] as an example, when the 0.9X zoom value is sent, the ordinary camera is still running while the wide-angle camera is still starting up and has not yet begun running; in this stage the image collected by the ordinary camera can be center-cropped, and the image displayed on the display screen (such as the preview image shown in FIG. 5B) still comes from the ordinary camera.
During the sending of the subsequent zoom values, the ordinary camera is still running and the wide-angle camera starts to run, that is, the wide-angle camera also begins to collect images. Based on the images collected by the ordinary camera and the wide-angle camera, the hardware abstraction layer starts to execute the FOV center offset algorithm, gradually moving the FOV center of the image captured by the ordinary camera toward the FOV center of the wide-angle camera; at this time, the image displayed on the display screen still comes from the ordinary camera.
When the camera application sends the 0.5X zoom value to the hardware abstraction layer, the ordinary camera stops running (that is, shuts down) while the wide-angle camera keeps running, and the image displayed on the display screen (such as the preview image shown in FIG. 5D) comes from the wide-angle camera. After the camera application has sent the zoom values corresponding to all the zoom ratios in the zoom sequence, the electronic device 100 has smoothly transitioned the FOV center of the image displayed on the display screen from the FOV center of the ordinary camera to the FOV center of the wide-angle camera, achieving smooth zooming.
FIG. 8 exemplarily shows the specific execution process of the FOV center offset algorithm, where the target camera is the camera that the current camera is to be switched to. The process includes the following steps:
The hardware abstraction layer of the electronic device 100 adjusts the cropping area of the current camera (including the position and size of the cropping area) based on the feature point coordinates of the target camera (such as the FOV center point coordinates of the target camera), and calculates the feature point coordinates of the current camera (such as the FOV center point coordinates of the current camera's cropping area). For example, when switching to the telephoto camera, the hardware abstraction layer of the electronic device 100 can, based on the feature point coordinates of the telephoto camera (such as the FOV center point coordinates O4 of the telephoto camera shown in FIG. 4G), adjust the cropping area of the ordinary camera (for example, from the area where image 2 is located in FIG. 4F to the area where image 3 is located in FIG. 4G), and calculate the feature point coordinates of the ordinary camera based on that cropping area (for example, the FOV center point coordinates O3 of the ordinary camera shown in FIG. 4G). Similarly, when switching to the wide-angle camera, the hardware abstraction layer of the electronic device 100 can, based on the feature point coordinates of the wide-angle camera (for example, the FOV center point coordinates O4' of the wide-angle camera shown in FIG. 5G), adjust the cropping area of the ordinary camera (for example, from the area where image 2' is located in FIG. 5F to the area where image 3' is located in FIG. 5G), and calculate the feature point coordinates of the ordinary camera based on that cropping area (for example, the FOV center point coordinates O3' of the ordinary camera shown in FIG. 5G).
The hardware abstraction layer of the electronic device 100 then judges whether the feature point coordinates of the target camera coincide with the feature point coordinates of the current camera, as calculated from the current camera's cropping area. If so, it ends the execution of the FOV center offset algorithm; if not, it continues to execute step S801 and step S802 in sequence until the feature point coordinates of the target camera coincide with those of the current camera. In other words, the cropping area of the current camera can be adjusted multiple times and the feature point coordinates of the current camera calculated multiple times, so that the feature point coordinates of the current camera gradually approach and finally coincide with those of the target camera. In this way, the FOV center of the image displayed on the display screen can transition smoothly from the FOV center of the current camera to the FOV center of the target camera (for example, from the ordinary camera to the wide-angle camera).
For example, when switching from the ordinary camera to the telephoto camera, the hardware abstraction layer of the electronic device 100 can sequentially adjust the cropping area of the current camera at the 3.1X, 3.2X, 3.3X, ..., 4.9X, and 5X zoom magnifications and calculate the feature point coordinates of the current camera in turn, so that the FOV center point coordinates of the ordinary camera gradually move toward the FOV center point coordinates of the telephoto camera. In this way, the FOV center of the preview image does not jump during zooming. Similarly, when switching from the ordinary camera to the wide-angle camera, the hardware abstraction layer of the electronic device 100 can adjust the cropping area of the current camera in turn at the 0.8X, 0.7X, 0.6X, 0.5X, and 0.4X zoom magnifications and calculate the feature point coordinates of the current camera in turn, so that the FOV center point coordinates of the ordinary camera gradually move toward those of the wide-angle camera, and the FOV center of the preview image does not jump during zooming.
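One way to realize the gradual approach required by steps S801 and S802 is to interpolate the crop-window center between the two cameras' FOV centers over the remaining preview frames. Linear interpolation is an illustrative assumption here; the embodiment only requires that the centers approach and finally coincide:

```python
def fov_center_offset(current_center, target_center, steps):
    """Move the crop-window center of the current camera toward the FOV
    center of the target camera over `steps` preview frames, so each
    frame's crop center shifts only slightly and never jumps."""
    cx, cy = current_center
    tx, ty = target_center
    centers = []
    for n in range(1, steps + 1):
        f = n / steps  # fraction of the way to the target camera's center
        centers.append((cx + (tx - cx) * f, cy + (ty - cy) * f))
    # the last element coincides with target_center, ending the loop (S802)
    return centers
```

For example, moving from a center of (960, 540) to (1000, 520) over 4 frames yields intermediate centers that step evenly toward, and terminate exactly at, the target camera's FOV center.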
It should be noted that FIG. 8 illustrates the specific execution process of the FOV center offset algorithm only by taking the switch from the ordinary camera to the telephoto camera and the switch from the ordinary camera to the wide-angle camera as examples; switching between other cameras is not limited in this embodiment of the present application.
In the above embodiment, the electronic device 100 turns on only one camera by default and turns on another camera only in the tap-to-zoom scene, thereby realizing a smooth transition of the FOV center. In some other embodiments, the electronic device 100 may instead turn on, by default, both the current camera and the camera whose focal length is adjacent to it (which may also be described as having an adjacent optical zoom ratio). In zooming scenarios such as tap-to-zoom, only the camera corresponding to the target zoom point and the camera with an adjacent focal length need to be turned on; this also achieves a smooth transition of the FOV center while saving the time of turning on cameras during the zooming process.
FIGS. 9A-9D exemplarily illustrate this camera switching process; in these figures, the camera indicated by the circle is the camera from which the preview image displayed on the display screen of the electronic device 100 comes.
Assume that the optical zoom magnification of the wide-angle camera is 0.4X, that of the ordinary camera is 1X, that of telephoto camera 1 is 5X, and that of telephoto camera 2 is 6X. The optical zoom ratios of these four cameras thus increase in sequence: the optical zoom ratio of the wide-angle camera < that of the ordinary camera < that of telephoto camera 1 < that of telephoto camera 2. It is then easy to see that, in terms of focal length, the wide-angle camera is adjacent only to the ordinary camera; the ordinary camera is adjacent to the wide-angle camera and telephoto camera 1; telephoto camera 1 is adjacent to the ordinary camera and telephoto camera 2; and telephoto camera 2 is adjacent only to telephoto camera 1.
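The adjacency relation can be derived directly from the cameras ordered by optical zoom ratio. A minimal sketch, using the camera names and ratios from the example above (the names are illustrative labels, not identifiers from the embodiment):

```python
CAMERAS = {"wide": 0.4, "ordinary": 1.0, "tele1": 5.0, "tele2": 6.0}

def adjacent_cameras(name, cameras=CAMERAS):
    """Camera(s) whose optical zoom ratio is adjacent to `name` when the
    cameras are ordered by increasing optical zoom ratio."""
    ordered = sorted(cameras, key=cameras.get)
    i = ordered.index(name)
    # neighbors immediately before and after in the ordered list
    return ordered[max(i - 1, 0):i] + ordered[i + 1:i + 2]
```

This reproduces the relation stated above: the wide-angle camera is adjacent only to the ordinary camera, while the ordinary camera is adjacent to both the wide-angle camera and telephoto camera 1.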
In the case where the electronic device 100 turns on the wide-angle camera by default, as shown in FIG. 9A, after the electronic device 100 detects that the user opens the camera application, in response to the operation the electronic device 100 opens the camera application, and the preview image comes from the wide-angle camera. At the same time, since the wide-angle camera is adjacent to the ordinary camera, the electronic device 100 begins starting the ordinary camera so that it enters the running state. With the ordinary camera already running, if the electronic device 100 detects that the user switches from the wide-angle camera to the ordinary camera, in response to the operation the electronic device 100 can directly execute the FOV center offset algorithm based on the images collected by the ordinary camera and the wide-angle camera, so that the FOV center of the wide-angle camera transitions smoothly to the FOV center of the ordinary camera, avoiding FOV center jumps and realizing smooth zooming.

After the switch, the preview image comes from the image collected by the ordinary camera. Since the ordinary camera is adjacent to both the wide-angle camera and telephoto camera 1, the electronic device 100 keeps the wide-angle camera running and at the same time begins starting telephoto camera 1, so that telephoto camera 1 enters the running state. If the electronic device 100 then detects that the user switches from the ordinary camera to telephoto camera 1, in response to the operation the electronic device 100 can directly execute the FOV center offset algorithm based on the images collected by the ordinary camera and telephoto camera 1, so that the FOV center of the ordinary camera transitions smoothly to the FOV center of telephoto camera 1, avoiding FOV center jumps and realizing smooth zooming.

After this switch, the preview image comes from the image collected by telephoto camera 1. Since telephoto camera 1 is adjacent to the ordinary camera and telephoto camera 2, the electronic device 100 begins turning off the wide-angle camera, keeps the ordinary camera running, and at the same time begins starting telephoto camera 2, so that telephoto camera 2 enters the running state. If the electronic device 100 then detects that the user switches from telephoto camera 1 to telephoto camera 2, in response to this operation the electronic device 100 can directly execute the FOV center offset algorithm based on the images collected by telephoto camera 1 and telephoto camera 2, so that the FOV center of telephoto camera 1 transitions smoothly to the FOV center of telephoto camera 2, avoiding FOV center jumps and realizing smooth zooming.

Finally, the preview image comes from the image collected by telephoto camera 2, and the electronic device 100 begins turning off the ordinary camera while still keeping telephoto camera 1 running.
In this way, upon detecting a scene where the user needs to zoom (such as tap-to-zoom, pinch-to-zoom, dragging the zoom bar, and other scenarios), the embodiment of the present application can directly execute the FOV center offset algorithm without first waiting for the camera corresponding to the target zoom point to start running. This achieves smooth zooming while saving the time of turning on the camera during the zooming process.
FIGS. 9A-9D are described by taking the case where the wide-angle camera is enabled by default on the electronic device 100 as an example, but this is not limiting: the camera enabled by default on the electronic device 100 may also be another camera. For example, the electronic device 100 can also enable the ordinary camera by default. In addition, the opening and closing of the cameras can also proceed in reverse, for example switching from the state of FIG. 9B to that of FIG. 9A, or from FIG. 9B to FIG. 9C.
It should be noted that this embodiment of the present application is also applicable to a scene where the user taps to zoom directly from the current camera to a camera that is not adjacent to it. For example, if the electronic device 100 detects that the user taps to zoom directly from the wide-angle camera to telephoto camera 2, in response to this operation the electronic device 100 can open and close the cameras in the order corresponding to FIG. 9A, FIG. 9B, FIG. 9C, and FIG. 9D, and execute the FOV center offset algorithm of the smooth zooming process at each step in sequence. For a zoom in the opposite direction, the cameras can be opened and closed in the order corresponding to FIG. 9D, FIG. 9C, FIG. 9B, and FIG. 9A, likewise executing the FOV center offset algorithm in sequence.
The camera is used to collect images. After the reflected light of the scene passes through the lens and is refracted, it converges on the image sensor. The image sensor converts the optical image into an analog electrical signal, which then passes through the analog-to-digital converter to output the raw digital image captured by the camera.
The hardware abstraction layer is used to receive the zoom value corresponding to the zoom ratio reported by the camera application, start the target camera corresponding to that zoom ratio (such as a telephoto camera or a wide-angle camera), and start and execute the FOV center offset algorithm. Specifically, after receiving the zoom value reported by the camera application, the hardware abstraction layer can determine and start the target camera corresponding to that zoom value; then, based on the images collected by the current camera (such as the ordinary camera) and the images collected by the target camera (such as a telephoto camera or a wide-angle camera), it executes the FOV center offset algorithm to obtain cropping parameters and sends them to the cropping module of the ISP. The cropping parameters may include the position (such as the FOV center point coordinates) and the size (width and height) of the cropping area. The ISP is used to convert the data from the camera into an image in a standard format, such as YUV or RGB. Specifically, the ISP can use the cropping module to crop the image captured by the camera based on the received cropping parameters, and then perform post-processing on the cropped image, such as black level correction, lens shading correction, dead pixel compensation, and color interpolation, before sending the YUV/RGB image to the processor through the I/O control interface for further processing.
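The cropping parameters (FOV center point coordinates plus the width and height of the cropping area) map to a pixel rectangle in the sensor frame. A hedged sketch of the rectangle the ISP cropping module would apply; clamping the window to the frame edges is an assumption, not something the embodiment spells out:

```python
def crop_region(frame_w, frame_h, center, size):
    """Convert cropping parameters (FOV center point, width/height of the
    cropping area) into a (left, top, right, bottom) pixel rectangle,
    clamped so the window stays inside the sensor frame."""
    cx, cy = center
    w, h = size
    left = min(max(int(cx - w / 2), 0), frame_w - w)
    top = min(max(int(cy - h / 2), 0), frame_h - h)
    return (left, top, left + w, top + h)
```

A crop centered on the frame center is a center crop; shifting the center point toward the target camera's FOV center produces the eccentric crop used by the FOV center offset algorithm.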
  • Processors include application processors, baseband processors, multimedia processors, etc., which can run various image processing algorithms and control peripheral devices.
  • images in formats such as YUV and RGB obtained by the ISP can be directly sent to the display for display.
  • the photo or video can be saved to the memory.
  • the display screen can monitor user operations for adjusting preview images in various areas of the display screen through the UI, and report the monitored user operations to the hardware abstraction layer.
  • the user operation may include but not limited to the touch operation on the target object detected by the electronic device 100 in the preview frame mentioned in the above UI embodiment.
For example, the display screen displays a preview interface, the preview frame in the preview interface is used to display images captured by the ordinary camera, and the zoom ratio displayed in the preview interface is 1X.
When the display screen detects the user's operation of increasing the zoom ratio to 5X (for example, the user taps the 5X zoom point), it sends the zoom value corresponding to the zoom ratio selected by the user to the hardware abstraction layer. The hardware abstraction layer can determine that the camera corresponding to the 5X zoom point is the telephoto camera and then start the telephoto camera. During the startup of the telephoto camera, that is, before it begins running, the ISP can center-crop the image collected by the ordinary camera based on the cropping parameters calculated by the hardware abstraction layer. After the telephoto camera begins running, the hardware abstraction layer can execute the FOV center offset algorithm based on the image captured by the ordinary camera and the image captured by the telephoto camera to obtain the cropping parameters, and send them to the cropping module of the ISP, which then performs eccentric cropping on the image captured by the ordinary camera based on those parameters. The ISP can then perform further post-processing on the cropped image and send it to the processor to generate the image to be displayed. Finally, the processor can send the image to be displayed to the display, instructing the display screen to show it in the preview frame.
In some embodiments, the telephoto camera may already have been opened and be running in advance. In that case, when the display screen detects that the user increases the zoom ratio to 5X (for example, taps the 5X zoom point) and sends the corresponding zoom value to the hardware abstraction layer, the hardware abstraction layer can judge that the camera corresponding to the 5X zoom point is the telephoto camera and, since the telephoto camera is already running, directly execute the FOV center offset algorithm based on the images captured by the ordinary camera and the telephoto camera to obtain the cropping parameters, sending them to the cropping module of the ISP. The cropping module of the ISP then performs eccentric cropping on the image captured by the ordinary camera based on those parameters; the ISP performs further post-processing on the cropped image and sends it to the processor to generate the image to be displayed, which the processor sends to the display for display in the preview box. In this way, the electronic device 100 can crop the image captured by the ordinary camera through the cooperation of the hardware abstraction layer and the ISP, achieving the goal of gradually reducing the preview angle of view and gradually changing the FOV center of the displayed image.
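The "gradually reducing the preview angle of view" part follows the usual digital-zoom relation between the requested zoom ratio and the crop size; this formula is a standard assumption, not one quoted from the embodiment:

```python
def crop_size_for_zoom(sensor_w, sensor_h, zoom, optical_zoom=1.0):
    """Digital zoom on one camera is a crop: the crop window shrinks in
    proportion to the requested zoom relative to the camera's own
    optical zoom ratio."""
    factor = zoom / optical_zoom
    return (int(sensor_w / factor), int(sensor_h / factor))
```

For example, on a 1X ordinary camera, a 2X preview crops the central half of the sensor in each dimension, and at 5X (the telephoto handoff point) the crop has shrunk to a fifth of the sensor in each dimension.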
In another example, the display screen displays a preview interface, the preview frame in the preview interface is used to display images captured by the ordinary camera, and the zoom ratio displayed in the preview interface is 1X.
When the display screen detects that the user reduces the zoom ratio to 0.4X (for example, the user taps the 0.4X zoom point), it sends the zoom value corresponding to the zoom ratio selected by the user to the hardware abstraction layer. The hardware abstraction layer can determine that the camera corresponding to the 0.4X zoom point is the wide-angle camera and then start the wide-angle camera. During the startup of the wide-angle camera, that is, before it begins running, the ISP can center-crop the image collected by the ordinary camera based on the cropping parameters calculated by the hardware abstraction layer. After the wide-angle camera begins running, the hardware abstraction layer can execute the FOV center offset algorithm based on the image collected by the ordinary camera and the image collected by the wide-angle camera to obtain the cropping parameters, and send them to the cropping module of the ISP, which then performs eccentric cropping on the image captured by the ordinary camera. The ISP can then perform further post-processing on the cropped image and send it to the processor to generate the image to be displayed, and the processor can send that image to the display so that it is shown in the preview frame.
In some embodiments, the wide-angle camera may already have been opened and be running in advance. In that case, when the display screen detects that the user reduces the zoom ratio to 0.4X (for example, taps the 0.4X zoom point) and sends the corresponding zoom value to the hardware abstraction layer, the hardware abstraction layer can judge that the camera corresponding to the 0.4X zoom point is the wide-angle camera and directly execute the FOV center offset algorithm based on the images captured by the ordinary camera and the wide-angle camera to obtain the cropping parameters, sending them to the cropping module of the ISP, which performs eccentric cropping on the image captured by the ordinary camera based on those parameters. The ISP performs further post-processing on the cropped image and sends it to the processor to generate the image to be displayed, which the processor sends to the display for display in the preview box. In this way, the electronic device 100 can crop the image captured by the ordinary camera through the cooperation of the hardware abstraction layer and the ISP, achieving the goal of gradually enlarging the preview angle of view and gradually changing the FOV center of the displayed image.
In some embodiments, the display screen can also detect the user's photographing operation, and in response the electronic device 100 can save the image displayed in the preview frame as a photo. For example, the display screen can detect the user's touch operation on the shooting control, and the electronic device 100 can save the image in the preview frame that the ISP sent to the display screen at the moment the touch operation was detected. That is, the ISP can further encode and compress the obtained YUV-format data into a photo in JPEG format, and the processor then saves the photo in the memory. In some embodiments, the electronic device 100 may also save the images in the preview frame as a video file. For example, the display screen can detect two touch operations by the user on the video control; in response to these two touch operations, the electronic device 100 can save the image frames output during the period between them, that is, the video generated in the video recording mode, and the processor then saves the generated video in the memory.
  • the shooting method can be applied to an electronic device 100 including a display screen and multiple cameras, and the multiple cameras can include a first camera and a second camera.
  • the specific steps for performing the shooting method are as follows:
  • the electronic device 100 displays a first preview image at a first zoom ratio on a display screen, where the first preview image is captured by a first camera.
  • the electronic device 100 may detect the user's operation of opening the "Camera” application program, for example, the operation of clicking the camera icon 215D on the main interface shown in FIG. 3A . In response to this operation, the electronic device 100 may open the "Camera” application program, and display a first preview image at a first zoom ratio on the display screen, where the first preview image is captured by the first camera.
The first camera may be an ordinary camera; the first zoom ratio may be the optical zoom ratio of the ordinary camera or a certain zoom ratio within the range of the ordinary camera's digital zoom; and the first preview image may be the image displayed in the preview box as shown in FIG. 4A or FIG. 5A.
The electronic device 100 then detects the user's click operation on the second zoom factor option. The second zoom factor option may also be referred to as the target zoom point, for example a 5X target zoom point. The click operation may include first touching the second zoom factor option with a finger or a handwriting device and then leaving the second zoom factor option.
  • the electronic device 100 generates and displays at least one second preview image on the display screen based on at least one third zoom ratio, where the at least one third zoom ratio is between the first zoom ratio and the second zoom ratio.
  • the third zoom ratio may be one or more zoom ratios in the zoom sequence mentioned in the foregoing embodiments, and the second preview image may be a preview image at the third zoom ratio during smooth zooming.
  • the second zoom ratio may be the optical zoom ratio of the second camera.
  • the second camera can be a telephoto camera or a wide-angle camera.
  • the second zoom ratio may be the optical zoom ratio of the telephoto camera
  • the third zoom ratio may be 3X
  • the corresponding second preview image may be the image in the preview box shown in FIG. 4B.
  • the third zoom ratio may also be 3.1X
  • the corresponding second preview image may be the image in the preview box shown in FIG. 4C.
  • the second zoom ratio can be the optical zoom ratio of the wide-angle camera
  • the third zoom ratio can be 0.9X
  • the corresponding second preview image may be the image in the preview box shown in FIG. 5B.
  • the third zoom factor may also be 0.8X
  • the corresponding second preview image may be the image in the preview box shown in FIG. 5C.
  • the electronic device 100 displays a third preview image under the second zoom ratio on the display screen, and the third preview image is captured by the second camera.
  • the second camera may be a telephoto camera
  • the third preview image may be the image in the preview box shown in FIG. 4D.
  • the second camera may be a wide-angle camera
  • the third preview image may be the image in the preview box shown in FIG. 5D.

Abstract

An embodiment of this application provides a shooting method and an electronic device. By default, the electronic device may enable only one camera (the current camera). Upon detecting a user's tap-to-zoom operation, the electronic device may enable another camera (the target camera). Based on the images captured by these two cameras, the electronic device can execute an FOV center offset algorithm so that the FOV center of the image captured by the current camera gradually approaches the FOV center of the image captured by the target camera, achieving a smooth transition of the FOV center and avoiding FOV center jumps. At the same time, because the other camera is enabled only in the tap-to-zoom scenario, the power consumption of the electronic device is reduced.

Description

拍摄方法及电子设备
本申请要求于2021年05月31日提交中国国家知识产权局、申请号为202110612148.0、申请名称为“拍摄方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及一种拍摄方法及电子设备。
背景技术
随着电子技术的发展,手机、平板电脑等一些电子设备被配置了越来越多的摄像头,可以实现更大倍率范围(例如0.5倍倍率至30倍倍率)的变焦拍摄,满足了用户的多种拍摄需求。但是,目前在变焦过程中会出现视场(Field of View,FOV)的中心跳变的问题,用户体验差。此外,在不需要变焦拍摄的场景中,一些电子设备也会把全部摄像头都开启,导致电子设备功耗较大。
发明内容
本申请实施例提供了一种拍摄方法及电子设备,可以避免在变焦拍摄过程中出现FOV中心跳变,实现FOV平滑过渡,提高用户体验,而且,可以按需开启电子设备上的对应摄像头,节省电子设备的功耗。
第一方面,本申请实施例提供了一种拍摄方法,应用于包括显示屏、多个摄像头的电子设备,多个摄像头包括第一摄像头和第二摄像头,该方法包括:电子设备在显示屏上显示第一变焦倍率下的第一预览图像,第一预览图像是第一摄像头采集的;电子设备检测到用户针对第二变焦倍率选项的点击操作;电子设备基于至少一个第三变焦倍率生成并在显示屏上显示至少一个第二预览图像,至少一个第三变焦倍率在第一变焦倍率与第二变焦倍率之间;电子设备在显示屏上显示第二变焦倍率下的第三预览图像,第三预览图像是第二摄像头采集的。
本申请实施例通过提供该拍摄方法,电子设备可以基于上述两个摄像头采集的图像,通过执行FOV中心偏移算法,使得当前摄像头采集图像的FOV中心逐渐靠近目标摄像头采集图像的FOV中心,实现FOV中心的平滑过渡,避免出现FOV中心跳变。
在一种可能的实现方式中,点击操作包括:用户手指或手写装置先接触第二变焦倍率选项的操作,再离开第二变焦倍率选项的操作。
如此,可以将本申请实施例中所说的点击操作与仅仅为用户手指或手写装置先接触第二变焦倍率选项的操作进行区别。
在一种可能的实现方式中,电子设备基于至少一个第三变焦倍率生成并在显示屏上显示至少一个第二预览图像,具体包括:电子设备在检测到用户手指或手写装置接触第二变焦倍率选项的操作时,开始基于至少一个第三变焦倍率生成并在显示屏上显示至少一个第二预览图像。
如此,可以给电子设备预留更多的内部处理时间,从而可以在用户执行操作之后更快地响应用户需求。
在一种可能的实现方式中,第二预览图像的个数N确定方法如下:
N=T/t
其中,T是电子设备检测到用户针对第二变焦倍率选项的点击操作到电子设备在显示屏上显示第二变焦倍率下的第三预览图像之间的时间间隔,t是当前相机系统的相邻两个预览帧之间的时间间隔。
如此,可以计算出需要在显示屏上显示的图像帧数。
在一种可能的实现方式中,第三变焦倍率对应的变焦值zoomValue确定方法如下:
（zoomValue的计算公式见原公开文本插图PCTCN2022083426-appb-000001）
其中,S是第一变焦倍率选项与第二变焦倍率选项之间在所述显示屏上的距离,n是小于第二预览图像的个数N的正整数。
如此,可以计算出第三变焦倍率对应的变焦值。
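上述N与变焦值序列的计算可以用如下示意代码表达。需要说明的是，原文中zoomValue的具体公式以插图为准，这里假设变焦值在第一变焦倍率与第二变焦倍率之间按帧序号线性插值，仅为示意：

```python
def zoom_values(zoom_start, zoom_end, T, t):
    """计算平滑变焦序列：N = T / t 为需要显示的第二预览图像个数，
    第 n 个变焦值在 zoom_start 与 zoom_end 之间取值（线性插值仅为示意假设）。"""
    N = int(T / t)                         # 需要插入的预览帧总数
    step = (zoom_end - zoom_start) / N     # 相邻两帧之间的变焦增量（示意）
    return [round(zoom_start + step * n, 2) for n in range(1, N + 1)]
```

例如，从1X点选变焦到5X，取T=4000毫秒、t=100毫秒时，N=40，得到的序列为[1.1, 1.2, …, 5.0]。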
在一种可能的实现方式中,第一预览图像的FOV中心与第一摄像头的FOV中心一致,第三预览图像的FOV中心与第二摄像头的FOV中心一致。
如此,可以保证摄像头采集的图像FOV中心与预览图像的FOV中心重合。
在一种可能的实现方式中,在电子设备检测到用户针对第二变焦倍率选项的点击操作之前,该方法还包括:电子设备给第二摄像头上电,启动并运行第二摄像头。
如此，可以使电子设备同时运行多个摄像头，在需要变焦的场景下，节省开启摄像头的时间。
在一种可能的实现方式中,至少一个第二预览图像的FOV中心逐渐靠近第二摄像头的FOV中心。
如此,可以实现平滑变焦,避免FOV中心跳变。
在一种可能的实现方式中,在电子设备检测到用户针对第二变焦倍率选项的点击操作之后,该方法还包括:电子设备给第二摄像头上电,启动并运行第二摄像头。
如此,可以按需开启摄像头,节省电子设备的功耗。
在一种可能的实现方式中,在电子设备运行第二摄像头之前,至少一个第二预览图像的FOV中心与第一摄像头的FOV中心一致;在电子设备运行第二摄像头之后,至少一个第二预览图像的FOV中心开始逐渐靠近第二摄像头的FOV中心。
如此,在节省电子设备的功耗的情况下,还可以实现平滑变焦,避免FOV中心跳变。
第二方面,本申请实施例提供了一种电子设备,包括显示屏,不同焦距的多个摄像头,存储器以及耦合于存储器的处理器,多个应用程序,以及一个或多个程序;其中,多个摄像头的光学中心不重合,多个摄像头包括第一摄像头和第二摄像头,第一摄像头与第二摄像头是多个摄像头中焦距相邻的两个摄像头,处理器在运行一个或多个程序时,使得电子设备执行如上述任一方面任一项可能的实现方式的方法。
第三方面,本申请实施例提供了一种计算机存储介质,计算机存储介质存储有计算机程序,计算机程序包括程序指令,当程序指令在电子设备上运行时,使得电子设备执行如上述任一方面任一项可能的实现方式的方法。
第四方面,本申请实施例提供了一种计算机程序产品,当计算机程序产品在计算机上运行时,使得计算机执行上述任一方面任一项可能的实现方式中的方法。
附图说明
图1是本申请实施例提供的一种电子设备的结构示意图;
图2A-图2B是本申请实施例提供的一种电子设备的外形结构示意图;
图3A-图3B是本申请实施例提供的一种典型的拍摄场景的用户界面示意图;
图4A-图4H是本申请实施例提供的预览场景下进行变焦增大的用户界面示意图;
图5A-图5H是本申请实施例提供的预览场景下进行变焦减小的用户界面示意图;
图6A-图6H是本申请实施例提供的录像场景下进行变焦的用户界面示意图;
图7是本申请实施例提供的点选变焦场景下实现FOV中心平滑过渡的方法流程示意图;
图8是本申请实施例提供的FOV中心偏移算法的具体执行过程示意图;
图9A-图9D是本申请实施例提供的多个摄像头开启及关闭的状态示意图;
图10是本申请实施例提供的变焦增大时一种电子设备的部分软硬件的协作示意图;
图11是本申请实施例提供的变焦减小时一种电子设备的部分软硬件的协作示意图;
图12是本申请实施例提供的一种拍摄方法的流程示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;文本中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
应当理解,本申请的说明书和权利要求书及附图中的术语“第一”、“第二”等是用于区别不同对象,而不是用于描述特定顺序。此外,术语“包括”和“具有”以及它们任何变形,意图在于覆盖不排他的包含。例如包含了一系列步骤或单元的过程、方法、系统、产品或设备没有限定于已列出的步骤或单元,而是可选地还包括没有列出的步骤或单元,或可选地还包括对于这些过程、方法、产品或设备固有的其它步骤或单元。
在本申请中提及“实施例”意味着,结合实施例描述的特定特征、结构或特性可以包含在本申请的至少一个实施例中。在说明书中的各个位置出现该短语并不一定均是指相同的实施例,也不是与其它实施例互斥的独立的或备选的实施例。本领域技术人员显式地和隐式地理解的是,本申请所描述的实施例可以与其它实施例相结合。
本申请实施例提供了一种拍摄方法,可以避免在变焦拍摄过程中出现FOV中心跳变,实现FOV平滑过渡,提高用户体验。而且,可以按需开启电子设备上的对应摄像头,节省电子设备的功耗。
本申请实施例提供的拍摄方法可以应用于具有不同焦距的多个摄像头的电子设备。这多个摄像头可为普通摄像头、长焦摄像头、广角摄像头等。由于这多个摄像头在电子设备上的位置不一样,因此这多个摄像头的光学中心不是重合的,从而导致这多个摄像头的FOV中心不一致。假设普通摄像头所在的光学变焦倍率为1倍倍率(表示为1X),长焦摄像头所在的光学变焦倍率为5倍倍率,表示为5X(后续X表示变焦倍率),广角摄像头所在的光学变焦倍率为0.4X。一般,由于这多个摄像头的光学中心不重合,因此涉及摄像头切换的光学变焦 过程会出现FOV中心跳变,例如在普通摄像头切换到长焦摄像头的过程中,所拍摄的中心点会落在明显不同的位置上,影响用户对焦。目前,在不需要变焦拍摄的场景中,一些电子设备也会把全部摄像头都开启,导致电子设备功耗较大。
本申请实施例提供的拍摄方法能够解决在光学变焦过程中出现的这种FOV中心跳变问题,而且,可以在不需要变焦的场景下,只开启一个默认的摄像头,在需要变焦的场景下再开启电子设备上的其它对应摄像头,节省电子设备的功耗。电子设备可以默认只开启一个较大FOV的摄像头(例如普通摄像头),在检测到用户增大变焦倍率的操作时,电子设备可以再开启一个较小FOV的摄像头(例如长焦摄像头),当较小FOV的摄像头运行后,电子设备可以利用FOV中心偏移算法对具有较大FOV的摄像头所拍摄的图像进行裁切,使得裁切得到的图像的FOV中心逐渐靠近具有较小FOV的摄像头的FOV中心,实现FOV的平滑过渡,避免出现FOV中心跳变。这里,具有较大FOV的摄像头、具有较小FOV的摄像头是指一次摄像头切换涉及的两个摄像头,例如从普通摄像头切换到长焦摄像头。利用FOV中心偏移算法对具有较大FOV的摄像头所拍摄的图像进行偏心裁切,主要是指:在变焦过程中,随着变焦倍率不断接近摄像头切换的变焦倍率,对具有较大FOV的摄像头所拍摄的图像先后进行多次裁切,以实现每一次裁切得到的图像(例如在3.1X、3.2X、3.3X、…4.9X下裁切得到的图像)的FOV中心逐渐靠近具有较小FOV的摄像头的FOV中心,甚至在4.9X下裁切得到的图像可以重合于长焦摄像头的FOV中心,从而避免出现FOV中心跳变。后面内容中会详细说明基于FOV中心偏移算法对具有较大FOV的摄像头所拍摄的图像进行偏心裁切的实现,在此先不展开。
这样,当用户增大变焦倍率(如从1X增大至5X)以达到详细观察远处景物的目的时,本申请实施例提供的拍摄方法便于用户在变焦过程中“锁定”某一目标景物,避免出现因摄像头切换而“跟丢”目标景物的情况,同时也避免了用户进行反复取景对焦,提高了变焦拍摄的效率和便捷性。同时,在需要变焦的场景下再开启电子设备上的其它对应摄像头,节省了电子设备的功耗。
这里先仅以增大变焦倍率示例,本申请实施例提供的拍摄方法也适用减小变焦倍率(例如普通摄像头切换到广角摄像头)的使用场景。
上述电子设备可以是手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personaldigital assistant,PDA)或专门的照相机(例如单反相机、卡片式相机)等,本申请对上述电子设备的具体类型不作任何限制。
在本申请实施例中,在增大变焦倍率时,摄像头切换的变焦倍率为4.9X-5X,该摄像头切换具体是指从普通摄像头到长焦摄像头的切换,也即是说,从普通摄像头切换到长焦摄像头时的变焦倍率切换点为5X。
图1示例性示出了本申请实施例提供的一种电子设备100的结构。
如图1所示,电子设备100可具有多个摄像头193,例如普通摄像头、广角摄像头、超广角摄像头、长焦摄像头等。摄像头193的焦距越小,其视角越大,取景范围就越大,可以拍摄到更多的景物。反之,摄像头193的焦距越大,其视角越小,取景范围就越小,可以拍摄到更少但更远的景物。例如,超广角摄像头的焦距一般约为12毫米(millimeter,mm)-24mm,超广角摄像头的视角一般为84°-120°;广角摄像头的焦距一般约为24mm-35mm,广角摄像头 的视角一般为63°-84°;普通摄像头的焦距一般在50mm左右,普通摄像头的视角一般为46°左右;长焦摄像头的焦距一般约为135mm-500mm,长焦摄像头的视角一般为5°-18°;超长焦摄像头的焦距一般超过500mm,超长焦摄像头的视角一般为0°-5°。这几种摄像头在视角方面的表现为:超广角摄像头优于广角摄像头,广角摄像头优于普通摄像头,普通摄像头优于长焦摄像头,长焦摄像头优于超长焦摄像头。
由于这多个摄像头在电子设备上的位置不一样,因此,这多个摄像头的光学中心不是重合的,从而导致这多个摄像头的FOV中心不一致。假设普通摄像头所在的光学变焦倍率为1X,长焦摄像头所在的光学变焦倍率为5X,广角摄像头所在的光学变焦倍率为0.4X。一般,由于这多个摄像头的光学中心不重合,因此涉及摄像头切换的光学变焦过程会出现FOV中心跳变,例如在普通摄像头切换到长焦摄像头的过程中,所拍摄的中心点会落在明显不同的位置上,影响用户对焦。
此外,电子设备100还可包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。
其中,处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processingunit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuitsound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purposeinput/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。电源管理模块141用于连接电池142,充电管理模块140与处理器110。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。移动通信模块150可以提供应用在电子设 备100上的包括2G/3G/4G/5G等无线通信的解决方案。无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wirelesslocal area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。
上面提及的无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(codedivision multiple access,CDMA),宽带码分多址(wideband code division multipleaccess,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidounavigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellitesystem,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。内部存储器121可以用于存储计算机可执行程序代码,可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
其中,传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。陀螺仪传感器180B可以用于确定电子设备100的运动姿态,例如电子设备100围绕三个轴(即,x,y和z轴)的角速度。气压传感器180C用于测量气压。磁传感器180D包括霍尔传感器。加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。环境光传感器180L用于感知环境光亮度。指纹传感器180H用于采集指纹。温度传感器180J用于检测温度。触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。骨传导传感器180M可以获取振动信号。
按键190包括开机键,音量键等。马达191可以产生振动提示。指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。SIM卡 接口195用于连接SIM卡。
电子设备100可以通过图像处理器(Image Signal Processor,ISP),摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。不限于集成于处理器110中,ISP也可以设置在摄像头193中。
摄像头193包括镜头和感光元件(又可称为图像传感器),用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号,如标准的RGB,YUV等格式的图像信号。
在一些实施例中,摄像头193可以用于采集深度数据。例如,摄像头193可以具有(time of flight,TOF)3D感测模块或结构光(structured light)3D感测模块,用于获取深度信息。用于采集深度数据的摄像头可以为前置摄像头,也可为后置摄像头。
视频编解码器用于对数字图像压缩或解压缩。电子设备100可以支持一种或多种图像编解码器。这样,电子设备100可以打开或保存多种编码格式的图片或视频。
电子设备100可以通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等,例如摄像头193采集的图像。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emittingdiode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrixorganic light emitting diode的,AMOLED),柔性发光二极管(flex light-emittingdiode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot lightemitting diodes,QLED)等。在一些实施例中,电子设备100可以包括一个或多个显示屏194。
应当理解的是,图1所示电子设备100仅是一个范例,并且电子设备100可以具有比图1中所示的更多的或者更少的部件,可以组合两个或多个的部件,或者可以具有不同的部件配置。图1中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
图2A和图2B示例性示出了本申请实施例提供的一种电子设备100外形结构。其中,图2A示出了电子设备100的显示屏194所在的一面。图2B示出了电子设备100的后盖所在的一面。
电子设备100可以具有多个摄像头193。其中,电子设备100可以包括多个前置摄像头。如图2A所示,前置摄像头193-1和前置摄像头193-2可设置于电子设备100顶端,如电子设备100的“刘海”位置(即图2A中示出的区域AA)。区域AA中除了包括摄像头193之外,还可以包括扬声器170A等。如图2B所示,电子设备100可以包括多个后置摄像头,例如后置摄像头193-3、后置摄像头193-4以及后置摄像头193-5。后置摄像头193-3、后置摄像头 193-4以及后置摄像头193-5可以分别为普通摄像头、广角摄像头以及长焦摄像头。摄像头193附近还可以配置有闪光灯196等。
摄像头193可以通过数字变焦来改变预览框中的预览图像的视角,也可以通过光学变焦来改变预览框中的预览图像的视角,还可以通过光学变焦和数字变焦相结合的方式(又称为混合变焦)来改变预览图像的视角。也即是说,变焦可以包括数字变焦、光学变焦或混合变焦。下面以混合变焦举例。
电子设备100可以通过变换这多个摄像头193中用于拍摄的摄像头并结合数字变焦来实现显示在预览框中一系列图像所呈现的预览视角的大小是渐变的。上述用于拍摄的摄像头可以指采集的图像显示在预览框中的摄像头。上述数字变焦可以为电子设备100将摄像头193采集的图像中每个像素面积变大,来实现改变焦距的目的。这相当于电子设备100将一个摄像头采集的图像进行了裁切处理,然后将经过裁切处理的图像放大,即将经过裁切处理的图像的分辨率调整至和裁切之前的图像的分辨率相同。这样,相比于裁切之前的图像,经过裁切处理并放大的图像中每个像素的面积变大。且部分视角内的图像被裁切掉,使得经过裁切处理并放大的图像所呈现的预览视角变小,类似于增大焦距的效果。但实际上,上述数字变焦并未改变上述一个摄像头的焦距。
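上述数字变焦中“先裁切再放大”的裁切区域计算可以示意如下（分辨率与倍率数值仅为举例，居中裁切时裁切中心即图像中心）：

```python
def center_crop_region(width, height, zoom):
    """数字变焦的居中裁切：按变焦倍率 zoom 裁出中心区域，
    之后再把该区域放大回原分辨率（放大步骤此处省略）。"""
    crop_w, crop_h = int(width / zoom), int(height / zoom)
    left = (width - crop_w) // 2   # 裁切区域左上角横坐标
    top = (height - crop_h) // 2   # 裁切区域左上角纵坐标
    return left, top, crop_w, crop_h
```

例如对4000×3000的图像做2X数字变焦，裁出的中心区域为2000×1500，左上角位于(1000, 750)。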
假设在当前预览框中显示的是普通摄像头193-3采集的图像,当前变焦倍率为1X,长焦摄像头193-5未开启的情况下,当电子设备100检测到用户点击变焦点“5X”的操作时,电子设备100会启动长焦摄像头193-5,并在1X到5X之间按一定的变焦步长(例如0.1X)进行变焦(例如从1X、1.1X、1.2X…4.9X直到5X)。在从1X变焦到5X的过程中,电子设备100可以先对普通摄像头193-3所拍摄的图像进行裁切,这个阶段的裁切可以是居中裁切。当长焦摄像头193-5运行后,电子设备100可以利用FOV中心偏移算法对普通摄像头193-3所拍摄的图像进行偏心裁切,使得裁切得到的图像的FOV中心逐渐靠近长焦摄像头193-5的FOV中心,FOV中心变动更加平滑,而不是突然跳变至长焦摄像头193-5的FOV中心。即电子设备100对普通摄像头193-3所拍摄的图像进行多次数字变焦。当放大的变焦倍率达到长焦摄像头193-5所在的光学变焦倍率(例如5X)时,电子设备100切换到长焦摄像头193-5,预览框中显示的图像变成长焦摄像头193-5采集到的图像,即电子设备100进行光学变焦。也即是说,在不断增大变焦倍率时,图像显示可以包括先后两个阶段:使用普通摄像头193-3采集图像的阶段、使用长焦摄像头193-5采集图像的阶段。在使用普通摄像头193-3采集的图像的阶段,变焦倍率从1X逐渐增大,例如变焦倍率从1X增大至3X、从3X增大至4.9X,当倍率增大至5X时,切换到使用长焦摄像头193-5采集图像的阶段。
在使用一个摄像头采集图像时，显示在预览框中的来自该摄像头的图像所呈现的FOV一般可能小于这个摄像头的FOV。此时，预览框中的图像是从该摄像头采集的图像中裁剪出来的。当变焦倍率变换至一个特定倍率时，显示在预览框中的图像所呈现的FOV可以与这一个摄像头的FOV一样大。例如，上述在使用普通摄像头193-3采集的图像的阶段（即1X至4.9X），显示在预览框中的图像所呈现的FOV一般小于普通摄像头193-3的FOV。即使当变焦倍率为1X时，显示在预览框中的图像所呈现的FOV也可能小于普通摄像头193-3的FOV。
关于电子设备100如何基于FOV中心偏移算法对普通摄像头193-3所拍摄的图像进行偏心裁切,后续实施例中会详细说明,在此先不展开。
下面描述本申请涉及的一种典型的拍摄场景。
如图3A-图3B所示，电子设备100可以检测到用户打开“相机”应用程序的操作，例如在图3A所示的主界面（Home screen）中点击相机图标215D的操作。响应于该操作，电子设备100可以显示图3B示例性所示的用户界面301，即“相机”应用程序的一个用户界面。图3A所示的主界面可包括状态栏211，具有常用应用程序表的托盘215、日历指示符212、天气指示符213、导航栏216以及其他应用程序图标214，等等。“相机”是智能手机、平板电脑等电子设备100上的一款图像拍摄的应用程序，本申请对该应用程序的名称不做限制。不限于图3A-图3B所示，用户还可以在其他应用程序中打开用户界面301。
用户界面301可以是“相机”应用程序的默认拍照模式的用户界面。该默认拍照模式可是默认后置普通摄像头的拍照模式,也可以是其他,这里不作限制。如图3B所示,用户界面301可包括:设置控件310、闪光灯控件309、变焦条308、变焦倍率307、预览框306、摄像头翻转控件305、图库快捷控件304、快门控件303、相机模式选项302。其中:
设置控件310可用于调整拍摄照片的参数(如分辨率、滤镜等)以及开启或关闭一些用于拍照的方式(如定时拍照、微笑抓拍、声控拍照等)等。设置控件310可用于设置更多其他拍摄的功能,本申请实施例对此不作限定。
闪光灯控件309可用于开启或者关闭闪光灯。
变焦条308上有多个变焦点,用于指示不同的变焦倍率,变焦点不同,指示的变焦倍率不同。
变焦倍率307可用于指示当前的变焦倍率。其中,变焦倍率307越大,显示在预览框306中的图像所呈现的FOV越小。反之,变焦倍率307越小,显示在预览框306中的图像所呈现的FOV越大。如图3B所示,1X可以为相机应用程序的默认变焦倍率。默认变焦倍率还可以为其他数值,本申请实施例对默认变焦倍率不作限定。
预览框306可用于显示摄像头193实时采集的图像。电子设备100可以实时刷新其中的显示内容,以便于用户预览摄像头193当前的采集的图像。
摄像头切换控件305可用于监听触发切换摄像头的用户操作,响应于该操作,电子设备100可以切换摄像头,例如将后置摄像头切换为前置摄像头。
图库快捷键304可用于开启图库应用程序。响应于作用在图库快捷键304上的用户操作,例如点击操作,电子设备100可以开启图库应用程序。这样,用户可以便捷地查看拍摄的照片和视频,而无需先退出相机应用程序,再开启图库应用程序。图库应用程序是智能手机、平板电脑等电子设备上的一款图片管理的应用程序,又可以称为“相册”,本实施例对该应用程序的名称不做限制。图库应用程序可以支持用户对存储于电子设备100上的图片进行各种操作,例如浏览、编辑、删除、选择等操作。另外,电子设备100还可以在图库快捷键304中显示所保存的图像的缩略图。
快门控件303可用于监听触发拍照的用户操作。响应于该操作，电子设备100可以将预览框306中的图像保存为图库应用程序中的图片。
相机模式选项302中可以显示有一个或多个拍摄模式选项。这一个或多个拍摄模式选项可以包括:大光圈模式选项302A、录像模式选项302B、拍照模式选项302C、人像模式选项302D和更多选项302E。当检测到作用于拍摄模式选项上的用户操作,电子设备100可以开启用户选择的拍摄模式。特别的,当检测到作用于更多选项302E的用户操作,电子设备100可以进一步显示更多的其他拍摄模式选项,如慢动作拍摄模式选项等等,可以向用户展示更丰富的摄像功能。不限于图3B所示,相机模式选项302中可以不显示更多选项302E,用户可以通过在相机模式选项302中向左/右滑动来浏览其他拍摄模式选项。
基于上述拍摄场景,下面介绍在电子设备100上实现的一些用户界面(user interface,UI)。
图4A-图4H示例性示出了预览场景下进行变焦增大的用户界面,在图4A-图4H实施例中,假设长焦摄像头所在的光学变焦倍率为5X,普通摄像头所在的光学变焦倍率为1X,电子设备100默认开启的摄像头为普通摄像头,在检测到用户按下变焦条308上5X变焦点的操作之后,电子设备100再开启长焦摄像头。
图4A示例性示出了一种预览场景:预览框306中显示的图像(可以称为预览图像)来自普通摄像头,变焦倍率307为1X。
在图4A所示的预览场景下,电子设备100可以检测到用户增大变焦倍率的操作(例如用户针对变焦条308上5X变焦点的点击操作),响应于该操作,电子设备100可以缩小预览图像所呈现的FOV。同时,预览框306中显示的变焦倍率307会逐渐增大。当变焦倍率增大为5X时,电子设备100可以将长焦摄像头采集的图像显示在预览框306中,即预览图像切换为来自长焦摄像头,即电子设备100进行了光学变焦。假设变焦倍率的最小变化单元为0.1X,那么,4.9X-5X即为普通摄像头切换到长焦摄像头的光学变焦倍率。
如图4A-图4B所示,示例性地,在变焦倍率307从1X增大到3X的过程中,普通摄像头一直处于运行状态,长焦摄像头在启动状态中,尚未进入运行状态,因此,该过程中只有普通摄像头采集图像,电子设备100逐渐缩小预览图像所呈现的FOV(即对图4E所示的图像M进行居中裁切),1X时的预览图像所呈现FOV中心O1和3X时的预览图像所呈现FOV中心O2重合。
如图4C-图4D所示,示例性地,在变焦倍率307增大到3.1X时,普通摄像头处于运行状态,长焦摄像头开始进入运行状态,即普通摄像头和长焦摄像头开始同时采集图像,在变焦倍率307由3.1X增大到4.9X过程中,普通摄像头和长焦摄像头一直处于运行状态,同时采集图像,直到变焦倍率307增大到5X时,普通摄像头关闭,长焦摄像头仍处于运行状态,此时,只有长焦摄像头采集图像。在变焦倍率307从3.1X增大到5X的过程中,电子设备100除了逐渐缩小预览图像所呈现的FOV之外,电子设备100还可以将预览图像的FOV中心朝向长焦摄像头的FOV中心移动,而不是直接从普通摄像头的FOV中心跳变至长焦摄像头的FOV中心。随着变焦倍率不断增大至长焦摄像头所在的光学变焦倍率5X,预览图像的FOV中心就会逐渐靠近,甚至最终重合于长焦摄像头的FOV中心,避免了FOV中心跳变。
具体地，图4A、图4B、图4C、图4D分别示出了1X、3X、3.1X、5X时的预览图像，其FOV中心分别处于位置O1、O2、O3、O4。可以看出，O3相较于O2更靠近O4，这样，在变焦倍率从1X增大到5X时，预览图像的FOV中心不会从O1突然变至O4，而是更加平滑地过渡，不会出现跳变。
下面结合图4E-图4H,说明图4A-图4D示例性示出的变焦增大过程的实现原理。
如图4E-图4H所示,图像M为普通摄像头采集到的图像,FOV1为普通摄像头的FOV,FOV2为长焦摄像头的FOV。普通摄像头的FOV覆盖了长焦摄像头的FOV,且长焦摄像头的FOV中心O4与普通摄像头的FOV中心O1之间距离较远。这个较远的距离是由于长焦摄像头和普通摄像头的光心不重合导致的。为避免这一点导致变焦过程中出现FOV中心跳变,在对普通摄像头采集的图像M进行裁切时,电子设备100可以将裁切区域的裁切中心逐渐靠近长焦摄像头的FOV中心,即执行FOV中心偏移算法来实现偏心裁切。
具体地，如图4E所示，图像1为1X时的预览图像，图像1可以通过裁切图像M得到，图像1的FOV中心和FOV1的中心O1重合。如图4F所示，图像2为3X时的预览图像，图像2可以通过裁切图像M得到。图像2的FOV中心和FOV1的中心O1重合。如图4G所示，图像3为3.1X时的预览图像，图像3可以通过裁切图像M得到。图像3的FOV中心不再与FOV1的中心O1重合，而是偏离O1而更加靠近长焦摄像头的FOV中心O4。如图4H所示，图像T为5X时的预览图像，图像T为长焦摄像头采集的图像。图像T的FOV中心与长焦摄像头的FOV中心O4重合。
图4E-图4H仅仅以长焦摄像头开始运行时就执行FOV中心偏移算法为例说明了FOV中心偏移算法的一种实现方式,即电子设备从3X变大为3.1X、3.2X、…、直至4.9X的整个数字变焦过程执行FOV中心偏移算法。不限于此,电子设备100也可以在长焦摄像头开始运行后的一段时间(例如电子设备100变焦至4X之后)才执行FOV中心偏移算法,例如电子设备100可以在从4X变大为4.1X、4.2X、…、直至4.9X的整个数字变焦过程执行FOV中心偏移算法。另外,本申请实施例对利用FOV中心偏移算法进行裁切朝长焦摄像头的FOV中心靠近的快慢不作限定,例如,电子设备100可以在变焦倍率每变化0.1X向长焦摄像头的FOV中心移动一次,也可以每变化0.2X向长焦摄像头的FOV中心移动一次。
在一些实施例中,图像1的FOV也可以是FOV1,即图像M可以是1X时的预览图像。
图4A-图4H实施例描述的增大变焦倍率的方式同样适用广角摄像头切换到普通摄像头的场景、超广角摄像头切换到广角摄像头的场景以及长焦摄像头切换到超长焦摄像头的场景,本申请实施例对此不作限定。
需要说明的是,本申请实施例仅仅以变焦至3X后开始运行长焦摄像头为例说明了FOV中心偏移算法的一种实现方式,在一些实施例中,还可以是变焦至其它倍率后(例如2X、3.5X、4X等)开始运行长焦摄像头,本申请实施例对此不作限定。
图5A-图5H示例性示出了预览场景下进行变焦减小的用户界面,在图5A-图5H实施例中,假设普通摄像头所在的光学变焦倍率为1X,广角摄像头所在的光学变焦倍率为0.4X(wide),电子设备100默认开启的摄像头为普通摄像头,在检测到用户按下变焦条308上0.4X变焦点的操作之后,电子设备100再开启广角摄像头。
图5A示例性示出了一种预览场景:预览框306中显示的图像(可以称为预览图像)来自普通摄像头,变焦倍率307为1X。
在图5A所示的预览场景下,电子设备100可以检测到用户减小变焦倍率的操作(例如用户针对变焦条308上0.4X变焦点的点击操作),响应于该操作,电子设备100可以扩大预览图像所呈现的FOV。同时,预览框306中显示的变焦倍率307会逐渐减小。当变焦倍率减小为0.4X时,电子设备100可以将广角摄像头采集的图像显示在预览框306中,即预览图像切换为来自广角摄像头,即电子设备100进行了光学变焦。假设变焦倍率的最小变化单元为0.1X,那么,0.5X-0.4X即为普通摄像头切换到广角摄像头的变焦倍率。
如图5A-图5B所示,示例性地,在变焦倍率307从1X减小到0.9X的过程中,普通摄像头一直处于运行状态,广角摄像头在启动状态中,尚未进入运行状态,因此,该过程中只有普通摄像头采集图像,电子设备100逐渐扩大预览图像所呈现的FOV(即对图5E所示的图像M’进行居中裁切),1X时的预览图像所呈现FOV中心O1和0.9X时的预览图像所呈现FOV中心O2重合。
如图5C-图5D所示，示例性地，在变焦倍率307减小到0.8X时，普通摄像头处于运行状态，广角摄像头开始进入运行状态，即普通摄像头和广角摄像头开始同时采集图像，在变焦倍率307由0.8X减小到0.4X过程中，普通摄像头和广角摄像头一直处于运行状态，同时采集图像，直到变焦倍率307减小到0.4X时，普通摄像头关闭，广角摄像头仍处于运行状态，此时，只有广角摄像头采集图像。在变焦倍率307从0.8X减小到0.4X的过程中，电子设备100除了逐渐扩大预览图像所呈现的FOV之外，电子设备100还可以将预览图像的FOV中心朝向广角摄像头的FOV中心移动，而不是直接从普通摄像头的FOV中心跳变至广角摄像头的FOV中心。随着变焦倍率不断减小至广角摄像头所在的光学变焦倍率0.4X，预览图像的FOV中心就会逐渐靠近，甚至最终重合于广角摄像头的FOV中心，避免了FOV中心跳变。
具体地，图5A、图5B、图5C、图5D分别示出了1X、0.9X、0.8X、0.4X时的预览图像，其FOV中心分别处于位置O1’、O2’、O3’、O4’。可以看出，O3’相较于O2’更靠近O4’，这样，在变焦倍率从1X减小到0.4X时，预览图像的FOV中心不会从O1’突然变至O4’，而是更加平滑地过渡，不会出现跳变。
下面结合图5E-图5H,说明图5A-图5D示例性示出的变焦减小过程的实现原理。
如图5E-图5H所示,图像M’为普通摄像头采集到的图像,FOV1’为普通摄像头的FOV,FOV2’为广角摄像头的FOV。广角摄像头的FOV覆盖了普通摄像头的FOV,且广角摄像头的FOV中心O4’与普通摄像头的FOV中心O1’之间距离较远。这个较远的距离是由于广角摄像头和普通摄像头的光心不重合导致的。为避免这一点导致变焦过程中出现FOV中心跳变,如图5G-图5H所示,在对普通摄像头采集的图像M’进行裁切时,电子设备100可以将裁切区域的裁切中心逐渐靠近广角摄像头的FOV中心,即执行FOV中心偏移算法来实现偏心裁切。
具体地,如图5E所示,图像1’为1X时的预览图像,图像1’可以通过裁切图像M’得到,图像1’的FOV中心和FOV1’的中心O1’重合。如图5F所示,图像2’为0.9X时的预览图像,图像2’可以通过裁切图像M’得到。图像2’的FOV中心和FOV1’的中心O1’重合。如图5G所示,图像3’为0.8X时的预览图像,图像3’可以通过裁切图像M’得到。图像3’的FOV中心不再与FOV1’的中心O1’重合,而是偏离O1’而更加靠近广角摄像头的FOV中心O4’。如图5H所示,图像T’为0.4X时的预览图像,图像T’为广角摄像头采集的图像。图像T’的FOV中心与广角摄像头的FOV中心O4’重合。
图5E-图5H仅仅以广角摄像头开始运行时就执行FOV中心偏移算法为例说明了FOV中心偏移算法的一种实现方式,即电子设备从0.9X减小为0.8X、0.7X、…、直至0.4X的整个数字变焦过程执行FOV中心偏移算法。不限于此,电子设备100也可以在广角摄像头开始运行后的一段时间(例如电子设备100变焦至0.6X之后)才执行FOV中心偏移算法,例如电子设备100可以在从0.6X减小为0.5X、0.4X的整个数字变焦过程执行FOV中心偏移算法。另外,本申请实施例对利用FOV中心偏移算法进行裁切朝广角摄像头的FOV中心靠近的快慢不作限定,例如,电子设备100可以在变焦倍率每变化0.1X向长焦摄像头的FOV中心移动一次,也可以每变化0.2X向长焦摄像头的FOV中心移动一次。
在本申请实施例中，图像1’的FOV需要小于FOV1’，这样，可以保证在普通摄像头切换到广角摄像头的过程中，通过裁切普通摄像头采集的图像M’来实现平滑过渡，避免出现FOV中心跳变的现象。下面介绍图像1’的FOV小于FOV1’的一种可能的实现方式：假设图像1’的FOV对应的变焦倍率为A，FOV1’对应的变焦倍率为B，那么，A=B*C，其中，C为大于1的常数。举例来说，若B=1X，C=2，则A=2X，即变焦倍率307显示为1X时，预览图像实际上对应的是变焦倍率为2X时的图像，容易理解，通过对实际的变焦倍率与变焦倍率307显示的变焦倍率做一个映射，可以实现图像1’的FOV小于FOV1’。
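上述“显示变焦倍率到实际变焦倍率的映射”可以示意如下（C取2仅为举例取值）：

```python
def actual_zoom(displayed_zoom, C=2):
    """A = B * C：显示倍率 B 映射到实际倍率 A，C 为大于 1 的常数（取值仅为示意）。
    例如显示 1X 时，预览图像实际对应 2X 的裁切结果，从而保证图像1'的FOV小于FOV1'。"""
    return displayed_zoom * C
```

例如显示倍率为0.4X时，实际按0.8X裁切，预览图像的FOV仍小于普通摄像头的FOV1’。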
图5A-图5H实施例描述的减小变焦倍率的方式同样适用广角摄像头切换到超广角摄像头的场景、长焦摄像头切换到普通摄像头的场景以及超长焦摄像头切换到长焦摄像头的场景，本申请实施例对此不作限定。
需要说明的是,本申请实施例仅仅以变焦至0.9X后开始运行广角摄像头为例说明了FOV中心偏移算法的一种实现方式,在一些实施例中,还可以是变焦至其它倍率后(例如0.8X、0.7X等)开始运行广角摄像头,本申请实施例对此不作限定。
图4A-图4D、图5A-图5D示例性示出的变焦预览的用户界面还可以用于用户在变焦过程中拍照。
示例性地,基于图4A-图4D示出的预览场景下增大变焦倍率的用户界面,电子设备100还可以检测到用户的拍照操作,例如在3X、3.1X、3.3X、…、5X下拍照的操作,响应于该操作,电子设备100可以将这些不同变焦倍率下的预览框中的图像保存为照片,那么,这些照片所呈现的FOV中心是平滑过渡的。
示例性地,基于图5A-图5D示出的预览场景下减小变焦倍率的用户界面,电子设备100还可以检测到用户的拍照操作,例如在0.9X、0.8X、0.7X、…、0.4X下拍照的操作,响应于该操作,电子设备100可以将这些不同变焦倍率下的预览框中的图像保存为照片,那么,这些照片所呈现的FOV中心是平滑过渡的。
图6A-图6H示例性示出了录像场景下进行变焦的用户界面。
(1)图6A-图6D示例性示出了录像场景下进行变焦增大的用户界面。
在图6A-图6D实施例中,假设长焦摄像头所在的光学变焦倍率为5X,普通摄像头所在的光学变焦倍率为1X,电子设备100默认开启的摄像头为普通摄像头,在检测到用户按下变焦条308上5X变焦点的操作之后,电子设备100再开启长焦摄像头。
和图4A-图4D所示的实施例一样,在图6A、图6B、图6C、图6D所示的变焦增大的录像过程中,电子设备100可以检测到用户增大变焦倍率的操作(例如用户针对变焦条308上5X变焦点的点击操作),响应于该操作,在变焦倍率307从1X增大到3X的过程中,电子设备100可以逐渐缩小预览图像所呈现的FOV,在变焦倍率307由3.1X增大到5X过程中,电子设备100除了逐渐缩小预览图像所呈现的FOV之外,电子设备100还可以将预览图像的FOV中心朝向长焦摄像头的FOV中心移动,而不是直接从普通摄像头的FOV中心跳变至长焦摄像头的FOV中心。随着变焦倍率不断增大至长焦摄像头所在的光学变焦倍率5X,预览图像的FOV中心就会逐渐靠近,甚至最终重合于长焦摄像头的FOV中心,避免了FOV中心跳变。
与图4A-图4D所示的实施例的不同之处在于,图6A-图6D中的预览框中的图像,即被录制的图像,可以被保存为视频。
(2)图6E-图6H示例性示出了录像场景下进行变焦减小的用户界面。
在图6E-图6H实施例中,假设普通摄像头所在的光学变焦倍率为1X,广角摄像头所在的光学变焦倍率为0.4X(wide),电子设备100默认开启的摄像头为普通摄像头,在检测到用户按下变焦条308上0.4X变焦点的操作之后,电子设备100再开启广角摄像头。
和图5A-图5D所示的实施例一样，在图6E、图6F、图6G、图6H所示的变焦减小的录像过程中，电子设备100可以检测到用户减小变焦倍率的操作（例如用户针对变焦条308上0.4X变焦点的点击操作），响应于该操作，在变焦倍率307从1X减小到0.9X的过程中，电子设备100逐渐扩大预览图像所呈现的FOV，在变焦倍率307由0.8X减小到0.4X过程中，电子设备100除了逐渐扩大预览图像所呈现的FOV之外，电子设备100还可以将预览图像的FOV中心朝向广角摄像头的FOV中心移动，而不是直接从普通摄像头的FOV中心跳变至广角摄像头的FOV中心。随着变焦倍率不断减小至广角摄像头所在的光学变焦倍率0.4X，预览图像的FOV中心就会逐渐靠近，甚至最终重合于广角摄像头的FOV中心，避免了FOV中心跳变。
与图5A-图5D所示的实施例的不同之处在于,图6E-图6H中的预览框中的图像,即被录制的图像,可以被保存为视频。
基于前述UI实施例,下面介绍一种点选变焦场景下电子设备100实现FOV中心平滑过渡的方法流程。
在本申请实施例中,“点选变焦”是指用户通过针对变焦条308上除当前变焦点(也可以称为第一变焦倍率选项)之外的变焦点的点击操作来将当前变焦点切换到目标变焦点的过程,例如,图4A中用户通过针对变焦条308上5X变焦点的点击操作将当前变焦点1X变焦点切换到了目标变焦点5X变焦点,又例如,图5A中用户通过针对变焦条308上0.4X变焦点的点击操作将当前变焦点1X变焦点切换到了目标变焦点0.4X变焦点。
图7示例性示出了一种点选变焦场景下电子设备100实现FOV中心平滑过渡的方法流程。下面以当前变焦点是1X变焦点,目标变焦点是5X变焦点,用户由1X变焦点点选变焦至5X变焦点为例详细说明该方法的具体步骤:
S701、电子设备100检测到用户手指按下显示屏的操作。
S702、电子设备100判断用户手指按下显示屏的位置是否在目标变焦点上,其中,该目标变焦点不同于当前变焦点。
具体地,电子设备100在检测到用户手指按下显示屏的操作之后,可以判断用户手指按下显示屏的位置是否处于上述目标变焦点的热区内,若是,则可以表示用户手指按下显示屏的位置在上述目标变焦点上。其中,上述目标变焦点的热区是指显示屏上包含上述目标变焦点所在位置的某一个区域。
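上述热区判断可以示意如下（把热区近似为一个包含目标变焦点的矩形，坐标与尺寸均为示意假设）：

```python
def in_hot_zone(press_x, press_y, zone_x, zone_y, zone_w, zone_h):
    """判断按下位置 (press_x, press_y) 是否落在目标变焦点的热区内，
    热区用左上角坐标 (zone_x, zone_y) 和宽高 (zone_w, zone_h) 表示。"""
    return (zone_x <= press_x <= zone_x + zone_w
            and zone_y <= press_y <= zone_y + zone_h)
```

按下位置落在热区内即视为按在了该变焦点上，否则不触发点选变焦。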
S703、电子设备100的相机应用程序向硬件抽象层(Hardware Abstract Layer,HAL)发送上述目标变焦点对应的变焦值及变焦序列对应的变焦值。
其中,上述变焦序列是指目标变焦点(例如5X变焦点)与当前变焦点(例如1X变焦点)之间的多个变焦倍率组成的集合。
示例性地,若电子设备100判断用户手指按下显示屏的位置在5X变焦点的热区内,则表示用户手指按下显示屏的位置在5X变焦点上。之后,电子设备100的相机应用程序可以向硬件抽象层发送5X变焦点对应的变焦值及变焦序列对应的变焦值。这样,可以避免由于相机应用程序向硬件抽象层发送5X变焦点对应的变焦值有一定的时延而导致硬件抽象层没有足够的时间执行后续相关步骤。其中,上述5X变焦点对应的变焦值即为变焦倍率为5X时对应的变焦值,上述变焦序列对应的变焦值即为变焦序列中每一个变焦倍率对应的变焦值。
变焦序列对应的变焦值zoomValue的计算公式为:
（zoomValue的计算公式见原公开文本插图PCTCN2022083426-appb-000002）
其中，S为变焦条上当前变焦点与目标变焦点之间在显示屏上的距离；T是从当前变焦点平滑过渡到目标变焦点所需要的时间（即从用户点击目标变焦点到显示屏显示目标变焦点所对应的预览图像所需要的时间），可以从硬件抽象层获取到；t是当前相机系统的相邻两个预览帧之间的时间间隔，可以从硬件抽象层获取到；n是变焦序列中变焦倍率的序号，n取正整数，取值范围为(0,N]，N=T/t，即N为在T时间内需要插入的总帧数。
容易理解,上述变焦序列中包含的变焦倍率的个数由N决定。举例来说,当前变焦点为1X,目标变焦点为5X,变焦倍率的最小变化单元为0.1X。在一种可能的实现方式中,变焦序列可以为[1.1X 1.2X 1.3X…4.9X 5X],变焦序列的序号n的取值范围为(0,40],n=1对应变焦倍率为1.1X,n=2对应变焦倍率为1.2X,依次类推,n=40时,对应的变焦倍率为5X。在另一种可能的实现方式中,变焦序列也可以为[1.2X 1.4X…4.8X 5X],变焦序列的序号n的取值范围为(0,20],n=1对应变焦倍率为1.2X,n=2对应变焦倍率为1.4X,依次类推,n=20时,对应的变焦倍率为5X。
容易理解,由于相机应用程序向硬件抽象层发送变焦序列中每一个变焦倍率对应的变焦值的时间间隔由当前相机系统的预览帧时间间隔决定,因此,相机应用程序将上述变焦序列对应的变焦值按序一一发送完毕需要的时间为上述从当前变焦点平滑过渡到目标变焦点所需要的时间T。
在一种可能的实现方式中,相机应用程序可以向硬件抽象层一次性发送上述变焦序列对应的全部变焦值,硬件抽象层可以根据当前相机系统的预览帧时间间隔一一对应每个变焦值对应的图像预览帧,从而完成平滑变焦过程。
S704、电子设备100的硬件抽象层开始执行第一处理流程。
电子设备100的硬件抽象层可以基于接收到的相机应用程序发送的上述变焦点对应的变焦值及变焦序列对应的变焦值开始执行第一处理流程。执行第一处理流程的具体过程会在后文详细介绍,在此先不展开。
S705、电子设备100检测到用户手指离开显示屏的操作。
S706、电子设备100判断用户手指离开显示屏的位置是否在上述变焦点上。
具体地,电子设备100在检测到用户手指离开显示屏的操作之后,可以判断用户手指离开显示屏的位置是否处于上述变焦点的热区内,若是,则可以表示用户手指离开显示屏的位置在上述变焦点上。其中,上述变焦点的热区是指显示屏上包含上述变焦点所在位置的某一个区域。
通常,在点选变焦过程中,用户手指从按下显示屏到离开显示屏的时长大约为100ms-300ms,不会超过平滑变焦过程的总时长。
在一些实施例中,用户也可以利用手写装置(例如手写笔)等来按下或离开显示屏。
S707、电子设备100的相机应用程序向硬件抽象层发送确认变焦指令。
示例性地,若电子设备100判断用户手指离开显示屏的位置在5X变焦点的热区内,则表示用户手指离开显示屏的位置在5X变焦点上。之后,电子设备100的相机应用程序可以向硬件抽象层发送上述确认变焦指令,其中,上述确认变焦指令用于指示硬件抽象层执行第二处理流程。
S708、电子设备100的硬件抽象层开始执行第二处理流程。
在接收到上述确认变焦指令之后,电子设备100的硬件抽象层可以基于接收到的相机应用程序发送的上述变焦点对应的变焦值及变焦序列对应的变焦值执行第二处理流程。执行第二处理流程的具体过程会在后文详细介绍,在此先不展开。
S709、电子设备100的显示屏显示变焦后的图像。
具体地,在电子设备100的硬件抽象层执行第一处理流程和第二处理流程之后,电子设备100的显示屏可以显示多个变焦后的图像。例如,在变焦序列为[1.1X 1.2X 1.3X…4.9X 5X]的情况下,显示屏可以依次显示1.1X、1.2X、1.3X、…、4.9X、5X下的图像。
S710、电子设备100的相机应用程序向硬件抽象层发送取消变焦指令。
示例性地,若电子设备100判断用户手指离开显示屏的位置不在5X变焦点的热区内,则表示用户手指离开显示屏的位置不在5X变焦点上。之后,电子设备100的相机应用程序可以向硬件抽象层发送上述取消变焦指令,其中,上述取消变焦指令用于指示硬件抽象层停止执行第一处理流程及第二处理流程。
S711、电子设备100的硬件抽象层停止执行第一处理流程及第二处理流程。
在接收到上述取消变焦指令之后,电子设备100的硬件抽象层可以停止执行第一处理流程及第二处理流程。执行第一处理流程及第二处理流程的具体过程会在后文详细介绍,在此先不展开。
需要说明的是,上述点选变焦场景下电子设备100实现FOV中心平滑过渡的方法仅仅以普通摄像头切换为长焦摄像头为例进行说明的,上述方法还适用于其他摄像头之间的切换,例如,普通摄像头切换为广角摄像头,本申请实施例对此不作限定。
下面对点选变焦场景下电子设备100的硬件抽象层执行第一处理流程和第二处理流程的具体过程进行详细介绍。
本申请实施例中,第一处理流程和第二处理流程可以统称为平滑变焦处理过程,具体过程如下:
(1)增大变焦倍率
以当前变焦点是1X变焦点,目标变焦点是5X变焦点,用户由1X变焦点点选变焦至5X变焦点(即用户由普通摄像头点选切换至长焦摄像头)为例进行说明。
如表1所示,在当前变焦点是1X变焦点时,只有普通摄像头在运行,此时,显示屏显示的图像(例如图4A所示的预览图像)来自普通摄像头。
在电子设备100检测到用户按下5X变焦点时,相机应用程序可以向硬件抽象层发送5X变焦点对应的变焦值,硬件抽象层接收到上述5X变焦点对应的变焦值之后,可以基于上述变焦值确认上述变焦值对应的摄像头ID(Identifier)为长焦摄像头,并开始启动长焦摄像头,普通摄像头仍在运行,此时,显示屏显示的图像(例如图4A所示的预览图像)仍来自普通摄像头。
相机应用程序在向硬件抽象层发送5X变焦点对应的变焦值的同时,还会按序向硬件抽象层发送变焦序列中每一个变焦倍率对应的变焦值,以变焦序列是[1.1X 1.2X 1.3X…4.9X 5X]为例,在发送1.1X的变焦值至3X的变焦值过程中,普通摄像头仍在运行,而长焦摄像头处于启动过程中,尚未开始运行,该过程可以对普通摄像头采集的图像进行居中裁切,此时,显示屏显示的图像(例如图4B所示的预览图像)仍来自普通摄像头。
相机应用程序在发送3.1X的变焦值至4.9X的变焦值过程中，普通摄像头仍在运行，长焦摄像头开始运行，即长焦摄像头也开始采集图像，基于普通摄像头采集的图像和长焦摄像头采集的图像，硬件抽象层开始执行FOV中心偏移算法，将普通摄像头采集的图像的FOV中心逐渐朝向长焦摄像头的FOV中心移动，该过程中，显示屏显示的图像（例如图4C所示的预览图像）仍来自普通摄像头。
相机应用程序在发送5X的变焦值至硬件抽象层时,普通摄像头停止运行(即关闭),长焦摄像头仍在运行,显示屏显示的图像(例如图4D所示的预览图像)来自长焦摄像头。至此,相机应用程序将上述变焦序列中每一个变焦倍率对应的变焦值全部发送完毕。
从图4A-图4H可以看出,电子设备100可以通过执行FOV中心偏移算法,并对普通摄像头采集的图像进行裁切使得显示屏显示的图像的FOV中心从普通摄像头的FOV中心平滑过渡到长焦摄像头的FOV中心,实现平滑变焦。
变焦倍率        普通摄像头                          长焦摄像头    显示屏显示的图像来源
1X              运行                                关闭          普通摄像头
1.1X-3X         运行（居中裁切）                    启动中        普通摄像头
3.1X-4.9X       运行（FOV中心偏移算法偏心裁切）     运行          普通摄像头
5X              关闭                                运行          长焦摄像头
表1
(2)减小变焦倍率
以当前变焦点是1X变焦点,目标变焦点是0.4X变焦点,用户由1X变焦点点选变焦至0.4X变焦点(即用户由普通摄像头点选切换至广角摄像头)为例进行说明。
如表2所示,在当前变焦点是1X变焦点时,只有普通摄像头在运行,此时,显示屏显示的图像(例如图5A所示的预览图像)来自普通摄像头。
在电子设备100检测到用户按下0.4X变焦点时,相机应用程序可以向硬件抽象层发送0.4X变焦点对应的变焦值,硬件抽象层接收到上述0.4X变焦点对应的变焦值之后,可以基于上述变焦值确认上述变焦值对应的摄像头ID(Identifier)为广角摄像头,并开始启动广角摄像头,普通摄像头仍在运行,此时,显示屏显示的图像(例如图5A所示的预览图像)仍来自普通摄像头。
相机应用程序在向硬件抽象层发送0.4X变焦点对应的变焦值的同时,还会按序向硬件抽象层发送变焦序列中每一个变焦倍率对应的变焦值,以变焦序列是[0.9X 0.8X…0.5X 0.4X]为例,在发送0.9X的变焦值时,普通摄像头仍在运行,而广角摄像头处于启动过程中,尚未开始运行,该过程可以对普通摄像头采集的图像进行居中裁切,此时,显示屏显示的图像(例如图5B所示的预览图像)仍来自普通摄像头。
相机应用程序在发送0.8X的变焦值至0.5X的变焦值过程中，普通摄像头仍在运行，广角摄像头开始运行，即广角摄像头也开始采集图像，基于普通摄像头采集的图像和广角摄像头采集的图像，硬件抽象层开始执行FOV中心偏移算法，将普通摄像头采集的图像的FOV中心逐渐朝向广角摄像头的FOV中心移动，该过程中，显示屏显示的图像（例如图5C所示的预览图像）仍来自普通摄像头。
相机应用程序在发送0.4X的变焦值至硬件抽象层时，普通摄像头停止运行（即关闭），广角摄像头仍在运行，显示屏显示的图像（例如图5D所示的预览图像）来自广角摄像头。至此，相机应用程序将上述变焦序列中全部变焦倍率对应的变焦值发送完毕。
从图5A-图5H可以看出,电子设备100可以通过执行FOV中心偏移算法,并对普通摄像头采集的图像进行裁切使得显示屏显示的图像的FOV中心从普通摄像头的FOV中心平滑过渡到广角摄像头的FOV中心,实现平滑变焦。
变焦倍率        普通摄像头                          广角摄像头    显示屏显示的图像来源
1X              运行                                关闭          普通摄像头
0.9X            运行（居中裁切）                    启动中        普通摄像头
0.8X-0.5X       运行（FOV中心偏移算法偏心裁切）     运行          普通摄像头
0.4X            关闭                                运行          广角摄像头
表2
下面介绍FOV中心偏移算法的具体执行过程,图8示例性示出了FOV中心偏移算法的具体执行过程,其中,目标摄像头是由当前摄像头要切换到的摄像头,该过程包括以下步骤:
S801、电子设备100的硬件抽象层基于目标摄像头的特征点坐标(例如目标摄像头的FOV中心点坐标),调整当前摄像头的裁切区域(包括裁切区域的位置及大小),计算当前摄像头的特征点坐标(例如当前摄像头裁切区域的FOV中心点坐标)。
(1)增大变焦倍率
以当前摄像头是普通摄像头,目标摄像头是长焦摄像头为例,在普通摄像头和长焦摄像头都处于运行状态时,电子设备100的硬件抽象层可以基于长焦摄像头的特征点坐标(例如图4G所示的长焦摄像头的FOV中心点坐标O4),调整普通摄像头的裁切区域(例如裁切区域由图4F中图像2所在的区域调整到图4G中图像3所在的区域),并基于上述裁切区域计算普通摄像头的特征点坐标(例如图4G所示的普通摄像头的FOV中心点坐标O3)。
(2)减小变焦倍率
以当前摄像头是普通摄像头,目标摄像头是广角摄像头为例,在普通摄像头和广角摄像头都处于运行状态时,电子设备100的硬件抽象层可以基于广角摄像头的特征点坐标(例如图5G所示的广角摄像头的FOV中心点坐标O4’),调整普通摄像头的裁切区域(例如裁切区域由图5F中图像2’所在的区域调整到图5G中图像3’所在的区域),并基于上述裁切区域计算普通摄像头的特征点坐标(例如图5G所示的普通摄像头的FOV中心点坐标O3’)。
S802、电子设备100的硬件抽象层判断目标摄像头的特征点坐标是否与当前摄像头的特征点坐标重合。
具体地，在执行完步骤S801之后，电子设备100的硬件抽象层可以判断目标摄像头的特征点坐标是否与基于当前摄像头的裁切区域计算出来的当前摄像头的特征点坐标重合，若是，则结束执行FOV中心偏移算法，若否，则继续依次执行步骤S801和步骤S802，直到目标摄像头的特征点坐标与当前摄像头的特征点坐标重合，也即是说，可以通过多次调整当前摄像头的裁切区域，多次计算当前摄像头的特征点坐标，使得当前摄像头的特征点坐标逐渐靠近并重合于目标摄像头的特征点坐标，从而可以实现显示屏显示的图像的FOV中心从当前摄像头的FOV中心平滑过渡到目标摄像头的FOV中心。
(1)增大变焦倍率
以当前摄像头是普通摄像头,目标摄像头是长焦摄像头为例,假设变焦倍率呈现在3.1X、3.2X、3.3X、…、4.9X、5X逐渐增大的变化趋势,则电子设备100的硬件抽象层可以在3.1X、3.2X、3.3X、…、4.9X、5X变焦倍率下依次调整当前摄像头的裁切区域,并依次计算当前摄像头的特征点坐标,从而呈现普通摄像头的FOV中心点坐标逐渐向长焦摄像头的FOV中心点坐标移动的变化趋势。这样,在变焦过程中,预览图像的FOV中心不会发生跳变。
(2)减小变焦倍率
以当前摄像头是普通摄像头，目标摄像头是广角摄像头为例，假设变焦倍率呈现在0.8X、0.7X、0.6X、0.5X、0.4X逐渐减小的变化趋势，则电子设备100的硬件抽象层可以在0.8X、0.7X、0.6X、0.5X、0.4X变焦倍率下依次调整当前摄像头的裁切区域，并依次计算当前摄像头的特征点坐标，从而呈现普通摄像头的FOV中心点坐标逐渐向广角摄像头的FOV中心点坐标移动的变化趋势。这样，在变焦过程中，预览图像的FOV中心不会发生跳变。
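步骤S801-S802的迭代过程可以用如下示意代码表达。这里假设裁切中心沿直线等步长逼近目标摄像头的FOV中心，实际步长与路径由硬件抽象层的裁切区域调整策略决定：

```python
def fov_center_transition(current_center, target_center, zoom_steps):
    """FOV中心偏移算法示意：在变焦序列 zoom_steps 的每一档上调整一次裁切中心，
    使其逐渐靠近并在最后一档重合于目标摄像头的特征点（FOV中心）坐标。"""
    cx, cy = current_center
    tx, ty = target_center
    total = len(zoom_steps)
    centers = []
    for n, zoom in enumerate(zoom_steps, start=1):
        alpha = n / total              # 过渡进度：最后一档 alpha = 1，两个中心重合
        centers.append((zoom, (cx + (tx - cx) * alpha, cy + (ty - cy) * alpha)))
    return centers

# 例如从 3.1X 到 3.5X 共 5 档，裁切中心从 (0, 0) 平滑移动到 (10, 20)：
steps = fov_center_transition((0, 0), (10, 20), [3.1, 3.2, 3.3, 3.4, 3.5])
```

每一档对应一次S801的裁切区域调整与特征点坐标计算，最后一档满足S802的重合判断后算法结束。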
需要说明的是,图8仅仅以普通摄像头切换为长焦摄像头、普通摄像头切换为广角摄像头为例详细说明了FOV中心偏移算法的具体执行过程,不限于此,FOV算法同样适用于其他摄像头之间的切换,本申请实施例对此不作限定。
在前述实施例中,电子设备100只默认开启一个摄像头,仅在点选变焦一个场景下,再开启另外一个摄像头,可以实现FOV中心平滑过渡。
在本申请实施例中,电子设备100也可以默认开启当前摄像头及其焦距相邻(或者也可以称为光学变焦倍率相邻)的摄像头,在点选变焦、双指缩放变焦、拖动变焦条变焦等多个场景下,只开启目标变焦点对应的摄像头及其焦距相邻的摄像头,也可以实现FOV中心平滑过渡,同时,可以节省变焦过程中开启摄像头的时间。下面进行详细介绍:
假设电子设备100上有4个摄像头:广角摄像头、普通摄像头、长焦摄像头1、长焦摄像头2,图9A-图9D示例性示出了上述4个摄像头开启及关闭的情况,其中,黑色实心圆表示的摄像头是电子设备100的显示屏上显示的预览图像来自哪个摄像头。
假设广角摄像头所在的光学变焦倍率为0.4X，普通摄像头所在的光学变焦倍率为1X，长焦摄像头1所在的光学变焦倍率为5X，长焦摄像头2所在的光学变焦倍率为6X，从图9A-图9D中容易看出，上述4个摄像头所在的光学变焦倍率依次增大，即广角摄像头所在的光学变焦倍率<普通摄像头所在的光学变焦倍率<长焦摄像头1所在的光学变焦倍率<长焦摄像头2所在的光学变焦倍率，容易理解，从每个摄像头所在的光学变焦倍率的角度来说，广角摄像头仅与普通摄像头相邻，普通摄像头与广角摄像头、长焦摄像头1相邻，长焦摄像头1与普通摄像头、长焦摄像头2相邻，长焦摄像头2仅与长焦摄像头1相邻。
假设电子设备100默认开启广角摄像头,如图9A所示,电子设备100在检测到用户打开相机应用程序的操作之后,响应于该操作,电子设备100打开相机应用程序,预览图像来自广角摄像头采集的图像,同时,由于广角摄像头与普通摄像头相邻,电子设备100开始启动普通摄像头,使得普通摄像头进入运行状态。在普通摄像头已经进入运行状态的情况下,若电子设备100检测到用户由广角摄像头切换到普通摄像头的操作,响应于该操作,电子设备100则可以直接基于普通摄像头采集的图像和广角摄像头采集的图像执行FOV中心偏移算法,使得广角摄像头的FOV中心平滑过渡到普通摄像头的FOV中心,避免FOV中心发生跳变,实现平滑变焦。
如图9B所示,在上述平滑变焦过程结束之后,预览图像则来自普通摄像头采集的图像,此时,由于普通摄像头与广角摄像头、长焦摄像头1相邻,因此,电子设备100仍保持广角摄像头运行,同时开始启动长焦摄像头1,使得长焦摄像头1进入运行状态。在长焦摄像头1已经进入运行状态的情况下,若电子设备100检测到用户由普通摄像头切换到长焦摄像头1的操作,响应于该操作,电子设备100则可以直接基于普通摄像头采集的图像和长焦摄像头1采集的图像执行FOV中心偏移算法,使得普通摄像头的FOV中心平滑过渡到长焦摄像头1的FOV中心,避免FOV中心发生跳变,实现平滑变焦。
如图9C所示,在上述平滑变焦过程结束之后,预览图像则来自长焦摄像头1采集的图像,此时,由于长焦摄像头1与普通摄像头、长焦摄像头2相邻,与广角摄像头不相邻,因此,电子设备100开始关闭广角摄像头,仍保持普通摄像头运行,同时开始启动长焦摄像头2,使得长焦摄像头2进入运行状态。在长焦摄像头2已经进入运行状态的情况下,若电子设备100检测到用户由长焦摄像头1切换到长焦摄像头2的操作,响应于该操作,电子设备100则可以直接基于长焦摄像头1采集的图像和长焦摄像头2采集的图像执行FOV中心偏移算法,使得长焦摄像头1的FOV中心平滑过渡到长焦摄像头2的FOV中心,避免FOV中心发生跳变,实现平滑变焦。
如图9D所示,在上述平滑变焦过程结束之后,预览图像则来自长焦摄像头2采集的图像,此时,由于长焦摄像头2与长焦摄像头1相邻,与普通摄像头不相邻,因此,电子设备100开始关闭普通摄像头,仍保持长焦摄像头1运行。
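上述按焦距相邻关系开启或关闭摄像头的策略可以示意如下（摄像头列表按光学变焦倍率从小到大排列，排列与名称均对应图9A-图9D中的示例）：

```python
# 按光学变焦倍率递增排列（0.4X、1X、5X、6X）
CAMERAS = ["广角摄像头", "普通摄像头", "长焦摄像头1", "长焦摄像头2"]

def cameras_to_run(current):
    """保持当前摄像头及其焦距相邻的摄像头处于运行状态，其余摄像头关闭。"""
    i = CAMERAS.index(current)
    return CAMERAS[max(0, i - 1): i + 2]
```

例如当前摄像头为普通摄像头时，广角摄像头与长焦摄像头1同时运行，长焦摄像头2关闭，对应图9B的状态。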
可以看出,与点选变焦场景下默认只开启一个摄像头的不同点在于,本申请实施例可以在检测到用户需要变焦的场景下(例如点选变焦、双指缩放变焦、拖动变焦条变焦等多个场景下)直接执行FOV中心偏移算法,不需要等到目标变焦点对应的摄像头运行之后再执行FOV中心偏移算法,这样,在实现平滑变焦的同时节省了变焦过程中开启摄像头的时间。
需要说明的是,上述文字仅仅以电子设备100默认开启广角摄像头为例对图9A-图9D进行了说明,不限于此,电子设备100默认开启的摄像头也可以是其他摄像头。例如,电子设备100也可以默认开启普通摄像头,在检测到用户从普通摄像头切换到广角摄像头的情况下,摄像头的开启及关闭情况可以由图9B切换到图9A,在检测到用户从普通摄像头切换到长焦摄像头1的情况下,摄像头的开启及关闭情况可以由图9B切换到图9C。
容易理解,本申请实施例也适用于用户从当前摄像头直接点选变焦切换到与当前摄像头不相邻的摄像头的场景。例如,对于增大变焦倍率的情况,若电子设备100检测到用户直接从广角摄像头点选变焦切换到长焦摄像头2的操作,响应于该操作,电子设备100可以依次开启或关闭上述图9A、图9B、图9C、图9D中对应摄像头,并依次执行平滑变焦过程中的FOV中心偏移算法。又例如,对于减小变焦倍率的情况,若电子设备100检测到用户直接从 长焦摄像头2点选变焦切换到广角摄像头的操作,响应于该操作,电子设备100可以依次开启或关闭上述图9D、图9C、图9B、图9A中对应摄像头,并依次执行平滑变焦过程中的FOV中心偏移算法。
下面从电子设备100的软硬件协作的视角,结合图10和图11详细描述FOV中心平滑过渡过程的具体实现。
如图10和图11所示,摄像头用于采集图像,当景物的反射光通过镜头,在镜片上折射后,汇聚在图像传感器上,图像传感器将光学图像转换成模拟电信号,再经过数模转换器输出摄像头采集到的原始数字图像。
硬件抽象层用于接收来自相机应用程序上报的变焦倍率对应的变焦值、启动变焦倍数对应的目标摄像头(例如长焦摄像头或广角摄像头)、启动执行FOV中心偏移算法等。具体地,硬件抽象层可以在接收到来自相机应用程序上报的变焦倍率对应的变焦值之后,判断并启动上述变焦值对应的目标摄像头,然后,硬件抽象层可以基于当前摄像头(例如普通摄像头)采集的图像和目标摄像头(例如长焦摄像头或广角摄像头)采集的图像执行FOV中心偏移算法,得到裁切参数,并将裁切参数发送给ISP的裁切模块。其中,上述裁切参数可以包括裁切区域的位置(例如FOV中心点坐标)和大小(宽度和高度)。
ISP用于将来自摄像头的数据转换为标准格式的图像,例如YUV、RGB等。具体地,ISP可以利用裁切模块基于接收到的裁切参数对摄像头采集的图像进行裁切,然后再对裁切后的图像进行后期处理,例如黑电平校正、镜头阴影校正、坏点补偿、颜色插值等操作,之后再通过I/O控制接口将YUV/RGB图像发送到处理器中处理。
处理器包括应用处理器、基带处理器、多媒体处理器等,可以运行各种图像处理算法,控制外围设备。在预览场景中,可以直接将ISP获取到的YUV、RGB等格式的图像发送给显示屏进行显示。在拍照或录像场景中,可以将照片或视频保存到存储器中。
显示屏可以通过UI监听用于调整显示屏中各个区域的预览图像的用户操作,并将监听到的用户操作上报给硬件抽象层。该用户操作可以包括但不限于上述UI实施例中提及的电子设备100在预览框中检测到的作用于目标物体的触控操作。
如图10所示,下面介绍增大变焦倍率的具体实现(以普通摄像头切换为长焦摄像头为例)。
在预览场景下,显示屏显示预览界面,预览界面中的预览框用于显示普通摄像头采集的图像,预览界面中显示变焦倍率为1X。
在默认只开启普通摄像头的情况下，显示屏检测到用户增大变焦倍率至5X的操作（例如用户点选变焦点5X的操作），并将用户选择的变焦倍率对应的变焦值发送给硬件抽象层，硬件抽象层可以判断变焦点5X对应的摄像头是长焦摄像头，之后，硬件抽象层可以启动长焦摄像头。在长焦摄像头启动过程中，即长焦摄像头开始运行之前，ISP可以基于硬件抽象层计算得到的裁切参数对普通摄像头采集的图像进行居中裁切，在长焦摄像头开始运行之后，硬件抽象层可以基于普通摄像头采集的图像和长焦摄像头采集的图像执行FOV中心偏移算法，得到裁切参数，并将上述裁切参数发送给ISP的裁切模块，ISP的裁切模块可以基于上述裁切参数对普通摄像头采集的图像进行偏心裁切。之后，ISP可以对裁切后的图像做进一步后期处理，并发送给处理器生成待显示的图像。之后，处理器可以将上述待显示的图像发送给显示器，以指示显示屏在预览框中显示上述待显示的图像。
在默认开启普通摄像头，并开启与普通摄像头相邻的摄像头（例如长焦摄像头和广角摄像头）的情况下，显示屏检测到用户增大变焦倍率至5X的操作（例如用户点选变焦点5X的操作），并将用户选择的变焦倍率对应的变焦值发送给硬件抽象层，硬件抽象层可以判断变焦点5X对应的摄像头是长焦摄像头，之后，由于长焦摄像头已经提前开启并运行，因此，硬件抽象层可以直接基于普通摄像头采集的图像和长焦摄像头采集的图像执行FOV中心偏移算法，得到裁切参数，并将上述裁切参数发送给ISP的裁切模块，ISP的裁切模块可以基于上述裁切参数对普通摄像头采集的图像进行偏心裁切。之后，ISP可以对裁切后的图像做进一步后期处理，并发送给处理器生成待显示的图像。之后，处理器可以将上述待显示的图像发送给显示器，以指示显示屏在预览框中显示上述待显示的图像。
It is easy to understand that, while the zoom ratio increases, the electronic device 100 can crop the image captured by the main camera through cooperation between the hardware abstraction layer and the ISP, thereby gradually narrowing the preview field of view and gradually shifting the FOV center of the displayed image.
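One plausible sketch of this gradual narrowing and center shift interpolates both the zoom ratio and the crop center frame by frame. This is a simplification under assumed linear interpolation; the patent's FOV center offset algorithm itself is not specified in this text:

```python
def smooth_zoom_crops(src_w, src_h, z_start, z_end, target_cx, target_cy, n):
    """Per-frame crop regions for a smooth zoom-in over n preview frames.

    The zoom ratio moves from z_start to z_end while the crop center
    moves from the frame center toward (target_cx, target_cy), the FOV
    center of the camera being switched to.
    Returns a list of (center_x, center_y, width, height).
    """
    crops = []
    cx0, cy0 = src_w / 2, src_h / 2
    for i in range(1, n + 1):
        t = i / n                                  # transition progress in (0, 1]
        z = z_start + (z_end - z_start) * t        # interpolated zoom ratio
        cx = cx0 + (target_cx - cx0) * t           # center drifts off-axis
        cy = cy0 + (target_cy - cy0) * t
        crops.append((cx, cy, int(src_w / z), int(src_h / z)))
    return crops
```

The zoom-out case of FIG. 11 is the same sketch run with `z_end < z_start`, widening the crop instead of narrowing it.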
As shown in FIG. 11, the following describes the specific implementation of decreasing the zoom ratio (taking switching from the main camera to the wide-angle camera as an example).
In the preview scenario, the display screen shows a preview interface. The preview box in the preview interface displays the image captured by the main camera, and the preview interface shows a zoom ratio of 1X.
In the case where only the main camera is enabled by default, the display screen detects an operation of the user decreasing the zoom ratio to 0.4X (for example, the user tapping the 0.4X zoom option) and sends the zoom value corresponding to the zoom ratio selected by the user to the hardware abstraction layer. The hardware abstraction layer may determine that the camera corresponding to the 0.4X zoom option is the wide-angle camera, and then start the wide-angle camera. While the wide-angle camera is starting, that is, before the wide-angle camera begins running, the ISP may perform a centered crop on the image captured by the main camera based on the cropping parameters computed by the hardware abstraction layer. After the wide-angle camera begins running, the hardware abstraction layer may execute the FOV center offset algorithm based on the image captured by the main camera and the image captured by the wide-angle camera to obtain cropping parameters, and send these cropping parameters to the cropping module of the ISP; the cropping module of the ISP may then perform an off-center crop on the image captured by the main camera based on these cropping parameters. Afterwards, the ISP may further post-process the cropped image and send it to the processor to generate the image to be displayed. The processor may then send the image to be displayed to the display screen, instructing the display screen to show it in the preview box.
In the case where the main camera is enabled by default and the cameras adjacent to it (for example, the telephoto camera and the wide-angle camera) are also enabled, the display screen detects an operation of the user decreasing the zoom ratio to 0.4X (for example, the user tapping the 0.4X zoom option) and sends the zoom value corresponding to the zoom ratio selected by the user to the hardware abstraction layer. The hardware abstraction layer may determine that the camera corresponding to the 0.4X zoom option is the wide-angle camera. Since the wide-angle camera has already been enabled in advance and is running, the hardware abstraction layer may directly execute the FOV center offset algorithm based on the image captured by the main camera and the image captured by the wide-angle camera to obtain cropping parameters, and send these cropping parameters to the cropping module of the ISP; the cropping module of the ISP may then perform an off-center crop on the image captured by the main camera based on these cropping parameters. Afterwards, the ISP may further post-process the cropped image and send it to the processor to generate the image to be displayed. The processor may then send the image to be displayed to the display screen, instructing the display screen to show it in the preview box.
It is easy to understand that, while the zoom ratio decreases, the electronic device 100 can crop the image captured by the main camera through cooperation between the hardware abstraction layer and the ISP, thereby gradually widening the preview field of view and gradually shifting the FOV center of the displayed image.
In the photo-taking scenario, unlike the two preview scenarios above, the display screen can also detect a photo-taking operation by the user; in response to that operation, the electronic device 100 can save the image shown in the preview box as a photo. Specifically, after entering the shooting preview interface, in addition to detecting operations in which the user increases or decreases the zoom ratio, the display screen can detect a touch operation by the user on the shutter control. In response to that operation, the electronic device 100 can save the image that the ISP was sending to the preview box of the display screen when the touch operation was detected; that is, the ISP can further encode and compress the obtained YUV-format data into a JPEG-format photo, and the processor then saves the photo to the memory.
In the video-recording scenario, unlike the two preview scenarios above, the electronic device 100 can further save the images in the preview box, specifically as a video file. Specifically, after entering the shooting preview interface, in addition to detecting operations in which the user increases or decreases the zoom ratio, the display screen can detect two touch operations by the user on the recording control. In response to these two touch operations, the electronic device 100 can save the image frames output during the period between the two touch operations, that is, the video generated in recording mode, and the processor then saves the generated video to the memory.
Based on the electronic device 100 introduced above and the foregoing UI embodiments, the following describes the overall flow of the photographing method provided by the embodiments of this application.
As shown in FIG. 12, the photographing method can be applied to an electronic device 100 that includes a display screen and multiple cameras, where the multiple cameras may include a first camera and a second camera. The specific steps of the photographing method are as follows:
S1201. The electronic device 100 displays, on the display screen, a first preview image at a first zoom ratio, where the first preview image is captured by the first camera.
Specifically, the electronic device 100 may detect an operation by the user to open the "Camera" application, for example tapping the camera icon 215D on the home screen shown in FIG. 3A. In response to that operation, the electronic device 100 may open the "Camera" application and display, on the display screen, the first preview image at the first zoom ratio, where the first preview image is captured by the first camera.
The first camera may be the main camera, the first zoom ratio may be the optical zoom ratio at which the main camera operates or a zoom ratio within the digital zoom range of the main camera, and the first preview image may be the image in the preview box shown in FIG. 4A or FIG. 5A.
S1202. The electronic device 100 detects a tap operation by the user on a second zoom ratio option.
The second zoom ratio option may also be called the target zoom point, for example the target zoom point 5X. The tap operation may include: an operation in which the user's finger or a stylus first touches the second zoom ratio option and then leaves the second zoom ratio option.
S1203. The electronic device 100 generates, based on at least one third zoom ratio, and displays on the display screen at least one second preview image, where the at least one third zoom ratio is between the first zoom ratio and the second zoom ratio.
The third zoom ratio may be one or more of the zoom ratios in the zoom sequence mentioned in the foregoing embodiments, and the second preview image may be a preview image at the third zoom ratio during the smooth zoom process. The second zoom ratio may be the optical zoom ratio at which the second camera operates. The second camera may be a telephoto camera or a wide-angle camera.
For example, in the case of increasing the zoom ratio, the second zoom ratio may be the optical zoom ratio at which the telephoto camera operates, the third zoom ratio may be 3X, and the corresponding second preview image may be the image in the preview box shown in FIG. 4B. Alternatively, the third zoom ratio may be 3.1X, and the corresponding second preview image may be the image in the preview box shown in FIG. 4C.
For example, in the case of decreasing the zoom ratio, the second zoom ratio may be the optical zoom ratio at which the wide-angle camera operates, the third zoom ratio may be 0.9X, and the corresponding second preview image may be the image in the preview box shown in FIG. 5B. Alternatively, the third zoom ratio may be 0.8X, and the corresponding second preview image may be the image in the preview box shown in FIG. 5C.
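The intermediate (third) zoom ratios can be illustrated with a sketch that fits N = T/t preview frames into the transition, as in claim 4, and spaces the zoom values evenly between the two options. The linear spacing is an assumption on my part, since the exact zoomValue formula of claim 5 appears only as an image:

```python
def intermediate_zoom_values(z1, z2, transition_ms, frame_ms):
    """Third zoom ratios between zoom option z1 and zoom option z2.

    transition_ms is T, the time from the tap to the final preview at z2;
    frame_ms is t, the interval between adjacent preview frames, so the
    transition spans N = T/t frames (cf. claim 4).
    """
    n_frames = int(transition_ms // frame_ms)   # N = T / t
    step = (z2 - z1) / n_frames
    # One intermediate value per frame strictly between z1 and z2.
    return [z1 + step * n for n in range(1, n_frames)]
```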
S1204. The electronic device 100 displays, on the display screen, a third preview image at the second zoom ratio, where the third preview image is captured by the second camera.
For example, in the case of increasing the zoom ratio, the second camera may be the telephoto camera, and the third preview image may be the image in the preview box shown in FIG. 4D.
For example, in the case of decreasing the zoom ratio, the second camera may be the wide-angle camera, and the third preview image may be the image in the preview box shown in FIG. 5D.
For content not mentioned in the overall method flow above, refer to the related content in the foregoing embodiments; details are not repeated here.
The foregoing embodiments are merely intended to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that modifications may still be made to the technical solutions described in the foregoing embodiments, or equivalent replacements may be made to some of the technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (12)

  1. A photographing method, applied to an electronic device including a display screen and multiple cameras, the multiple cameras including a first camera and a second camera, wherein the method comprises:
    displaying, by the electronic device on the display screen, a first preview image at a first zoom ratio, the first preview image being captured by the first camera;
    detecting, by the electronic device, a tap operation by a user on a second zoom ratio option;
    generating, by the electronic device, based on at least one third zoom ratio, and displaying on the display screen, at least one second preview image, the at least one third zoom ratio being between the first zoom ratio and a second zoom ratio;
    displaying, by the electronic device on the display screen, a third preview image at the second zoom ratio, the third preview image being captured by the second camera.
  2. 根据权利要求1所述的方法,其特征在于,所述点击操作包括:用户手指或手写装置先接触所述第二变焦倍率选项的操作,再离开所述第二变焦倍率选项的操作。
  3. The method according to claim 2, wherein generating, by the electronic device, based on at least one third zoom ratio, and displaying on the display screen at least one second preview image specifically comprises:
    upon detecting the operation in which the user's finger or the stylus touches the second zoom ratio option, starting, by the electronic device, to generate, based on the at least one third zoom ratio, and display on the display screen, the at least one second preview image.
  4. The method according to any one of claims 1-3, wherein the number N of second preview images is determined as follows:
    N = T/t
    where T is the time interval between the electronic device detecting the tap operation by the user on the second zoom ratio option and the electronic device displaying on the display screen the third preview image at the second zoom ratio, and t is the time interval between two adjacent preview frames of the current camera system.
  5. The method according to claim 4, wherein the zoom value zoomValue corresponding to the third zoom ratio is determined as follows:
    (formula provided as image PCTCN2022083426-appb-100001 in the original filing)
    where S is the distance on the display screen between a first zoom ratio option and the second zoom ratio option, and n is a positive integer smaller than the number N of second preview images.
  6. The method according to any one of claims 1-5, wherein the FOV center of the first preview image coincides with the FOV center of the first camera, and the FOV center of the third preview image coincides with the FOV center of the second camera.
  7. The method according to any one of claims 1-6, wherein, before the electronic device detects the tap operation by the user on the second zoom ratio option, the method further comprises:
    powering on, by the electronic device, the second camera, and starting and running the second camera.
  8. The method according to claim 7, wherein the FOV center of the at least one second preview image gradually approaches the FOV center of the second camera.
  9. The method according to any one of claims 1-6, wherein, after the electronic device detects the tap operation by the user on the second zoom ratio option, the method further comprises:
    powering on, by the electronic device, the second camera, and starting and running the second camera.
  10. The method according to claim 9, wherein, before the electronic device runs the second camera, the FOV center of the at least one second preview image coincides with the FOV center of the first camera; after the electronic device runs the second camera, the FOV center of the at least one second preview image begins to gradually approach the FOV center of the second camera.
  11. An electronic device, comprising a display screen, multiple cameras with different focal lengths, a memory, a processor coupled to the memory, multiple applications, and one or more programs; wherein the optical centers of the multiple cameras do not coincide, the multiple cameras include a first camera and a second camera, and the first camera and the second camera are two cameras with adjacent focal lengths among the multiple cameras; wherein, when the processor runs the one or more programs, the electronic device is caused to perform the method according to any one of claims 1-10.
  12. A computer storage medium, wherein the computer storage medium stores a computer program, the computer program comprises program instructions, and when the program instructions are run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1-10.
PCT/CN2022/083426 2021-05-31 2022-03-28 Photographing method and electronic device WO2022252780A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22814827.6A EP4329287A1 (en) 2021-05-31 2022-03-28 Photographing method and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110612148.0A 2021-05-31 2021-05-31 Photographing method and electronic device
CN202110612148.0 2021-05-31

Publications (1)

Publication Number Publication Date
WO2022252780A1 true WO2022252780A1 (zh) 2022-12-08

Family

ID=84322500

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/083426 WO2022252780A1 (zh) 2021-05-31 2022-03-28 拍摄方法及电子设备

Country Status (3)

Country Link
EP (1) EP4329287A1 (zh)
CN (1) CN115484375A (zh)
WO (1) WO2022252780A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117278850A (zh) * 2023-10-30 2023-12-22 荣耀终端有限公司 Photographing method and electronic device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116709016A (zh) * 2022-02-24 2023-09-05 荣耀终端有限公司 Magnification switching method and magnification switching apparatus
CN117135420A (zh) * 2023-04-07 2023-11-28 荣耀终端有限公司 Image synchronization method and related device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005080086A (ja) * 2003-09-02 2005-03-24 Fuji Photo Film Co Ltd Captured-image display device and digital camera
CN110351487A (zh) * 2019-08-26 2019-10-18 Oppo广东移动通信有限公司 Control method, control apparatus, electronic device, and storage medium
CN111654631A (zh) * 2020-06-19 2020-09-11 厦门紫光展锐科技有限公司 Zoom control method, system, device, and medium
CN111885305A (zh) * 2020-07-28 2020-11-03 Oppo广东移动通信有限公司 Preview image processing method and apparatus, storage medium, and electronic device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102351542B1 (ko) * 2017-06-23 2022-01-17 삼성전자주식회사 Application processor having disparity compensation function, and digital photographing apparatus including the same


Also Published As

Publication number Publication date
EP4329287A1 (en) 2024-02-28
CN115484375A (zh) 2022-12-16

