WO2022127787A1 - Image display method and electronic device


Info

Publication number
WO2022127787A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
electronic device
application
interface
Prior art date
Application number
PCT/CN2021/137940
Other languages
English (en)
Chinese (zh)
Inventor
沈日胜
许虎
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022127787A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters

Definitions

  • the present invention relates to the technical field of electronic devices, and in particular, to an image display method and electronic device.
  • In principle, the black background could be masked by extending the animation duration of the camera application startup, but that duration is usually determined by the system and cannot be modified. Moreover, extending the startup animation does not actually avoid the black background when the camera application launches, and an overly long startup animation also degrades the user experience.
  • Alternatively, an application can save one frame of the image sampled when the camera exits, to be used as the background image the next time the camera application starts.
  • However, part of the memory then needs to be consumed to save the sampled images.
  • In addition, the interface of the electronic device still presents a black background during camera startup.
  • To solve the above problem, an embodiment of the present application provides an image display method.
  • During startup of the camera, the shooting preview interface of the electronic device displays a blurred image.
  • After the camera is started, the shooting preview interface of the electronic device displays the real-time image collected by the camera of the electronic device, which improves the user experience.
  • In a first aspect, an embodiment of the present application provides an image display method. The method includes: detecting a first operation of a user; in response to the first operation, displaying, by the electronic device, a first shooting preview interface, where the first shooting preview interface includes a first preview frame, the first preview frame displays a first image, and the first image is a blurred image; and, after a first preset time, displaying, by the electronic device, a second shooting preview interface, where the second shooting preview interface includes a second preview frame, and the second preview frame displays an image captured in real time by a camera of the electronic device.
  • the first image is an image obtained by blurring an image collected by a camera of the electronic device.
  • the first preset time is a power-on time of the camera of the electronic device.
  • In some implementations, before the electronic device displays the first shooting preview interface, the method further includes: saving the first image.
  • Through the above saving manner, the first image can be provided quickly during startup of the camera application, which reduces the user's waiting time and improves the user experience. When the camera starts, all applications on the electronic device that use the system camera service can obtain the stored image data from the RAM and ROM corresponding to the camera application, instead of each application saving the image data separately in its own storage space, which saves memory.
  • the first image is a black image.
  • the blurring process is to use a blur coefficient to process the image collected by the camera.
  • In this way, a blurred image whose degree of blur is similar to that during focusing can be obtained, so that the switch from displaying the blurred image to the focusing process is smoother, which improves the user experience.
  • In a second aspect, embodiments of the present application provide an electronic device, including one or more touch screens, one or more memories, and one or more processors, where the one or more memories store one or more computer programs; the one or more touch screens are used to detect a first operation of the user; in response to the first operation, the one or more touch screens are further used to display a first shooting preview interface, where the first shooting preview interface includes a first preview frame, the first preview frame displays a first image, and the first image is a blurred image; and, after a first preset time, the one or more touch screens are further used to display a second shooting preview interface, where the second shooting preview interface includes a second preview frame, and the second preview frame displays the image captured by the camera of the electronic device in real time.
  • the first image is an image obtained by blurring an image collected by a camera of the electronic device.
  • the first preset time is a power-on time of the camera of the electronic device.
  • the one or more memories are used to save the first image.
  • the first image can be quickly provided, the waiting time of the user is reduced, and the user experience is improved.
  • When the camera starts, all applications on the electronic device that use the system camera service can obtain the stored image data from the RAM and ROM corresponding to the camera application, instead of each application saving the image data separately in its own storage space, which saves memory.
  • the first image is a black image.
  • the one or more processors are configured to process the image collected by the camera by using a blur coefficient.
  • In this way, a blurred image whose degree of blur is similar to that during focusing can be obtained, so that the switch from displaying the blurred image to the focusing process is smoother, which improves the user experience.
  • In a third aspect, embodiments of the present application provide a computer-readable storage medium including instructions that, when run on an electronic device, cause the electronic device to perform the method described in the first aspect or any implementation of the first aspect.
  • In a fourth aspect, an embodiment of the present application provides a computer program product that, when run on an electronic device, enables the electronic device to perform the method described in the first aspect or any implementation of the first aspect.
  • Through the above method, the interface of the electronic device switches from displaying the blurred image of the last frame of image collected during the previous operation of the camera, to displaying the image during the camera's focusing process, and then to displaying the image after focusing is completed. This switching of images is smoother, which improves the user experience.
  • In addition, all the application programs on the electronic device that use the system camera service can obtain the stored image data through the internal memory, so different application programs do not need to save the image data separately, which saves memory.
  • FIG. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of an operating system in an electronic device provided by an embodiment of the present application.
  • FIG. 3A is a schematic diagram of a scene of starting a camera application according to an embodiment of the present application.
  • FIG. 3B is a schematic diagram of a scene of starting a camera application according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of an image display method provided by an embodiment of the present application.
  • FIG. 5 is an interaction diagram of querying and obtaining image data after blurring processing provided by an embodiment of the present application.
  • FIG. 6 is an interaction diagram of a blurring process provided by an embodiment of the present application.
  • FIG. 7 is a flowchart of querying and acquiring image data after blurring processing provided by an embodiment of the present application.
  • references in this specification to "one embodiment” or “some embodiments” and the like mean that a particular feature, structure or characteristic described in connection with that embodiment is included in at least one embodiment of the present application.
  • Appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments" unless specifically emphasized otherwise.
  • The terms "including", "comprising", "having" and their variants mean "including but not limited to" unless specifically emphasized otherwise.
  • the term “when” may be interpreted to mean “if” or “after” or “in response to determining" or “in response to detecting" depending on the context.
  • the phrases “in determining" or “if detecting (the stated condition or event)” can be interpreted to mean “if determining" or “in response to determining" or “on detecting (the stated condition or event)” or “in response to the detection of (the stated condition or event)”.
  • In some embodiments, the electronic device may be a portable electronic device that also includes other functions such as personal digital assistant and/or music player functions, for example, a mobile phone, a tablet computer, or a wearable electronic device with wireless communication capabilities (for example, a smart watch), and the like.
  • Portable electronic devices include, but are not limited to, portable electronic devices equipped with various operating systems.
  • The portable electronic device described above may also be another portable electronic device, such as a laptop computer with a touch panel or a touch-sensitive surface.
  • the above-mentioned electronic device may not be a portable electronic device, but a desktop computer.
  • FIG. 1 is a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, and an augmented reality (AR) device, virtual reality (virtual reality, VR) equipment, artificial intelligence (artificial intelligence, AI) equipment, wearable equipment, vehicle-mounted equipment, etc., the specific types of the electronic equipment are not particularly limited in this embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber identification module (subscriber identification module, SIM) card interface 195 and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light. Sensor 180L, bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include global positioning system (global positioning system, GPS), global navigation satellite system (global navigation satellite system, GLONASS), Beidou navigation satellite system (beidou navigation satellite system, BDS), quasi-zenith satellite system (quasi-zenith satellite system, QZSS) and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • When shooting, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the camera 193 may be used to collect images, and send the collected images to the camera service module.
  • the camera 193 can receive the start-up instruction sent by the camera service module. After the camera 193 receives the start-up instruction, it takes a period of time to start up. After the camera 193 is started, it collects still images or videos through the photosensitive element.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos of various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • the internal memory 121 may be used to store computer-executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS), and the like.
  • the internal memory 121 may include random access memory (random access memory, RAM).
  • the RAM can be read and written at any time, and the read and write speed is fast.
  • the RAM is usually used as a temporary data storage medium for the operating system or other running programs. Once the power is turned off, the data stored in the RAM will be lost.
  • the internal memory 121 may also include a read-only memory (ROM), the data stored in the ROM is stable, and the data stored in the ROM will not change after the power is turned off.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor. For example, in this embodiment of the present application, the processor 110 may execute the instructions stored in the internal memory 121 for capturing images, acquiring images, extracting blur coefficients, blurring, and caching images.
  • the internal memory 121 may allocate corresponding storage spaces for different application programs.
  • the RAM and ROM in the internal memory 121 will allocate corresponding storage space for the camera application, which is used to save the image captured by the camera or the image after blurring.
  • the last frame of image captured during camera operation can be stored either in RAM or in ROM.
  • Similarly, the blurred image can be stored either in RAM or in ROM; a minimal sketch of such dual storage follows.
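  • As an illustration only, the following plain-Java sketch shows the save side of such dual storage. The class name BlurredFrameSaver, the key scheme, and the file layout are assumptions for the sketch, not the patent's implementation: an in-memory map stands in for the RAM allocated to the camera application, and a file stands in for its ROM allocation, which survives power-off.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Hypothetical sketch of the dual storage described above: the in-memory map
 * plays the role of the RAM allocated to the camera application, and the file
 * plays the role of its ROM allocation, whose data persists after power-off.
 */
public final class BlurredFrameSaver {
    private final Map<String, byte[]> ram = new ConcurrentHashMap<>();
    private final Path romDir;

    public BlurredFrameSaver(Path romDir) {
        this.romDir = romDir;
    }

    /** Save the blurred last frame to RAM and mirror it to ROM as an image file. */
    public void saveBlurredFrame(String key, byte[] encodedImage) throws IOException {
        ram.put(key, encodedImage);                              // fast path for the next startup
        Files.createDirectories(romDir);                         // ensure the ROM directory exists
        Files.write(romDir.resolve(key + ".img"), encodedImage); // persists across power-off
    }
}
```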
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • the speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • the user can input a sound signal into the microphone 170C by speaking close to the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • in some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the distance sensor 180F is used to measure distance; the electronic device 100 can measure distance through infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • the touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also referred to as a "touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be brought into contact with or separated from the electronic device 100 by inserting it into or pulling it out of the SIM card interface 195.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to realize functions such as call and data communication.
  • the electronic device 100 employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • FIG. 2 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present invention takes an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include an image processing module, and the image processing module has the functions of acquiring images, extracting blur coefficients, and blurring.
  • the image processing module can obtain the image collected by the camera through the internal memory and extract the blur coefficient of the obtained image; it can also obtain, through the internal memory, the last frame of image collected during the operation of the camera and blur that last frame; and it can save the blurred image in the RAM corresponding to the camera application, or save the blurred image in the ROM corresponding to the camera application. A sketch of these responsibilities follows.
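  • As an illustration only, the responsibilities listed above could be captured by an interface like the following; ImageProcessingModule, BlurCoefficient, and the method names are hypothetical, not taken from the patent.

```java
/**
 * Hypothetical sketch of the image processing module's responsibilities as
 * described above; the interface and method names are illustrative, not
 * taken from the patent.
 */
public interface ImageProcessingModule {

    /** Blur coefficient for a Gaussian blur: a (scale, radius) pair. */
    record BlurCoefficient(int scale, int radius) {}

    /** Obtain, through the internal memory, an image collected by the camera. */
    byte[] acquireCameraImage();

    /** Extract the blur coefficient of the obtained image. */
    BlurCoefficient extractBlurCoefficient(byte[] image);

    /** Blur the last frame of image collected during the camera's previous run. */
    byte[] blurLastFrame(byte[] lastFrame, BlurCoefficient coefficient);

    /** Save a blurred image in the RAM or ROM corresponding to the camera application. */
    void saveBlurredImage(String key, byte[] blurredImage);
}
```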
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the electronic device 100 .
  • for example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functional interfaces that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • when a touch operation occurs, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into raw input events (including touch coordinates, timestamps of touch operations, etc.). Raw input events are stored at the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. For example, if the touch operation is a click operation and the control corresponding to the click operation is the control of the camera application icon, the camera application calls the interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer. This dispatch is roughly sketched below.
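  • As a rough illustration only (RawInputEvent, Control, and the other names are hypothetical, not Android's actual framework API), the dispatch described above could look like this:

```java
import java.util.List;

/**
 * Hypothetical sketch of the dispatch described above: the framework layer
 * receives a raw input event from the kernel layer, hit-tests the controls to
 * find the one at the touch coordinates, and delivers the click to it (for the
 * camera icon, that click starts the camera application). All names are
 * illustrative; this is not Android's actual framework API.
 */
public final class InputDispatchSketch {

    record RawInputEvent(int x, int y, long timestampMillis) {}

    record Bounds(int left, int top, int right, int bottom) {
        boolean contains(int x, int y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }

    interface Control {
        String name();
        Bounds bounds();
        void onClick(); // e.g. the camera icon's handler calls the framework
                        // interface that starts the camera application, which
                        // in turn starts the camera driver via the kernel layer
    }

    void dispatch(RawInputEvent event, List<Control> controls) {
        for (Control control : controls) {
            if (control.bounds().contains(event.x(), event.y())) {
                control.onClick(); // deliver the click to the matched control
                return;
            }
        }
    }
}
```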
  • the camera 193 captures still images or video.
  • FIG. 3A is a schematic diagram of a scene of starting a camera application according to an embodiment of the present application.
  • the interface of the electronic device displays a first image 301 .
  • the above-mentioned first image is an image obtained by blurring the last frame of image collected during the last camera operation.
  • the above-mentioned first image may also be a blurred image of the last frame of image collected during the first operation of the camera when the user uses the camera application for the first time.
  • the camera application startup process refers to the process from when the camera application receives the user's opening instruction (for example, the user clicks the control of the camera application displayed on the interface of the electronic device) to when the camera can collect still images or videos through the photosensitive element, for example, the process of powering on the camera.
  • the running process of the camera includes the camera receiving the start instruction, the camera powering on and starting, the camera collecting still images or videos through the photosensitive element, the camera receiving the shutdown instruction, and the camera stopping the collection of still images or videos; it also includes the process of collecting images in response to user operations and the process of saving the images.
  • the above-mentioned blurring processing refers to using a preset blurring algorithm, and processing the image according to the blurring coefficient corresponding to the blurring algorithm.
  • the preset blur algorithm can be one of the Gaussian blur algorithm, box blur algorithm, Kawase blur algorithm, dual blur algorithm, bokeh blur algorithm, tilt-shift blur algorithm, iris blur algorithm, grainy blur algorithm, radial blur algorithm, directional blur algorithm, and the like.
  • the above-mentioned preset blur algorithm may also be any other applicable blur algorithm; the preset blur algorithm is not particularly limited in this embodiment of the present application.
  • Different blurring algorithms have different blurring coefficients, wherein the blurring coefficient refers to a set of parameters corresponding to the blurring algorithms, and the parameters are used to adjust the blurring degree of the image.
  • for example, the blur coefficient corresponding to the Gaussian blur algorithm is (scale, radius), where scale represents the scaling factor and radius represents the blur radius; a minimal sketch of such a blur follows.
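  • The following is a minimal plain-Java sketch of a Gaussian blur driven by such a (scale, radius) coefficient, assuming the common downscale-convolve-upscale approach; it is an illustration, not the patent's algorithm or an Android implementation.

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.awt.image.ConvolveOp;
import java.awt.image.Kernel;

/** Hypothetical sketch: blur an image using a (scale, radius) blur coefficient. */
public final class GaussianBlurSketch {

    /**
     * Downscale the image by 1/scale, apply a separable Gaussian kernel of the
     * given radius, then scale back up. Downscaling first is a common trick to
     * cheapen the convolution and strengthen the apparent blur.
     */
    public static BufferedImage blur(BufferedImage src, int scale, int radius) {
        // 1. Downscale by the blur coefficient's scale factor.
        int w = Math.max(1, src.getWidth() / scale);
        int h = Math.max(1, src.getHeight() / scale);
        BufferedImage small = resize(src, w, h);

        // 2. Separable Gaussian: one horizontal pass, one vertical pass.
        float[] kernel1d = gaussianKernel(radius);
        BufferedImage hPass = new ConvolveOp(
                new Kernel(kernel1d.length, 1, kernel1d),
                ConvolveOp.EDGE_NO_OP, null).filter(small, null);
        BufferedImage blurred = new ConvolveOp(
                new Kernel(1, kernel1d.length, kernel1d),
                ConvolveOp.EDGE_NO_OP, null).filter(hPass, null);

        // 3. Scale back to the original size; upscaling adds further softness.
        return resize(blurred, src.getWidth(), src.getHeight());
    }

    /** Builds a normalized 1-D Gaussian kernel with sigma = radius / 3. */
    private static float[] gaussianKernel(int radius) {
        int size = radius * 2 + 1;
        float[] k = new float[size];
        float sigma = Math.max(radius / 3f, 1f);
        float sum = 0f;
        for (int i = 0; i < size; i++) {
            float x = i - radius;
            k[i] = (float) Math.exp(-(x * x) / (2 * sigma * sigma));
            sum += k[i];
        }
        for (int i = 0; i < size; i++) k[i] /= sum; // normalize to preserve brightness
        return k;
    }

    private static BufferedImage resize(BufferedImage src, int w, int h) {
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = out.createGraphics();
        g.drawImage(src, 0, 0, w, h, null);
        g.dispose();
        return out;
    }
}
```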
  • the first preset time may be the time when the camera is powered on.
  • the camera's power-on time includes the time elapsed from when the camera application receives the user's opening instruction (for example, the user clicks the control of the camera application displayed on the interface of the electronic device), through the camera receiving the start instruction, to when the camera is powered on and can collect still images or videos through the photosensitive element.
  • the first preset time can also be the time from when the camera application receives the user's opening instruction to when the camera can focus and present a clear image, that is, the camera's power-on time plus the time for the camera to complete the focusing process. The focusing process begins when the camera can collect still images or videos through the photosensitive element but has not yet focused, so the image appears blurred, and ends when the camera has focused and can present a clear image; in other words, the focusing process is the process in which the camera adjusts its own focal distance to make the picture clear.
  • the first preset time may also be the time elapsed from the time when the camera application receives the user's opening instruction to the end of any time point during the camera focusing process.
  • the interface of the electronic device displays the second image 303 collected by the camera.
  • the above-mentioned second image refers to an image collected by the camera after focusing is completed during the operation of the camera. During the operation of the camera, the camera focuses first, and the image collected during focusing appears blurred to a certain extent; after focusing is completed, the second image 303 is displayed, which is the image collected after focusing is completed.
  • for example, suppose the user currently wants to use the camera to capture an image of a potted plant, and the last time the user used the camera application, the last frame of image collected during the running of the camera was an image of a cup. During startup of the camera application, the blurred image of the cup is displayed. After the camera application is started, the camera first focuses on the potted plant; the images of the potted plant captured during the focusing process are displayed and appear blurry to a certain extent. After focusing is completed, the image of the potted plant captured after focusing is displayed.
  • the problem that the black background is displayed for too long during the startup process of the camera application can be eliminated, and the user experience can be improved.
  • in some embodiments, the user can directly open the camera application. For example, when a user wants to use the camera to take photos or record videos, the camera application can be opened directly.
  • during the startup of the camera application, the interface of the electronic device displays the blurred image of the last frame of image collected during the previous camera operation.
  • after the camera application is started, the camera collects images, and when the electronic device receives the image data collected by the camera, the interface of the electronic device displays the image collected by the camera.
  • the images collected by the camera include images collected during the focusing process and images collected after the focusing is completed.
  • the user can also open the camera application through other applications.
  • for example, the user can click "Scan" in payment software to open the camera application and use the camera to scan a payment QR code.
  • some applications support the face recognition function to authenticate the user's identity.
  • the user can open the camera application through the application and use the camera to perform face recognition.
  • during the startup of the camera application, the electronic device displays the blurred image of the last frame of image collected during the previous camera operation. After the camera application is started, the camera can capture images, and when the electronic device receives the image data captured by the camera, the electronic device displays the image captured by the camera.
  • the images collected by the camera include images collected during the focusing process and images collected after the focusing is completed.
  • the following describes an image display method provided by an embodiment of the present application by taking a user directly opening a camera application as an example.
  • an image display method provided by an embodiment of the present application may include the following steps:
  • Step S401a: The camera application sends a start instruction to the camera.
  • Step S401b: The camera application sends, to the image processing module, an instruction to acquire the blurred image data of the last frame of image collected during the previous camera operation.
  • step S401a and step S401b are parallel steps.
  • Step S402 the image processing module sends an instruction to the internal memory for acquiring the image data of the last frame of the image collected during the last camera operation after being blurred.
  • the internal memory includes RAM and ROM corresponding to the camera application.
  • the above-mentioned image processing module may store, in the RAM corresponding to the camera application, the blurred image data of the last frame of the image collected during the last camera operation.
  • the image processing module can also generate an image file from the image data of the last frame of the image collected during the last camera operation after the blurring process, and store the image file in the ROM corresponding to the camera application.
  • the image processing module can also generate an image file from the last frame of image data collected during the last camera running process, and store it in the ROM corresponding to the camera application.
• The internal memory queries whether the RAM corresponding to the camera application stores the blurred image data of the last frame of image collected during the previous camera operation. If so, that blurred image data is sent to the image processing module. If not, the internal memory checks whether the ROM corresponding to the camera application stores an image file generated from that blurred image data. If such an image file exists, it is loaded into the RAM corresponding to the camera application, the corresponding image data is generated, and the image data is sent to the image processing module.
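By way of illustration only, the following Python sketch models the two-tier lookup described above. The names query_blurred_image, _ram_cache, and _ROM_PATH are hypothetical stand-ins for the RAM and ROM corresponding to the camera application; they are not part of this application.

```python
import os
from typing import Optional

# Hypothetical stand-ins for the RAM and ROM corresponding to the camera
# application; the path is illustrative only.
_ram_cache = {}
_ROM_PATH = "/data/camera_app/last_frame_blurred.png"

def query_blurred_image() -> Optional[bytes]:
    """Return the blurred last-frame data: RAM first, then ROM, else None."""
    data = _ram_cache.get("last_frame_blurred")
    if data is not None:
        return data                             # found in RAM: fast path
    if os.path.exists(_ROM_PATH):               # fall back to the file in ROM
        with open(_ROM_PATH, "rb") as f:
            data = f.read()
        _ram_cache["last_frame_blurred"] = data  # promote into RAM for next time
        return data
    return None                                 # neither store holds the data
```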
• The preset blur algorithm can be one of a Gaussian blur algorithm, a box blur algorithm, a Kawase blur algorithm, a dual blur algorithm, a bokeh blur algorithm, a tilt-shift blur algorithm, an aperture blur algorithm, a grainy blur algorithm, a radial blur algorithm, a directional blur algorithm, and the like, or may be another applicable blur algorithm; the preset blur algorithm is not particularly limited in this embodiment of the present application.
• The above-mentioned blurring processing refers to processing an image with the preset blur algorithm according to the blur coefficient corresponding to that algorithm.
• The method for extracting the blur coefficient will be described in detail with reference to FIG. 5 and FIG. 6 in subsequent embodiments.
• Step S403: the internal memory sends the blurred image data of the last frame of image collected during the previous camera operation to the image processing module.
• Step S404: the image processing module sends the blurred image data of the last frame of image collected during the previous camera operation to the camera application. After the camera application receives this image data, the interface of the electronic device can display the blurred version of the last frame of image collected during the previous camera operation.
• Step S405: the camera sends the collected image data to the image processing module.
• D1 is used to represent the sum of the time for performing step S401a, the time for the camera to collect one frame of image, and the time for performing step S405; D2 is used to represent the sum of the time for performing steps S401b, S402, S403, and S404.
• The duration of D1 is much longer than that of D2, so the blurred image is ready to be displayed well before the first frame collected by the camera arrives.
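To make the timing relationship concrete, here is a minimal, hypothetical simulation of the two parallel paths: the slow path D1 (camera start-up through step S405) and the fast path D2 (steps S401b through S404). It reuses query_blurred_image from the sketch above, and the sleep duration is an arbitrary stand-in for camera start-up time.

```python
import threading
import time

def start_camera_and_stream() -> None:     # slow path D1: S401a + capture + S405
    time.sleep(0.5)                        # arbitrary stand-in for start-up time
    print("display: first live frame collected by the camera")

def show_cached_placeholder() -> None:     # fast path D2: S401b, S402, S403, S404
    if query_blurred_image() is not None:
        print("display: blurred last frame of the previous camera operation")
    else:
        print("display: black background (nothing cached)")

threading.Thread(target=start_camera_and_stream).start()
show_cached_placeholder()                  # finishes long before D1 completes
```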
• The above-mentioned blur coefficient extraction method may include the following steps:
• Step S501a: the camera application sends to the camera an instruction to start the camera.
• Step S501b: the camera application sends to the image processing module an instruction to acquire image data.
• Step S501a and step S501b are parallel steps.
• Step S502: the camera sends the collected image data to the internal memory.
• After the camera receives the instruction to start the camera sent in step S501a, the camera is powered on, collects images, and sends the collected image data to the internal memory. While the camera is running, focusing is performed first, and the images collected during focusing are blurred to a certain extent.
  • the images collected by the camera include images collected during the focusing process and images collected after the focusing is completed.
• The image data in step S502 is sent in the order in which the images are collected. While the camera is running, each time a frame of image is collected, the collected image data is sent to the internal memory through step S502.
• Step S503: the image processing module sends to the internal memory an instruction to acquire image data.
• The image processing module can obtain the images collected during the above focusing process through the internal memory. Specifically, after the camera collects an image, it sends the collected image data to the internal memory, and the images collected by the camera include the images captured during focusing.
• Step S504: the internal memory sends the image data collected by the camera to the image processing module; the internal memory sends the first frame of image collected by the camera, or the Nth frame of image collected by the camera, where N is an integer greater than or equal to 1. It should be noted that the N frames of images include images collected during the focusing process of the camera.
• Step S505: the image processing module extracts the blur coefficient after acquiring the first frame of image data collected by the camera.
• After the image processing module acquires the first frame of image collected by the camera, the image can be input into the blur coefficient model, and the blur coefficient model outputs a set of blur coefficients.
• The above-mentioned blur coefficient model can be obtained through artificial intelligence (AI) training based on a preset blur algorithm.
• The preset blur algorithm can be one of a Gaussian blur algorithm, a box blur algorithm, a Kawase blur algorithm, a dual blur algorithm, a bokeh blur algorithm, a tilt-shift blur algorithm, an aperture blur algorithm, a grainy blur algorithm, a radial blur algorithm, a directional blur algorithm, and the like, or may be another applicable blur algorithm; the preset blur algorithm is not particularly limited in this embodiment of the present application.
• For different preset blur algorithms, the blur coefficients output by the blur coefficient model are different.
• For example, a Gaussian blur algorithm is selected as the preset blur algorithm, and the blur coefficient corresponding to the Gaussian blur algorithm is (scale, radius), where scale represents a scaling factor and radius represents a blur radius.
• Multiple sets of image data with different blur coefficients are used as the training data set, and a neural network is trained over multiple layers to obtain the blur coefficient model.
• An image is input into the blur coefficient model, and the blur coefficient model outputs the blur coefficient corresponding to the Gaussian blur algorithm.
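The application does not disclose the network architecture, so the following PyTorch sketch shows only one plausible shape for such a blur coefficient model: a small convolutional regressor whose two outputs are read as (scale, radius). The class name and layer sizes are assumptions.

```python
import torch
import torch.nn as nn

class BlurCoefficientModel(nn.Module):
    """Assumed architecture: a small CNN regressing an image to (scale, radius)."""
    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),        # collapse to one 32-channel vector
        )
        self.head = nn.Linear(32, 2)        # two outputs read as (scale, radius)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = BlurCoefficientModel()
frame = torch.rand(1, 3, 224, 224)          # stand-in for one focusing frame
scale, radius = model(frame)[0].tolist()    # untrained here, so values are arbitrary
```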
  • the image processing module may input the above-mentioned N frames of images into the above-mentioned blur coefficient model, and the blur coefficient model may output the blur coefficient corresponding to the Gaussian blur algorithm.
  • the image input to the blur coefficient model may be the first frame of image collected during the camera operation, or may be N frame images of different blur states in a single scene during the camera focusing process.
• After the image processing module extracts the blur coefficients of the images collected by the camera through the above method, the extracted blur coefficients are stored in the RAM corresponding to the camera application, or the extracted blur coefficients can be stored in the ROM corresponding to the camera application.
  • the above-mentioned blurring processing method may include the following steps:
• Step S601a: the camera application sends to the camera an instruction to close the camera.
• Step S601b: the camera application sends to the image processing module an instruction to acquire image data.
• Step S601a and step S601b are parallel steps.
• Step S602: the camera receives the shutdown instruction and collects the last frame of image.
• Step S603: the camera sends the collected last frame of image to the internal memory.
• Step S604: the camera is turned off and stops collecting images.
• Step S605: the image processing module sends to the internal memory an instruction to acquire the last frame of image data.
• Step S606: the internal memory sends the last frame of image data to the image processing module.
• Step S607: the image processing module performs blurring processing on the last frame of image collected during the camera operation.
• As described in the foregoing embodiment, the image processing module extracts the blur coefficient of the first frame of image collected by the camera and saves the blur coefficient in the RAM corresponding to the camera application, or in the ROM corresponding to the camera application.
• Here, the image processing module can obtain the blur coefficient of the first frame of image collected by the camera from the RAM corresponding to the camera application.
• The image processing module then performs blurring processing on the last frame of image collected during the camera operation through the preset blur algorithm, according to the extracted blur coefficient.
• For example, the preset blur algorithm is a Gaussian blur algorithm.
• The blur coefficient corresponding to the Gaussian blur algorithm extracted by the image processing module is (scale, radius).
• The image processing module uses the Gaussian blur algorithm and, according to the blur coefficient (scale, radius), performs blurring processing on the last frame of image collected during the camera operation to generate a blurred image.
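As an illustration, the following sketch applies a Gaussian blur driven by a (scale, radius) pair using Pillow. How exactly scale enters the pipeline is not spelled out here, so downscaling before the Gaussian pass and upscaling afterwards is assumed as one common interpretation.

```python
from PIL import Image, ImageFilter

def blur_last_frame(img: Image.Image, scale: float, radius: float) -> Image.Image:
    """Blur one frame according to an assumed (scale, radius) interpretation."""
    w, h = img.size
    small = img.resize((max(1, int(w / scale)), max(1, int(h / scale))))
    blurred = small.filter(ImageFilter.GaussianBlur(radius=radius))
    return blurred.resize((w, h))           # restore the original resolution

last_frame = Image.new("RGB", (640, 480), "gray")   # stand-in for a captured frame
placeholder = blur_last_frame(last_frame, scale=4.0, radius=2.5)
```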
• Optionally, the blur coefficient used may also be a blur coefficient preset by the system.
• The system-preset blur coefficient is set for the preset blur algorithm according to factors such as the electronic device type, the camera model, memory compression, and user information security requirements.
• For example, the preset blur algorithm is a Gaussian blur algorithm.
• The system-preset blur coefficient is (scale_1, radius_1).
• The image processing module obtains the last frame of image collected during the camera operation through the camera service module, and uses the Gaussian blur algorithm to perform blurring processing on the image according to the preset blur coefficient (scale_1, radius_1) to generate a blurred image.
• Step S608: the image processing module sends the blurred image data of the last frame of image collected during the camera operation to the internal memory.
• The internal memory can store the blurred image data of the last frame of image collected during the camera operation. Specifically, after the image processing module blurs the last frame of image collected during the camera operation, it can store the resulting blurred image data in the RAM corresponding to the camera application. It can also generate an image file from the blurred image data and store the file in the ROM corresponding to the camera application. It can likewise generate an image file from the unprocessed last frame of image data collected during the camera operation and store that file in the ROM corresponding to the camera application.
• The image processing module can also send to the camera an image acquisition instruction for collecting one frame of image. After the camera collects the image, the collected image data is sent to the internal memory; the image processing module can then acquire the image, perform blurring processing on it, and store the blurred image data in the RAM corresponding to the camera application, or generate an image file from the blurred image data and store it in the ROM corresponding to the camera application. Alternatively, the image processing module may generate an image file directly from the sampled image data and store it in the ROM corresponding to the camera application.
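A sketch of this save-on-close step, reusing the hypothetical _ram_cache, _ROM_PATH, and blur_last_frame names from the sketches above, might look as follows:

```python
import io

def store_blurred_on_close(last_frame, scale, radius):
    """On camera shutdown (steps S602 to S608), blur the final frame and
    store it in both tiers so the next start-up can find it."""
    blurred = blur_last_frame(last_frame, scale, radius)   # sketch above
    buf = io.BytesIO()
    blurred.save(buf, format="PNG")
    _ram_cache["last_frame_blurred"] = buf.getvalue()      # RAM tier
    blurred.save(_ROM_PATH)        # ROM tier; assumes the directory exists
```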
  • FIG. 7 is a flowchart of querying and acquiring blurred image data provided by an embodiment of the present application.
• When the image processing module receives the instruction to obtain the blurred image data, it queries whether the RAM corresponding to the camera application stores the blurred image data of the last frame of image collected during the previous camera operation. If so, the internal memory sends the blurred image data to the image processing module. If not, it queries whether the ROM corresponding to the camera application stores a blurred image file of the last frame of image collected during the previous camera operation.
• As described above, the blurred image data can be stored in the RAM corresponding to the camera application, or an image file can be generated from the blurred image data and stored in the ROM corresponding to the camera application.
• If, upon receiving the instruction to obtain the blurred image data, the image processing module finds neither the blurred image data of the last frame of image collected during the previous camera operation in the RAM corresponding to the camera application, nor a blurred image file of that last frame in the ROM corresponding to the camera application, nor an image file of the unprocessed last frame in that ROM, the image processing module extracts the blur coefficient of the first frame of image data collected during the focusing process of the camera through the blur coefficient extraction method described in the above embodiment, and performs blurring processing on the last frame of image collected during the camera operation through the blurring processing method described in the above embodiment.
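Putting the FIG. 7 fallback flow together, an orchestration sketch could read as follows; load_unprocessed_last_frame and extract_blur_coefficients are hypothetical stubs standing in for the ROM lookup of the unprocessed frame and for the extraction method of the above embodiment.

```python
import io
from PIL import Image

def load_unprocessed_last_frame():
    """Hypothetical helper: the unprocessed last-frame file from ROM, if any."""
    return None

def extract_blur_coefficients(img):
    """Hypothetical helper: stands in for the FIG. 5/FIG. 6 extraction method."""
    return 4.0, 2.5

def get_preview_placeholder():
    """FIG. 7 flow: query RAM, then ROM, else regenerate the blurred frame."""
    data = query_blurred_image()          # two-tier lookup from the earlier sketch
    if data is not None:
        return Image.open(io.BytesIO(data))
    raw = load_unprocessed_last_frame()
    if raw is None:
        return None                       # nothing cached at all: black background
    scale, radius = extract_blur_coefficients(raw)
    return blur_last_frame(raw, scale, radius)
```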
• When the camera application is started, the interface of the electronic device displays the blurred version of the last frame of image collected during the previous camera operation; after the camera finishes focusing, the interface of the electronic device displays the focused images collected by the camera.
• In different shooting scenarios, the images collected by the camera may have different sharpness and brightness.
• For example, a sharpness threshold may be defined for the images collected by the camera, and a brightness threshold may likewise be defined.
• The above-mentioned sharpness threshold and brightness threshold may be set by the user, or may be preset by the system.
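For illustration, sharpness and brightness can be measured, for example, as the variance of a Laplacian response and as the mean gray level. The threshold values in the sketch below are arbitrary placeholders, not values from this application.

```python
import numpy as np

def sharpness(gray: np.ndarray) -> float:
    """Variance of a discrete Laplacian response (borders wrap; kept simple)."""
    lap = (np.roll(gray, 1, 0) + np.roll(gray, -1, 0) +
           np.roll(gray, 1, 1) + np.roll(gray, -1, 1) - 4 * gray)
    return float(lap.var())

def brightness(gray: np.ndarray) -> float:
    """Mean gray level of the frame."""
    return float(gray.mean())

gray = np.random.rand(480, 640)             # stand-in for one grayscale frame
meets_thresholds = sharpness(gray) > 0.05 and brightness(gray) > 0.2
```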
• Before the sharpness and brightness of the images collected by the camera reach the above thresholds, the interface of the above-mentioned electronic device displays the blurred version of the last frame of image collected during the previous camera operation.
• Once the above thresholds are reached, the interface of the electronic device displays the images collected by the camera.
• In a possible implementation, different application programs may acquire the blurred image data of the last frame of image collected during the previous camera operation that is stored in the RAM corresponding to the camera application.
• Different application programs can also acquire the blurred image file of that last frame of image that is stored in the ROM corresponding to the camera application.
• Different application programs can also acquire the image file of the unprocessed last frame of image collected during the previous camera operation that is stored in the ROM corresponding to the camera application.
• In this way, when the camera application is started, the internal memory can quickly provide the blurred image data of the last frame of image collected during the previous camera operation, so that what the user sees is that blurred image rather than a black background, which improves the user experience.
• All applications on the electronic device that use the system camera service can obtain the blurred image data stored in the internal memory when the camera application is started, so each application does not need to store its own copy of the blurred last-frame image data, which saves memory.
• Through AI training, a blur coefficient model is obtained; the blur coefficient model is used to extract the blur coefficient of the images collected during camera focusing, and that blur coefficient is used to blur the last frame of image collected during the previous camera operation.
• In this way, an image whose degree of blurring is similar to that of the images collected during camera focusing can be presented to the user, so that the transition on the interface of the electronic device from displaying the blurred last frame of the previous camera operation, to displaying the images collected while the camera is focusing, and then to displaying the images collected after focusing is completed, is smoother.
• Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.


Abstract

An image display method, comprising: detecting a first operation of a user, and, in response to the first operation, an electronic device displaying a first shooting preview interface, the first shooting preview interface comprising a first preview frame, the first preview frame displaying a first image, and the first image being a blurred image; and, after a first preset duration, the electronic device displaying a second shooting preview interface, the second shooting preview interface comprising a second preview frame, and the second preview frame displaying an image collected in real time by a camera of the electronic device, so that, while the camera application is starting, what the user sees is an image obtained by blurring the last frame of image collected during the previous operation of the camera, thereby improving the user experience.
PCT/CN2021/137940 2020-12-18 2021-12-14 Image display method and electronic device WO2022127787A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011504970.7 2020-12-18
CN202011504970.7A CN114650363B (zh) 2020-12-18 2020-12-18 Image display method and electronic device

Publications (1)

Publication Number Publication Date
WO2022127787A1 (fr)

Family

ID=81990060

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/137940 WO2022127787A1 (fr) Image display method and electronic device

Country Status (2)

Country Link
CN (1) CN114650363B (fr)
WO (1) WO2022127787A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117135341A (zh) * 2023-01-19 2023-11-28 荣耀终端有限公司 Image processing method and electronic device


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106161956A (zh) * 2016-08-16 2016-11-23 深圳市金立通信设备有限公司 Method and terminal for processing a preview picture during shooting
CN110636353B (zh) * 2019-06-10 2022-09-02 海信视像科技股份有限公司 Display device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170126962A1 (en) * 2015-04-30 2017-05-04 Jrd Communication Inc. Method and system for quickly starting camera based on eyeprint identification
CN111543049A (zh) * 2018-07-16 2020-08-14 华为技术有限公司 Photographing method and electronic device
CN110505389A (zh) * 2019-09-03 2019-11-26 RealMe重庆移动通信有限公司 Camera control method and apparatus, storage medium, and electronic device
CN111885305A (zh) * 2020-07-28 2020-11-03 Oppo广东移动通信有限公司 Preview picture processing method and apparatus, storage medium, and electronic device
CN112055156A (zh) * 2020-09-15 2020-12-08 Oppo(重庆)智能科技有限公司 Preview image updating method and apparatus, mobile terminal, and storage medium

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116048379A (zh) * 2022-06-30 2023-05-02 荣耀终端有限公司 Data backfilling method and apparatus
CN116048379B (zh) * 2022-06-30 2023-10-24 荣耀终端有限公司 Data backfilling method and apparatus
CN115767290A (zh) * 2022-09-28 2023-03-07 荣耀终端有限公司 Image processing method and electronic device
CN115767290B (zh) * 2022-09-28 2023-09-29 荣耀终端有限公司 Image processing method and electronic device
CN115514871A (zh) * 2022-09-30 2022-12-23 读书郎教育科技有限公司 Flip camera preview optimization system and method based on a smart terminal
WO2024082863A1 (fr) * 2022-10-21 2024-04-25 荣耀终端有限公司 Image processing method and electronic device
CN116744106A (zh) * 2022-10-25 2023-09-12 荣耀终端有限公司 Control method for camera application, and terminal device
CN116744106B (zh) * 2022-10-25 2024-04-30 荣耀终端有限公司 Control method for camera application, and terminal device
CN116708753A (zh) * 2022-12-19 2023-09-05 荣耀终端有限公司 Method for determining cause of preview stutter, device, and storage medium
CN116708753B (zh) * 2022-12-19 2024-04-12 荣耀终端有限公司 Method for determining cause of preview stutter, device, and storage medium

Also Published As

Publication number Publication date
CN114650363A (zh) 2022-06-21
CN114650363B (zh) 2023-07-21

Similar Documents

Publication Publication Date Title
WO2020259452A1 (fr) Full-screen display method for mobile terminal, and apparatus
WO2022127787A1 (fr) Image display method and electronic device
CN113645351B (zh) Application interface interaction method, electronic device, and computer-readable storage medium
WO2021104485A1 (fr) Photographing method and electronic device
WO2021036770A1 (fr) Split-screen processing method and terminal device
WO2019072178A1 (fr) Notification processing method and electronic device
WO2021258814A1 (fr) Video synthesis method and apparatus, electronic device, and storage medium
WO2021159746A1 (fr) File sharing method and system, and related device
CN113704205B (zh) Log storage method, chip, electronic device, and readable storage medium
WO2022001258A1 (fr) Multi-screen display method and apparatus, terminal device, and storage medium
WO2022037726A1 (fr) Split-screen display method and electronic device
WO2021052139A1 (fr) Gesture input method and electronic device
WO2022100685A1 (fr) Drawing command processing method and related device
WO2023056795A1 (fr) Quick photographing method, electronic device, and computer-readable storage medium
WO2021218429A1 (fr) Application window management method, terminal device, and computer-readable storage medium
WO2022105702A1 (fr) Image saving method and electronic device
WO2022143180A1 (fr) Collaborative display method, terminal device, and computer-readable storage medium
WO2021052388A1 (fr) Video communication method and video communication apparatus
CN116389884B (zh) Thumbnail display method and terminal device
CN113542574A (zh) Shooting preview method under zoom, terminal, storage medium, and electronic device
WO2022170856A1 (fr) Connection establishment method and electronic device
WO2023000746A1 (fr) Augmented reality video processing method and electronic device
CN115119048A (zh) Video stream processing method and electronic device
WO2022166435A1 (fr) Image sharing method and electronic device
WO2022062902A1 (fr) File transfer method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21905716

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21905716

Country of ref document: EP

Kind code of ref document: A1