WO2024114257A1 - Method for generating transition dynamic effect and electronic device - Google Patents


Info

Publication number
WO2024114257A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
screen
wallpaper
application
lock
Application number
PCT/CN2023/128456
Other languages
English (en)
Chinese (zh)
Inventor
向星宇 (Xiang Xingyu)
刘学致 (Liu Xuezhi)
万潇潇 (Wan Xiaoxiao)
张晶 (Zhang Jing)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Definitions

  • the present application relates to the field of smart terminal technology, and in particular to a method for generating transition effects and an electronic device.
  • some electronic devices (e.g., mobile phones) that support the off-screen display function can present continuous transition effects from the off-screen interface to the lock screen interface and then to the main interface.
  • the electronic device only presets some transition effects for users to choose from, and users can only choose one transition effect from the preset transition effects of the electronic device, which cannot meet the user's personalized needs for transition effects.
  • the present application provides a transition animation generation method and an electronic device, which allow a user to customize the transition animation so that it meets the user's personalized needs.
  • an embodiment of the present application provides a method for generating a transition effect, including: displaying a first interface, the first interface including a first control; detecting a user's first selection operation on the first control, and displaying a second interface, the second interface including a picture; detecting a user's second selection operation on the picture, and determining a target picture according to the second selection operation; displaying a third interface, the third interface including a template control, the template control corresponding to a preset template; detecting a user's third selection operation on the template control, determining a target template control according to the third selection operation, and determining the template corresponding to the target template control as the target template; and generating a first transition animation from the off screen to the lock screen and/or a second transition animation from the lock screen to the main interface according to the target picture and the target template.
  • the above-mentioned first interface may, for example, be a first custom transition effect editing interface in a subsequent embodiment; the above-mentioned second interface may, for example, be a picture selection interface in a subsequent embodiment; the above-mentioned third interface may, for example, be a second custom transition effect editing interface in a subsequent embodiment.
  • the method allows the user to specify a target image and a target template, and generates a first transition animation from the off screen to the lock screen and/or a second transition animation from the lock screen to the main interface based on the target image and the target template, so that the user can customize the transition effect to meet the user's personalized needs.
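The first-aspect flow above can be sketched in Python. The Template model, the crop tags, and the simple crossfade interpolation below are illustrative assumptions, not the claimed implementation:

```python
from dataclasses import dataclass

@dataclass
class Template:
    name: str
    frame_count: int  # number of interpolation frames in the animation

def make_transition(src_wallpaper: str, dst_wallpaper: str, template: Template):
    """Build a transition animation as a list of (alpha, src, dst) blend steps."""
    frames = []
    for i in range(template.frame_count):
        alpha = i / (template.frame_count - 1)  # 0.0 -> 1.0 crossfade weight
        frames.append((round(alpha, 3), src_wallpaper, dst_wallpaper))
    return frames

def generate_custom_transitions(target_picture: str, template: Template):
    # In the described method, the off-screen, lock-screen and main-interface
    # wallpapers are all derived from the same user-selected target picture.
    off_wp = f"{target_picture}#aod_crop"
    lock_wp = f"{target_picture}#lock_crop"
    main_wp = f"{target_picture}#home_crop"
    first = make_transition(off_wp, lock_wp, template)    # off screen -> lock screen
    second = make_transition(lock_wp, main_wp, template)  # lock screen -> main interface
    return first, second
```

Selecting a different target picture or template regenerates both animations, which is what makes the effect user-customizable rather than preset.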
  • generating a first transition animation from screen off to screen lock based on the target image and the target template includes: a first application determines a screen off wallpaper and a lock screen wallpaper; the first application generates the first transition animation based on the screen off wallpaper, the lock screen wallpaper and the target template.
  • the generating of the second transition animation from the lock screen to the main interface according to the target image and the target template includes: a first application determines a lock screen wallpaper and a main interface wallpaper; the first application generates the second transition animation according to the lock screen wallpaper, the main interface wallpaper and the target template.
  • the first transition animation from screen off to screen lock is generated according to the target image and the target template, including: a second application determines a screen off wallpaper and a lock screen wallpaper; when the second application determines to switch from the screen off interface to the lock screen interface, the first transition animation is generated according to the screen off wallpaper, the lock screen wallpaper and the target template.
  • the second transition animation from the lock screen to the main interface is generated according to the target image and the target template, including: a second application determines a lock screen wallpaper and a main interface wallpaper; when the second application determines to switch from the lock screen interface to the main interface, the second transition animation is generated according to the lock screen wallpaper, the main interface wallpaper and the target template.
  • it also includes: when switching from the off-screen interface to the lock-screen interface, the second application plays the first transition animation; and/or, when switching from the lock-screen interface to the main interface, the second application plays the second transition animation.
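The lazy generate-and-play behavior described above (the second application builds a transition animation only when it detects the corresponding interface switch, then plays it) can be sketched as a small state machine. The state names and the ScreenStateMachine class are assumptions for illustration:

```python
class ScreenStateMachine:
    def __init__(self, animation_factory):
        self.state = "off_screen"
        self.factory = animation_factory  # builds an animation on demand
        self.played = []                  # record of animations played

    def on_switch(self, new_state):
        old, self.state = self.state, new_state
        if (old, new_state) == ("off_screen", "lock_screen"):
            self.played.append(self.factory("first"))   # first transition animation
        elif (old, new_state) == ("lock_screen", "main_interface"):
            self.played.append(self.factory("second"))  # second transition animation
```

Generating at switch time (rather than ahead of time, as in the first-application variant) trades a small amount of latency for always using the current wallpapers.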
  • the method further includes: the first application displays a fourth interface, the fourth interface includes the target image and a first border control, the size of the first border control is the same as the size of the screen-off wallpaper border; the first application receives a first confirmation operation from the user, and determines the screen-off wallpaper according to the area of the target image in the first border control.
  • the fourth interface may be, for example, a screen-off wallpaper adjustment interface in a subsequent embodiment.
  • the method further includes: the first application displays a fifth interface, the fifth interface includes the target image, and the target image is used to adjust a display area in the fifth interface under the control of the user; the first application receives a second confirmation operation from the user, and determines the lock screen wallpaper according to the image area of the target image displayed in the fifth interface.
  • the fifth interface may be, for example, a lock screen wallpaper adjustment interface in a subsequent embodiment.
  • the method further includes: the first application displays a sixth interface, the sixth interface includes the target image, and the target image is used to adjust the display area in the sixth interface under the control of the user; the first application receives a third confirmation operation from the user, and determines the main interface wallpaper according to the image area displayed by the target image in the sixth interface.
  • the sixth interface may be, for example, a main interface wallpaper adjustment interface in a subsequent embodiment.
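A minimal sketch of how a wallpaper could be determined from the area of the target image inside a border control, as in the fourth, fifth and sixth interfaces above. The axis-aligned crop and clamping are illustrative assumptions, since the claims do not specify the geometry:

```python
def crop_area(img_w, img_h, box_x, box_y, box_w, box_h):
    """Return the (left, top, right, bottom) image region inside the border box,
    clamped to the image bounds."""
    left = max(0, box_x)
    top = max(0, box_y)
    right = min(img_w, box_x + box_w)
    bottom = min(img_h, box_y + box_h)
    if right <= left or bottom <= top:
        raise ValueError("border control does not overlap the image")
    return (left, top, right, bottom)
```

On confirmation, the pixels inside the returned region would become the off-screen, lock-screen or main-interface wallpaper; real implementations would also handle scaling and aspect ratio.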
  • an embodiment of the present application provides an electronic device, comprising: a processor; a memory; wherein one or more computer programs are stored in the memory, and the one or more computer programs include instructions, which, when executed by the processor, enable the electronic device to execute any one of the methods described in the first aspect.
  • an embodiment of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, which, when executed on a computer, enables the computer to execute any of the methods described in the first aspect.
  • the present application provides a computer program, which, when executed by a computer, is used to execute the method of the first aspect.
  • the program in the fourth aspect may be stored in whole or in part on a storage medium packaged together with the processor, or may be stored in whole or in part on a memory not packaged together with the processor.
  • FIG. 1 is a schematic diagram of a structure of an electronic device provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the software structure of the electronic device according to an embodiment of the present application.
  • FIG. 3A to FIG. 3C are schematic diagrams of interfaces of a method for generating transition effects according to an embodiment of the present application.
  • FIG. 4 is another schematic diagram of an interface of a transition effect according to an embodiment of the present application.
  • FIG. 5 is a schematic flow chart of a method for generating transition effects according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a method for generating a transition animation in an embodiment of the present application.
  • FIG. 7 is a schematic flow chart of a method for generating transition effects according to an embodiment of the present application.
  • FIG. 8 is a schematic flow chart of a method for generating transition effects according to an embodiment of the present application.
  • FIG. 9 is a flow chart of a method for generating transition effects according to an embodiment of the present application.
  • Some electronic devices (e.g., mobile phones) that support the off-screen display function can support the sharing of continuous transition effects from the off-screen to the lock screen and then to the main interface.
  • the above-mentioned transition effect is also called AOD one-shot.
  • the electronic device can provide users with a variety of transition effects for users to choose from. Once the user selects a transition effect, the off-screen wallpaper, lock screen wallpaper, main interface wallpaper, the first transition animation from the off-screen to the lock screen, and the second transition animation from the lock screen to the main interface are set accordingly.
  • the electronic device displays the off-screen interface in the off-screen state, and the off-screen interface includes the off-screen wallpaper; when the electronic device switches from the off-screen interface to the lock screen interface, the first transition animation from the off-screen to the lock screen is played, and then the lock screen interface is displayed, and the lock screen interface includes the lock screen wallpaper; when the electronic device switches from the lock screen interface to the main interface, the second transition animation from the lock screen to the main interface is played, and then the main interface is displayed, and the main interface includes the main interface wallpaper.
  • the first transition animation generally includes the transition effect from the off screen wallpaper to the lock screen wallpaper
  • the second transition animation generally includes the transition effect from the lock screen wallpaper to the main interface wallpaper
  • the electronic device only presets some transition effects for the user to choose from, and the user can only select one transition effect from the preset transition effects of the electronic device, which cannot meet the user's personalized needs for transition effects.
  • the present application proposes a transition animation generation method and an electronic device, which allow users to customize the transition animation so that the transition animation meets the user's personalized needs.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine some components, or split some components, or arrange the components differently.
  • the components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • Different processing units may be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals according to the instruction operation code and timing signal to complete the control of instruction fetching and execution.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory may store instructions or data that the processor 110 has just used or cyclically used. If the processor 110 needs to use the instruction or data again, it may be directly called from the memory. This avoids repeated access, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
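The caching behavior described above (keep recently used instructions or data close to the processor so a repeated access avoids going back to main memory) can be sketched as a small least-recently-used store; the capacity and string keys below are illustrative:

```python
from collections import OrderedDict

class TinyCache:
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.entries = OrderedDict()  # insertion order tracks recency

    def get(self, key):
        if key in self.entries:
            self.entries.move_to_end(key)  # mark as most recently used
            return self.entries[key]
        return None  # cache miss: caller must fetch from main memory

    def put(self, key, value):
        self.entries[key] = value
        self.entries.move_to_end(key)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used
```

A hit avoids the slow fetch, which is exactly the waiting-time reduction the passage describes.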
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple groups of I2C buses.
  • the processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface, thereby realizing the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 can include multiple I2S buses.
  • the processor 110 can be coupled to the audio module 170 via the I2S bus to achieve communication between the processor 110 and the audio module 170.
  • the audio module 170 can transmit an audio signal to the wireless communication module 160 via the I2S interface to achieve the function of answering a call through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 can be coupled via a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 via the PCM interface to realize the function of answering calls via a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit an audio signal to the wireless communication module 160 through the UART interface to implement the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193.
  • the MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), etc.
  • the processor 110 and the camera 193 communicate via the CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate via the DSI interface to implement the display function of the electronic device 100.
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, etc.
  • the GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically can be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration and does not constitute a structural limitation on the electronic device 100.
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from a wired charger through the USB interface 130.
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. While the charging management module 140 is charging the battery 142, it may also power the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle number, battery health status (leakage, impedance), etc.
  • the power management module 141 can also be set in the processor 110.
  • the power management module 141 and the charging management module 140 can also be set in the same device.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve the utilization of antennas.
  • antenna 1 can be reused as a diversity antenna for a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide solutions for wireless communications including 2G/3G/4G/5G, etc., applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), etc.
  • the mobile communication module 150 may receive electromagnetic waves from the antenna 1, and perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 may also amplify the signal modulated by the modulation and demodulation processor, and convert it into electromagnetic waves for radiation through the antenna 1.
  • at least some of the functional modules of the mobile communication module 150 may be arranged in the processor 110.
  • at least some of the functional modules of the mobile communication module 150 may be arranged in the same device as at least some of the modules of the processor 110.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) and the like applied to the electronic device 100.
  • the wireless communication module 160 can be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, modulate the frequency, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or a satellite based augmentation system (SBAS).
  • the electronic device 100 implements the display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through ISP, camera 193, video codec, GPU, display screen 194 and application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, and the light is transmitted to the camera photosensitive element through the lens. The light signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converts it into an image visible to the naked eye.
  • the ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image. The ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP can be set in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to be converted into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard RGB, YUV or other format.
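The YUV-to-RGB conversion attributed to the DSP can be illustrated with the common BT.601 full-range formulas; real pipelines may use other matrices or ranges, so this is only a sketch:

```python
def yuv_to_rgb(y, u, v):
    """Convert one YUV pixel (components 0-255, U/V centered at 128)
    to a clamped (R, G, B) tuple using BT.601 full-range coefficients."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(r), clamp(g), clamp(b)
```

With U and V at their neutral value of 128, the chroma terms vanish and the pixel stays gray, which is a quick sanity check on the coefficients.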
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals, and can process not only digital image signals but also other digital signals. For example, when the electronic device 100 is selecting a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
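The Fourier transform on frequency point energy mentioned above can be illustrated with a plain discrete Fourier transform over a sampled signal; this pure-Python DFT is for illustration only, not a production FFT:

```python
import cmath

def dft_energy(samples):
    """Return the energy (|X[k]|^2) at each frequency bin of the signal."""
    n = len(samples)
    energies = []
    for k in range(n):
        acc = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                  for t in range(n))
        energies.append(abs(acc) ** 2)
    return energies
```

For a pure tone, the energy concentrates in the bin matching its frequency (and its mirror bin), which is how a frequency point can be selected by comparing bin energies.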
  • Video codecs are used to compress or decompress digital videos.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a variety of coding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
  • NPU is a neural network (NN) computing processor.
  • through the NPU, applications such as intelligent cognition of the electronic device 100 can be implemented, for example, image recognition, face recognition, voice recognition, text understanding, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and videos can be stored in the external memory card.
  • the internal memory 121 can be used to store computer executable program codes, which include instructions.
  • the internal memory 121 may include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application required for at least one function (such as a sound playback function, an image playback function, etc.), etc.
  • the data storage area may store data created during the use of the electronic device 100 (such as audio data, a phone book, etc.), etc.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, a universal flash storage (UFS), etc.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions such as music playing and recording through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone jack 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 170 can also be used to encode and decode audio signals.
  • The audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
  • The speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal.
  • the electronic device 100 can listen to music or listen to a hands-free call through the speaker 170A.
  • The receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be received by placing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic" or "sound transducer", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 can be provided with at least one microphone 170C. In other embodiments, the electronic device 100 can be provided with two microphones 170C, which can not only collect sound signals but also realize noise reduction function. In other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify the sound source, realize directional recording function, etc.
  • the earphone interface 170D is used to connect a wired earphone.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A can be set on the display screen 194.
  • For example, a capacitive pressure sensor may include at least two parallel plates of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the plates changes, and the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the touch operation intensity according to the pressure sensor 180A.
  • the electronic device 100 can also calculate the touch position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities can correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
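The pressure-based dispatch described above can be sketched as follows. This is an illustration only, not the patent's implementation; the threshold value and function names are assumptions.

```python
# Illustrative sketch: dispatching a touch on the short message (SMS) app icon
# by touch operation intensity, as described above.
FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure; hypothetical value

def dispatch_touch(pressure: float) -> str:
    """Return the instruction executed for a touch on the SMS app icon."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        # light press: view the short message
        return "view_short_message"
    # firm press (>= threshold): create a new short message
    return "create_new_short_message"
```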
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100.
  • The angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for anti-shake shooting. For example, when the shutter is pressed, the gyro sensor 180B detects the angle of the electronic device 100 shaking, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
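The patent does not specify how altitude is derived from pressure; one common choice is the international barometric formula, sketched below for illustration.

```python
# Illustrative sketch (assumption, not from the patent): estimating altitude
# from the air pressure measured by a barometric sensor using the
# international barometric formula.
def altitude_m(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    """Return the approximate altitude in meters for a pressure reading in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

At standard sea-level pressure the formula yields 0 m, and lower pressure readings map to higher altitudes.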
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • When the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then automatically unlock according to the detected opening or closing state of the leather case or the flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in all directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of the electronic device and is applied to applications such as horizontal and vertical screen switching and pedometers.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outward through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photography, fingerprint call answering, etc.
  • the temperature sensor 180J is used to detect temperature.
  • The electronic device 100 executes a temperature handling strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 due to low temperature. In other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown due to low temperature.
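The temperature handling strategy above can be sketched as a simple policy function. The threshold values below are hypothetical; the patent only states that thresholds exist.

```python
# Illustrative sketch of the temperature handling strategy described above.
# Threshold values are assumptions for illustration.
HIGH_TEMP_C = 45.0   # assumed overheating threshold
LOW_TEMP_C = -10.0   # assumed low-temperature threshold

def temperature_policy(temp_c: float) -> list:
    """Return the actions taken for a given reported temperature."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        # thermal protection: throttle the nearby processor
        actions.append("reduce_processor_performance")
    elif temp_c < LOW_TEMP_C:
        # avoid abnormal low-temperature shutdown
        actions.append("heat_battery")
        actions.append("boost_battery_output_voltage")
    return actions
```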
  • the touch sensor 180K is also called a "touch control device”.
  • the touch sensor 180K can be set on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch control screen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K can also be set on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can obtain a vibration signal. In some embodiments, the bone conduction sensor 180M can obtain a vibration signal of a vibrating bone block of the vocal part of the human body. The bone conduction sensor 180M can also contact the human pulse to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M can also be set in an earphone and combined into a bone conduction earphone.
  • the audio module 170 can parse out a voice signal based on the vibration signal of the vibrating bone block of the vocal part obtained by the bone conduction sensor 180M to realize a voice function.
  • the application processor can parse the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M to realize a heart rate detection function.
  • the key 190 includes a power key, a volume key, etc.
  • the key 190 may be a mechanical key or a touch key.
  • the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
  • Motor 191 can generate vibration prompts.
  • Motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects.
  • Different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power changes, messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195.
  • the electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 can also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communications.
  • the electronic device 100 uses an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • a layered architecture software system is used as an example to exemplify the software structure of the electronic device 100.
  • FIG. 2 is a software structure block diagram of the electronic device 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, each with a clear role and division of labor.
  • the layers communicate with each other through software interfaces.
  • the electronic device 100 of the embodiment of the present application includes: an application layer and a kernel layer.
  • the software system of the electronic device 100 may also include more or fewer layers.
  • the (Android) system may also include an application framework layer, an Android runtime, and system libraries.
  • the application layer may include several applications. As shown in Figure 2, the application may include a first application and a second application, and may optionally include: a camera, a gallery, a call, a short message, etc.
  • the first application is used to support users in setting transition effects, such as a setting application in an electronic device;
  • the second application is used to display the screen-off interface, the lock screen interface, and the main interface, and is also used to play the first transition animation from the screen-off interface to the lock screen when switching from the screen-off interface to the lock screen interface, and play the second transition animation from the lock screen to the main interface when switching from the lock screen interface to the main interface;
  • the gallery application is used to manage pictures in the electronic device, such as photos, etc.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can contain display drivers, camera drivers, etc.
  • FIG. 3A and FIG. 3B are interface diagrams of a method for generating transition effects according to an embodiment of the present application, as shown in FIG. 3A and FIG. 3B , including:
  • the user opens a first setting interface of the first application, such as shown in interface 300.
  • the first setting interface includes an "AOD all-round” control.
  • the user selects the "AOD all-round” control.
  • the first application detects the selection operation for the "AOD all-round” control.
  • the first application displays a transition effect selection interface, such as that shown in interface 310 .
  • the transition effect selection interface includes: style controls, such as “custom” controls, “style 1” controls, “style 2” controls, etc.
  • The "Style 1" control, "Style 2" control, etc. correspond to different transition effects preset in the electronic device; users can select them but cannot edit them.
  • the specific implementation of the above transition effects can refer to the existing related technical implementations, which will not be repeated here.
  • a "custom” control corresponding to a custom transition effect is added, such as shown in control 311 in interface 310; the user selects the "custom” control, and accordingly, the first application detects the user's selection operation on the "custom” control and displays a custom transition effect editing interface, such as shown in interface 320.
  • the custom transition animation editing interface includes: picture selection controls, such as control 321; screen-off wallpaper borders, such as border 322; border style controls, such as “circle” controls, “triangle” controls, etc.; animation effect controls, such as “effect 1" controls, "effect 2" controls, etc.
  • Different border style controls correspond to different styles of screen-off wallpaper borders, and different animation effect controls correspond to different animation effects.
  • the border style of the displayed border 322 is circular, which can be a preset default border style.
  • the user can change the border style of the border 322 by selecting the border style control, for example, changing it to a triangle, square, etc.
  • the first application detects the user's selection operation on the picture selection control, and the first application displays a picture selection interface, such as interface 330.
  • the picture selection interface displays a number of pictures, which may include: pictures provided by the first application to the user, and/or pictures obtained by the first application from other applications of the electronic device, such as a gallery application.
  • the user selects a picture, and the first application detects the user's selection operation on the picture and determines the picture as the target picture.
  • the first application displays an off-screen wallpaper adjustment interface, such as shown in interface 340.
  • the off-screen wallpaper adjustment interface displays: a target image, a border control such as shown in control 341, and an off-screen wallpaper adjustment confirmation control, such as an "OK" control 342.
  • the style and size of the border control can be equal to the style and size of the off-screen wallpaper border.
  • In the off-screen wallpaper adjustment interface, the target image can be adjusted in position and/or size, and the border control 341 can be adjusted in position.
  • the user can adjust the position and/or size of the target image to adjust the area of the target image displayed in the border control.
  • The user clicks the adjustment confirmation control in the off-screen wallpaper adjustment interface, such as the "OK" control 342.
  • the first application detects the user's selection operation for the off-screen wallpaper adjustment confirmation control, that is, detects the user's adjustment confirmation operation for the off-screen wallpaper, and records the setting parameters of the off-screen wallpaper.
  • Taking the circular control 341 as an example, the coordinates of the diagonal vertices of the inscribed square of the control 341 in the target image and the scaling ratio of the target image can be recorded.
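The parameter recording for a circular border control can be sketched as follows. This is an illustration only; the coordinate convention and names are assumptions, not the patent's implementation. The inscribed square of a circle of radius r has its diagonal vertices at a distance r/√2 from the center along each axis.

```python
import math

# Illustrative sketch: computing the setting parameters recorded for the
# off-screen wallpaper when the border control is a circle, i.e. the diagonal
# vertices of the circle's inscribed square (in target-image coordinates)
# plus the current scaling ratio of the target image.
def off_screen_params(cx: float, cy: float, r: float, scale: float) -> dict:
    """cx, cy: circle center in target-image coordinates; r: circle radius."""
    h = r / math.sqrt(2.0)  # half the side length of the inscribed square
    return {
        "top_left": (cx - h, cy - h),
        "bottom_right": (cx + h, cy + h),
        "scale": scale,
    }
```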
  • the first application displays a lock screen wallpaper adjustment interface, such as shown in interface 350.
  • the lock screen wallpaper adjustment interface displays: a target image, a lock screen wallpaper adjustment confirmation control, such as an "OK" control 351.
  • the target image displayed in the lock screen wallpaper adjustment interface can be adjusted in position and/or size.
  • the user can adjust the position and/or size of the target image to adjust the image area of the target image displayed in the lock screen wallpaper adjustment interface.
  • The user clicks the lock screen wallpaper adjustment confirmation control, such as the "OK" control 351.
  • the first application detects the user's selection operation on the lock screen wallpaper adjustment confirmation control, that is, detects the user's adjustment confirmation operation on the lock screen wallpaper, and records the setting parameters of the lock screen wallpaper.
  • the setting parameters of the lock screen wallpaper may include: the coordinates of the diagonal vertices of the image area (i.e., the lock screen wallpaper) displayed in the lock screen wallpaper adjustment interface in the target image and the scaling ratio of the target image.
  • the first application displays the main interface wallpaper adjustment interface, such as shown in interface 360.
  • the main interface wallpaper adjustment interface displays: a target image, a main interface wallpaper confirmation control such as an "OK" control 361.
  • the target image displayed in the main interface wallpaper adjustment interface can be adjusted in position and/or size.
  • the user can adjust the position and/or size of the target image. Accordingly, the area of the target image displayed in the main interface wallpaper adjustment interface can be adjusted.
  • The user clicks the main interface wallpaper adjustment confirmation control, such as the "OK" control 361.
  • the first application detects the user's selection operation on the main interface wallpaper adjustment confirmation control, that is, detects the user's adjustment confirmation operation on the main interface wallpaper, and records the setting parameters of the main interface wallpaper.
  • the setting parameters of the main interface wallpaper may include: the coordinates of the diagonal vertices of the image area (i.e., the main interface wallpaper) displayed in the main interface wallpaper adjustment interface in the target image and the scaling ratio of the target image.
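The lock screen and main interface wallpaper parameters above are both of the form "diagonal vertices of the displayed image area plus a scaling ratio". A sketch of how such parameters could be derived from the user's pan and zoom follows; the function name, viewport convention, and parameters are assumptions for illustration.

```python
# Illustrative sketch: deriving the recorded crop parameters for the lock
# screen (or main interface) wallpaper from the adjustment interface state.
# At zoom ratio z, one viewport pixel covers 1/z target-image pixels.
def crop_params(viewport_w: float, viewport_h: float,
                offset_x: float, offset_y: float, zoom: float) -> dict:
    """offset_x/offset_y: top-left of the visible area in target-image coords."""
    return {
        "top_left": (offset_x, offset_y),
        "bottom_right": (offset_x + viewport_w / zoom,
                         offset_y + viewport_h / zoom),
        "scale": zoom,
    }
```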
  • the first application returns to display the custom transition animation editing interface.
  • the image selection control in the custom transition animation editing interface can display the target image or a partial area of the target image, such as shown in interface 370.
  • the user selects a border style control in the custom transition animation editing interface, such as a "circle" control. Accordingly, the first application detects the user's selection operation on the border style control and determines the border style corresponding to the border style control indicated by the selection operation as the border style of the screen-off wallpaper. It should be noted that as the border style changes, the border 322 changes, and the image area displayed in the border 322 can also change adaptively. If the user wants to adjust the image displayed in the border 322, he can reselect the picture selection control to adjust the image area in the border 322.
  • the user selects an animation effect control in the custom transition animation editing interface, such as the "Animation Effect 1" control. Accordingly, the first application detects the user's selection operation on the animation effect item control, and determines the animation effect corresponding to the animation effect control indicated by the selection operation as the animation effect of the custom transition animation.
  • the custom transition animation editing interface may include an editing confirmation control, such as shown in the "Apply" control in interface 370.
  • the user selects the editing confirmation control.
  • the first application detects the user's selection operation on the editing confirmation control and determines that the editing of the custom transition animation is completed. Then, the first application can set the off-screen wallpaper according to the setting parameters of the aforementioned off-screen wallpaper, set the lock screen wallpaper according to the setting parameters of the lock screen wallpaper, and set the main interface wallpaper according to the setting parameters of the main interface wallpaper.
  • the first application determines the off-screen wallpaper, lock screen wallpaper, main interface wallpaper, target border style and target animation effect of the custom transition animation.
  • the first application can generate a first transition animation based on the off-screen wallpaper, the lock-screen wallpaper, the target border style, and the target animation effect, and generate a second transition animation based on the lock-screen wallpaper, the main interface wallpaper, and the target animation effect, and store the first transition animation and the second transition animation as the transition effects set by the user.
  • When the second application determines to switch from the off-screen interface to the lock-screen interface, the first transition animation is played.
  • When the second application determines to switch from the lock-screen interface to the main interface, the second transition animation is played.
  • the first application can store the above-mentioned target image, the setting parameters of the off-screen wallpaper, the setting parameters of the lock-screen wallpaper, the setting parameters of the main interface wallpaper, and the target template as the information of the above-mentioned custom transition animation set by the user.
  • When the second application determines to switch from the off-screen interface to the lock-screen interface, the first transition animation is generated and played according to the off-screen wallpaper, the lock-screen wallpaper, the target border style, and the target animation effect.
  • When the second application determines to switch from the lock-screen interface to the main interface, the second transition animation is generated and played according to the lock-screen wallpaper, the main interface wallpaper, and the target animation effect.
  • animation effect 1 may include: the first transition animation is that the border of the screen-off wallpaper is continuously enlarged until it disappears from the boundary of the display screen, and the display area of the screen-off wallpaper is enlarged together with the border of the screen-off wallpaper until the complete lock screen wallpaper is displayed, and the second transition animation is gradually enlarged (or reduced) from the lock screen wallpaper to the main interface wallpaper;
  • animation effect 2 may include: the first transition animation is that the border of the screen-off wallpaper is continuously rotated and enlarged until it disappears from the boundary of the display screen, and the display area of the screen-off wallpaper is rotated and enlarged together with the border of the screen-off wallpaper until the complete lock screen wallpaper is displayed, and the second transition animation is continuously rotated and enlarged (or reduced) from the lock screen wallpaper to the main interface wallpaper; and so on.
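The continuous enlarging and rotating described for animation effect 2 can be sketched as a per-frame interpolation over normalized time t ∈ [0, 1]. The end values and the linear easing are assumptions for illustration; the patent does not specify them.

```python
# Illustrative sketch of how "animation effect 2" above could parameterize a
# frame: the border scale and rotation are interpolated over normalized time t.
def effect2_frame(t: float, start_scale: float = 1.0, end_scale: float = 4.0,
                  total_rotation_deg: float = 180.0):
    """Return (scale, rotation_deg) of the screen-off wallpaper border at time t."""
    t = max(0.0, min(1.0, t))                            # clamp to [0, 1]
    scale = start_scale + (end_scale - start_scale) * t  # enlarge continuously
    rotation = total_rotation_deg * t                    # rotate continuously
    return scale, rotation
```

The animation ends when the enlarged border has moved past the display boundary, at which point the complete lock screen wallpaper is shown.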
  • the border style and animation effect in the above embodiment constitute the template of the transition effect.
  • The first application can provide only template controls for the user, such as in the custom transition effect editing interface 380 shown in FIG. 3C, which includes several template controls. Each template control corresponds to a preset template, and each preset template includes a border style and an animation effect; that is, the first application pre-combines the border style and the animation effect, and the combination is no longer specified by the user.
  • Template 1 may include: the border style is a circle, and the animation effect is the above-mentioned animation effect 1;
  • Template 2 may include: the border style is a circle, and the animation effect is the above-mentioned animation effect 2;
  • Template 3 may include: the border style is a square, and the animation effect is the above-mentioned animation effect 1.
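The templates above amount to a fixed mapping from template to a (border style, animation effect) pair, as sketched below. The dictionary keys are illustrative names, not identifiers from the patent.

```python
# Illustrative sketch: the preset templates described above as a fixed mapping,
# so the user picks one template control instead of combining border style and
# animation effect separately.
TEMPLATES = {
    "template_1": {"border_style": "circle", "animation_effect": "effect_1"},
    "template_2": {"border_style": "circle", "animation_effect": "effect_2"},
    "template_3": {"border_style": "square", "animation_effect": "effect_1"},
}

def resolve_template(name: str):
    """Return the (target border style, target animation effect) of a template."""
    t = TEMPLATES[name]
    return t["border_style"], t["animation_effect"]
```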
  • the user selects a template control of the custom transition effect editing interface, such as the "Template 1" control, and accordingly, the first application detects the user's selection operation on the template control, and determines the template corresponding to the template control indicated by the selection operation as the target template.
  • the first application determines the target template, that is, determines the target border style and the target animation effect.
  • the subsequent implementation can refer to the above-mentioned embodiment, which is not repeated here.
  • For example, the first transition animation from screen-off to lock screen generated by the first application is that the screen-off wallpaper border is continuously enlarged until it disappears from the boundary of the display screen, and the screen-off wallpaper display area is enlarged together with the screen-off wallpaper border until the complete lock screen wallpaper is displayed.
  • the display interface of the first transition animation from the screen off to the lock screen is shown as interface 400, interface 410 and interface 420 in FIG. 4 ;
  • the second transition animation from the lock screen to the main interface generated by the first application is that the lock screen wallpaper gradually enlarges to the main interface wallpaper, and the display interface of the second transition animation from the lock screen to the main interface is shown as interface 420 and interface 430 in Figure 4.
  • the custom transition animation includes a first transition animation from the screen off to the lock screen and a second transition animation from the lock screen to the main interface.
  • the above interface implementation can be adaptively reduced. For example, if the custom transition animation only includes the first transition animation from the screen off to the lock screen, then the above main interface wallpaper adjustment interface can be omitted, and the related processing for generating the second transition animation in subsequent steps can be omitted.
  • The above description takes, as an example, the case where the user adjusts and sets the off-screen wallpaper, the lock screen wallpaper, and the main interface wallpaper according to the target picture.
  • the first application can also automatically process the off-screen wallpaper, lock screen wallpaper and main interface wallpaper according to the target picture and/or the above-mentioned target template, and the above-mentioned off-screen wallpaper adjustment interface, lock screen wallpaper adjustment interface and main interface wallpaper adjustment interface can be omitted.
  • FIG. 5 is a flow chart of a method for generating transition effects according to an embodiment of the present application. As shown in FIG. 5 , the method includes:
  • Step 501: The first application displays a first setting interface, and the first setting interface includes a transition animation setting control.
  • the first setting interface is shown as interface 300, for example.
  • Step 502: The first application detects a user's selection operation on the transition animation setting control and displays a transition effect selection interface.
  • The transition effect selection interface includes style controls, and the style controls include a custom control.
  • The transition effect selection interface is shown as interface 310, for example.
  • Step 503: The first application detects a user's selection operation on the custom control and displays a first custom transition animation editing interface.
  • the first custom transition animation editing interface is shown as interface 320 , for example, and includes: a picture selection control, for example, as shown as control 321 .
  • Step 504: The first application detects a user selection operation on the image selection control and displays an image selection interface.
  • the picture selection interface is shown as interface 330, for example.
  • the first application displays the picture selection interface, which may include:
  • the first application sends a request for obtaining a picture to the gallery application
  • the gallery application receives the image acquisition request and sends the image to the first application
  • the first application displays the received picture in the picture selection interface.
  • the gallery application can send all or part of the managed pictures to the first application.
  • the embodiment of the present application is not limited to this. For example, if the pictures in the gallery application are set with different access permissions, the gallery application can send the pictures to which the first application has access permissions to the first application.
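The permission-filtered picture response described above can be sketched as follows. The data model (per-picture permission lists and a public flag) is an assumption for illustration.

```python
# Illustrative sketch: the gallery application answering the first
# application's picture acquisition request, returning only the pictures the
# requesting application has access permission for.
def handle_picture_request(pictures: list, requester: str) -> list:
    """pictures: dicts with 'name', optional 'allowed_apps' and 'public' keys."""
    return [
        p["name"] for p in pictures
        if requester in p.get("allowed_apps", []) or p.get("public", False)
    ]
```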
  • Step 505: The first application detects a user's selection operation on a picture in the picture selection interface, determines the picture indicated by the selection operation as a target picture, and determines an off-screen wallpaper, a lock screen wallpaper, and a main interface wallpaper based on the target picture.
  • Determining the screen-off wallpaper according to the target picture may include:
  • the first application displays a screen-off wallpaper adjustment interface, such as that shown in interface 340;
  • the first application detects a selection operation on an adjustment confirmation control for the screen-off wallpaper, and determines setting parameters of the screen-off wallpaper according to the position parameters, in the target picture, of the image area within a border control.
  • The above position parameters can be the coordinates of the diagonal vertices of the inscribed square of the border control, as shown in Figure 6; if the border control is itself a square, the position parameters can be the coordinates of the diagonal vertices of the border control.
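As an illustrative sketch only (not part of the claimed embodiment; the function names and the top-left/bottom-right coordinate convention are assumptions), the diagonal-vertex position parameters described above could be computed as follows for a circular or a square border control:

```python
import math

def inscribed_square_vertices(cx: float, cy: float, r: float):
    """Diagonal vertices (top-left, bottom-right) of the square
    inscribed in a circular border with center (cx, cy) and radius r."""
    half = r / math.sqrt(2.0)  # half of the inscribed square's side length
    return (cx - half, cy - half), (cx + half, cy + half)

def square_border_vertices(x: float, y: float, side: float):
    """For a square border control, the position parameters are simply
    the border's own diagonal vertices."""
    return (x, y), (x + side, y + side)
```

For example, a circular border of radius √2 centered at the origin yields the diagonal vertices (-1, -1) and (1, 1).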
  • Determining the lock screen wallpaper based on the target picture may include:
  • the first application displays a lock screen wallpaper adjustment interface, such as that shown in interface 350;
  • the first application detects a selection operation on a lock screen wallpaper adjustment confirmation control, and determines setting parameters of the lock screen wallpaper according to the position parameters, in the target picture, of the image area displayed in the lock screen wallpaper adjustment interface.
  • Determining the main interface wallpaper based on the target picture may include:
  • the first application displays a main interface wallpaper adjustment interface, such as that shown in interface 360;
  • the first application detects a selection operation on a main interface wallpaper adjustment confirmation control, and determines setting parameters of the main interface wallpaper according to the position parameters, in the target picture, of the image area displayed in the main interface wallpaper adjustment interface.
  • Alternatively, the setting parameters of the screen-off wallpaper, the setting parameters of the lock screen wallpaper, and the setting parameters of the main interface wallpaper can be preset. In that case, in this step, the screen-off wallpaper, the lock screen wallpaper, and the main interface wallpaper can be determined according to the target picture and the above setting parameters.
  • The above preset setting parameters of the screen-off wallpaper, the lock screen wallpaper, and the main interface wallpaper can correspond to the templates in the following steps; that is, different templates correspond to different setting parameters of the screen-off wallpaper, the lock screen wallpaper, and the main interface wallpaper. In this case, this step can be performed after step 506.
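Purely as a sketch of how preset per-template setting parameters might map one target picture to the three wallpapers (the dictionary layout, key names, and nested-list pixel representation are illustrative assumptions, not the embodiment's actual data format):

```python
def crop_region(image, top_left, bottom_right):
    """Crop an image area given its diagonal vertices (x1, y1) and (x2, y2).
    `image` is modeled as a list of pixel rows for simplicity; a real
    implementation would operate on a bitmap object instead."""
    (x1, y1), (x2, y2) = top_left, bottom_right
    return [row[x1:x2] for row in image[y1:y2]]

def wallpapers_from_template(image, template):
    """Derive the screen-off, lock screen and main interface wallpapers
    from one target picture using a template's setting parameters."""
    return {name: crop_region(image, *area) for name, area in template.items()}
```

For a 4x4 target picture, a template such as {"screen_off": ((1, 1), (3, 3)), ...} would produce a 2x2 screen-off wallpaper, and a different template would yield different wallpapers from the same picture.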
  • Step 506 The first application displays a second custom transition animation editing interface, where the second custom transition animation editing interface includes: a template control.
  • The second custom transition animation editing interface is shown, for example, as interface 380.
  • Step 507 The first application detects a selection operation of the user on a target template control, and determines the template corresponding to the target template control indicated by the selection operation as the target template.
  • The target template may include: border style parameters of the screen-off wallpaper border, and animation effect parameters.
  • Step 508 The first application detects the user's editing confirmation operation and determines that the editing of the custom transition animation effect is completed.
  • At this time, the screen-off wallpaper, the lock screen wallpaper, and the main interface wallpaper can be set.
  • There is no limitation on the execution order between step 504 and step 506 in the embodiment of the present application.
  • Step 509 The first application generates a first transition animation from the screen-off interface to the lock screen interface according to the screen-off wallpaper, the lock screen wallpaper, and the target template, and generates a second transition animation from the lock screen interface to the main interface according to the lock screen wallpaper, the main interface wallpaper, and the target template.
  • The border style parameters of the screen-off wallpaper border may include: the shape of the border, such as a circle, a triangle, or a square; the initial display position of the border on the display screen; and so on.
  • the animation effect parameters may include: animation effect parameters of the first transition animation, and animation effect parameters of the second transition animation.
  • the first transition animation may include a mask layer and a background layer, the mask layer is used to display the mask and the screen-off wallpaper border, and the background layer is used to display the image.
  • The animation effect parameters of the first transition animation may include, but are not limited to: a change indication parameter of the screen-off wallpaper border, used to indicate the change mode of the screen-off wallpaper border in the first transition animation, for example indicating that the border continuously expands or shrinks, and optionally further indicating the expansion or reduction ratio of the border;
  • a change indication parameter of the mask, used to indicate the change mode of the mask layer in the first transition animation, which may include, for example: the initial size and initial display position of the mask, whether the mask gradually enlarges (or shrinks), the transparency of the mask area inside the screen-off wallpaper border, and the transparency of the mask area outside the screen-off wallpaper border; and a change indication parameter of the background layer, used to indicate the change mode from the screen-off wallpaper to the lock screen wallpaper in the first transition animation, such as continuously enlarging the displayed image area.
  • The animation effect parameters of the second transition animation may include, but are not limited to: a change indication parameter of the background layer, used to indicate the change mode from the lock screen wallpaper to the main interface wallpaper in the second transition animation, such as gradient, or rotation plus gradient.
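To make the enumeration above concrete, the border style and animation effect parameters of a target template might be grouped as in the following sketch; every field name and default value here is an illustrative assumption rather than the embodiment's actual data structure:

```python
from dataclasses import dataclass, field

@dataclass
class BorderStyle:
    shape: str = "circle"                 # e.g. "circle", "triangle", "square"
    initial_position: tuple = (0.5, 0.3)  # normalized position on the display
    initial_size: float = 0.2             # fraction of the screen width

@dataclass
class AnimationEffect:
    border_change: str = "expand"      # border continuously expands or shrinks
    border_ratio: float = 1.1          # per-frame expansion/reduction ratio
    mask_inside_alpha: float = 0.0     # transparency inside the wallpaper border
    mask_outside_alpha: float = 1.0    # transparency outside the wallpaper border
    background_change: str = "gradient"  # e.g. "gradient" or "rotate+gradient"

@dataclass
class TransitionTemplate:
    border: BorderStyle = field(default_factory=BorderStyle)
    first_animation: AnimationEffect = field(default_factory=AnimationEffect)
    second_animation: AnimationEffect = field(default_factory=AnimationEffect)
```

Under this grouping, selecting a different target template control would simply bind a different TransitionTemplate instance to the animation generation step.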
  • The size and display position of the screen-off wallpaper border in each frame of the first transition animation can be determined according to the initial size and initial display position of the screen-off wallpaper border and the change mode of the screen-off wallpaper border; the size, display position, and per-pixel transparency of the mask in each frame can be determined according to the change indication parameters of the mask;
  • the image displayed in each frame of the first transition animation can be calculated according to the change indication parameters of the background layer, combined with the setting parameters of the screen-off wallpaper and the setting parameters of the lock screen wallpaper. Based on the above processing, the first transition animation can be generated.
  • Similarly, the first application can generate the second transition animation.
  • For example, in the initial frame of the first transition animation, the position coordinates on the display screen of the diagonal vertices A1 and B1 of the inscribed square of the screen-off wallpaper border can be determined according to the initial size and initial display position of the screen-off wallpaper border;
  • the position coordinates on the display screen of the diagonal vertices C1 and D1 of the mask can be determined according to the initial size and initial display position of the mask;
  • and the screen-off wallpaper, that is, the image displayed within the screen-off wallpaper border, can be determined according to the setting parameters of the screen-off wallpaper.
  • The end frame of the first transition animation is shown as 630 in Figure 6.
  • In the end frame, the diagonal vertices of the lock screen wallpaper are A3 and B3,
  • and the diagonal vertices of the mask can also be A3 and B3.
  • The position coordinates of A3 and B3 on the display screen can be determined according to the preset end size and end display position of the mask, or according to the display position of the lock screen wallpaper on the display screen; the lock screen wallpaper itself can be determined according to the setting parameters of the lock screen wallpaper.
  • Based on the position coordinates on the display screen of the diagonal vertices C1 and D1 of the mask in the initial frame and of the diagonal vertices A3 and B3 of the mask in the end frame, the position coordinates of the mask's diagonal vertices in each intermediate animation frame can be calculated by interpolation or other methods.
  • If the animation effect parameters specify that the screen-off wallpaper border gradually enlarges at a preset ratio until it disappears beyond the boundary of the display screen, then, given the initial size and initial display position of the screen-off wallpaper border in the initial frame (such as the position coordinates of A1 and B1 on the display screen) and the border enlargement ratio, the size and display position of the screen-off wallpaper border in each animation frame between the initial frame and the end frame can be calculated, such as the position coordinates of points A2 and B2 on the display screen shown in 620 in FIG. 6.
  • If the animation effect parameters specify that the displayed image area continuously increases, then, given the position coordinates of the diagonal vertices of the screen-off wallpaper in the target picture and its scaling ratio in the initial frame, and the position coordinates of the diagonal vertices of the lock screen wallpaper in the target picture and its scaling ratio in the end frame, the position coordinates of the diagonal vertices of the image displayed in each animation frame between the initial frame and the end frame, and the scaling ratio of the target picture, can be calculated.
  • the first transition animation can be generated according to the above calculation results.
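The per-frame calculation described above can be illustrated with linear interpolation, which is only one of the "interpolation or other methods" mentioned; the vertex names C1/D1 and A3/B3 follow the description, while the function names and frame count are assumptions:

```python
def lerp(p0, p1, t):
    """Linearly interpolate between two points for t in [0, 1]."""
    return tuple(a + (b - a) * t for a, b in zip(p0, p1))

def mask_vertex_frames(c1, d1, a3, b3, n_frames):
    """Diagonal vertices of the mask in every frame of the first transition
    animation, interpolated from the initial frame (C1, D1) to the end
    frame (A3, B3)."""
    return [
        (lerp(c1, a3, i / (n_frames - 1)), lerp(d1, b3, i / (n_frames - 1)))
        for i in range(n_frames)
    ]
```

With three frames between C1=(0, 0), D1=(2, 2) and A3=(4, 4), B3=(10, 10), the middle frame's vertices are (2.0, 2.0) and (6.0, 6.0); the border vertices in each intermediate frame (e.g. A2 and B2 in 620 of Figure 6) could be obtained the same way.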
  • The above animation effect parameters are only examples. Based on different animation effects and different animation generation methods, the animation effect parameters can be adaptively changed, and the embodiments of the present application are not limited thereto.
  • FIG. 7 is another flowchart of the method for generating transition effects provided in an embodiment of the present application. As shown in FIG. 7, compared with the method shown in FIG. 5, steps 506 to 509 are replaced with the following steps 701 to 705.
  • Step 701 The first application displays a third custom transition animation editing interface, and the third custom transition animation editing interface includes: a border style control and an animation effect control.
  • The third custom transition animation editing interface is shown, for example, as interface 370.
  • Step 702 The first application detects a selection operation of a user on a border style control, and determines the border style corresponding to the border style control indicated by the selection operation as a target border style.
  • Step 703 The first application detects a selection operation of the user on the animation effect control, and determines the animation effect corresponding to the animation effect control indicated by the selection operation as the target animation effect.
  • Step 704 The first application detects the user's editing confirmation operation and determines that the editing of the custom transition animation effect is completed.
  • At this time, the screen-off wallpaper, the lock screen wallpaper, and the main interface wallpaper can be set.
  • Step 705 The first application generates a first transition animation from the screen-off interface to the lock screen interface according to the screen-off wallpaper, the lock screen wallpaper, the target border style, and the target animation effect, and generates a second transition animation from the lock screen interface to the main interface according to the lock screen wallpaper, the main interface wallpaper, and the target animation effect.
  • The method for the first application to generate the first transition animation and the second transition animation can be referred to step 507 and step 508, which will not be described in detail here.
  • After being generated, the first transition animation and the second transition animation can be saved.
  • When the second application detects the trigger condition of screen unlocking in the screen-off state and determines to switch from the screen-off interface to the lock screen interface, the first transition animation is played, and the lock screen interface is displayed after the first transition animation finishes; if the second application successfully authenticates the user in the lock screen state and determines to switch from the lock screen interface to the main interface, the second transition animation is played, and the main interface is displayed after the second transition animation finishes.
  • In the methods described above, the first application generates the first transition animation and the second transition animation in advance,
  • and the second application directly plays the first transition animation or the second transition animation generated by the first application during the interface switching process.
  • In other embodiments, the second application generates and plays the first transition animation in real time when determining to switch from the screen-off interface to the lock screen interface, and generates and plays the second transition animation in real time when determining to switch from the lock screen interface to the main interface.
  • FIG8 is another flowchart of the method for generating transition effects according to an embodiment of the present application. As shown in FIG8 , relative to the method shown in FIG5 , step 508 is replaced by the following steps 801 and 802 .
  • Step 801 When the second application detects a trigger condition for unlocking the screen in the screen-off state and determines to switch from the screen-off interface to the lock screen interface, it generates and plays the first transition animation from the screen-off interface to the lock screen interface according to the screen-off wallpaper, the lock screen wallpaper, and the target template, and displays the lock screen interface after the first transition animation finishes.
  • The implementation of this step may refer to step 508, the main difference being that the execution subject changes from the first application to the second application.
  • The embodiment of the present application is also applicable to the scenario where the screen-off wallpaper can change its display position in the screen-off state.
  • In this scenario, the second application in this step can generate and play the first transition animation from the screen-off interface to the lock screen interface according to the display position of the screen-off wallpaper, the screen-off wallpaper, the lock screen wallpaper, and the target template, and display the lock screen interface after the first transition animation finishes.
  • the implementation method of this step can refer to the corresponding description in the aforementioned step 508.
  • The main difference is that the initial frame of the mask layer and the initial frame of the image layer of the first transition animation need to be determined according to the actual display position of the screen-off wallpaper.
  • Step 802 When the second application successfully authenticates the user in the lock screen state and determines to switch from the lock screen interface to the main interface, it generates and plays a second transition animation from the lock screen interface to the main interface according to the lock screen wallpaper, the main interface wallpaper, and the target template, and displays the main interface after the second transition animation finishes.
  • FIG9 is another flowchart of the method for generating transition effects according to an embodiment of the present application. As shown in FIG9 , relative to the method shown in FIG7 , step 705 is replaced by the following steps 901 and 902 .
  • Step 901 When the second application detects a trigger condition for unlocking the screen in the screen-off state and determines to switch from the screen-off interface to the lock screen interface, it generates and plays a first transition animation from the screen-off interface to the lock screen interface according to the screen-off wallpaper, the lock screen wallpaper, the target border style, and the target animation effect, and displays the lock screen interface after the first transition animation finishes.
  • The implementation of this step may refer to step 705, the main difference being that the execution subject changes from the first application to the second application.
  • The embodiment of the present application is also applicable to the scenario where the screen-off wallpaper can change its display position in the screen-off state.
  • In this scenario, the second application in this step can generate and play the first transition animation from the screen-off interface to the lock screen interface according to the display position of the screen-off wallpaper, the screen-off wallpaper, the lock screen wallpaper, the target border style, and the target animation effect, and display the lock screen interface after the first transition animation finishes.
  • the implementation method of this step can refer to the corresponding description in the aforementioned step 705.
  • The main difference is that the initial frame of the mask layer and the initial frame of the image layer of the first transition animation need to be determined according to the actual display position of the screen-off wallpaper.
  • Step 902 When the second application successfully authenticates the user's identity in the lock screen state and determines to switch from the lock screen interface to the main interface, it generates and plays a second transition animation from the lock screen interface to the main interface according to the lock screen wallpaper, the main interface wallpaper, and the target animation effect, and displays the main interface after the second transition animation finishes.
  • An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored.
  • When the computer program is run on a computer, the computer executes the method provided in any embodiment of the present application.
  • An embodiment of the present application also provides a computer program product, which includes a computer program. When the computer program is run on a computer, it enables the computer to execute the method provided by any embodiment of the present application.
  • “at least one” refers to one or more, and “more than one” refers to two or more.
  • “And/or” describes the association relationship of associated objects, indicating that three relationships may exist.
  • For example, "A and/or B" can represent: A exists alone, A and B exist at the same time, or B exists alone, where A and B can be singular or plural.
  • the character “/” generally indicates that the previous and next associated objects are in an "or” relationship.
  • “At least one of the following” and similar expressions refer to any combination of these items, including any combination of single or plural items.
  • At least one of a, b and c can be represented by: a, b, c, a and b, a and c, b and c, or a and b and c, where a, b, c can be single or multiple.
  • If any function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • Based on such understanding, the technical solution of the present application, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product; the software product is stored in a storage medium and includes several instructions that cause a computer device (which can be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method described in each embodiment of the present application.
  • The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

A transition effect generation method and an electronic device are provided. The method comprises: displaying a first interface, the first interface comprising a first control; when a first selection operation on the first control by a user is detected, displaying a second interface, the second interface comprising pictures; when a second selection operation on the pictures by the user is detected, determining a target picture according to the second selection operation; displaying a third interface, the third interface comprising template controls, the template controls corresponding to preset templates; when a third selection operation on the template controls by the user is detected, determining a target template control according to the third selection operation, and determining the template corresponding to the target template control as a target template; and, according to the target picture and the target template, generating a first transition animation from a screen-off state to a lock screen state and/or a second transition animation from the lock screen state to a main interface.
PCT/CN2023/128456 2022-11-30 2023-10-31 Procédé de génération d'effet dynamique de transition et dispositif électronique WO2024114257A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211529518.5A CN118113386A (zh) 2022-11-30 2022-11-30 转场动效生成方法和电子设备
CN202211529518.5 2022-11-30

Publications (1)

Publication Number Publication Date
WO2024114257A1 true WO2024114257A1 (fr) 2024-06-06

Family

ID=91210906

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/128456 WO2024114257A1 (fr) 2022-11-30 2023-10-31 Procédé de génération d'effet dynamique de transition et dispositif électronique

Country Status (2)

Country Link
CN (1) CN118113386A (fr)
WO (1) WO2024114257A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106325882A (zh) * 2016-08-31 2017-01-11 北京云图微动科技有限公司 一种锁屏界面的生成方法及装置
CN108469981A (zh) * 2018-04-12 2018-08-31 何可 一种可自定义屏幕切换规则和屏幕规则的手机桌面系统
CN111176533A (zh) * 2019-12-27 2020-05-19 宇龙计算机通信科技(深圳)有限公司 壁纸切换方法、装置、存储介质以及终端


Also Published As

Publication number Publication date
CN118113386A (zh) 2024-05-31

Similar Documents

Publication Publication Date Title
CN110231905B (zh) 一种截屏方法及电子设备
WO2020259452A1 (fr) Procédé d'affichage plein écran pour terminal mobile et appareil
WO2020077511A1 (fr) Procédé permettant d'afficher une image dans une scène photographique et dispositif électronique
WO2020125410A1 (fr) Procédé de traitement d'image et dispositif électronique
WO2022100610A1 (fr) Procédé et appareil de projection d'écran, ainsi que dispositif électronique et support de stockage lisible par ordinateur
WO2021036785A1 (fr) Procédé de rappel de message et dispositif électronique
WO2022007862A1 (fr) Procédé de traitement d'image, système, dispositif électronique et support de stockage lisible par ordinateur
CN110138999B (zh) 一种用于移动终端的证件扫描方法及装置
WO2022001258A1 (fr) Procédé et appareil d'affichage à écrans multiples, dispositif terminal et support de stockage
CN110248037B (zh) 一种身份证件扫描方法及装置
WO2021057626A1 (fr) Procédé de traitement d'image, appareil, dispositif et support de stockage informatique
WO2022143180A1 (fr) Procédé d'affichage collaboratif, dispositif terminal et support de stockage lisible par ordinateur
WO2023241209A9 (fr) Procédé et appareil de configuration de papier peint de bureau, dispositif électronique et support de stockage lisible
CN112527220B (zh) 一种电子设备显示方法及电子设备
CN110286975B (zh) 一种前景元素的显示方法和电子设备
CN113542574A (zh) 变焦下的拍摄预览方法、终端、存储介质及电子设备
CN114500901A (zh) 双景录像方法、装置和电子设备
CN112449101A (zh) 一种拍摄方法及电子设备
WO2024078275A1 (fr) Appareil et procédé de traitement d'image, dispositif électronique et support de stockage
CN114827098A (zh) 合拍的方法、装置、电子设备和可读存储介质
CN113923372B (zh) 曝光调整方法及相关设备
EP4206865A1 (fr) Procédé de génération d'image à effet de pinceau, procédé et dispositif d'édition d'image et support de stockage
WO2022033344A1 (fr) Procédé de stabilisation vidéo, dispositif de terminal et support de stockage lisible par ordinateur
WO2024114257A1 (fr) Procédé de génération d'effet dynamique de transition et dispositif électronique
CN113495733A (zh) 主题包安装方法、装置、电子设备及计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23896411

Country of ref document: EP

Kind code of ref document: A1