WO2020244497A1 - Procédé d'affichage pour écran flexible et dispositif électronique - Google Patents

Display method for flexible screen and electronic device

Info

Publication number
WO2020244497A1
WO2020244497A1 (PCT/CN2020/093899; CN2020093899W)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
interface
electronic device
controls
application
Prior art date
Application number
PCT/CN2020/093899
Other languages
English (en)
Chinese (zh)
Inventor
刘敏 (Liu Min)
范振华 (Fan Zhenhua)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2020244497A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Definitions

  • This application relates to the field of terminal technology, and in particular to a display method and electronic device of a flexible screen.
  • Flexible screens are also called flexible OLEDs (organic light-emitting diodes). Compared with traditional screens, flexible screens are not only lighter and thinner, but, owing to their bendability, are also far more durable than traditional screens.
  • the present application provides a display method and electronic device for a flexible screen, which can re-lay out various controls in the display interface according to the physical state of the flexible screen, so that the user can get a better experience when watching and using the flexible screen.
  • the present application provides a method for displaying a flexible screen.
  • the flexible screen can be divided into a first screen and a second screen by a folding line when folded.
  • the above method includes: when the detected included angle between the first screen and the second screen is less than a first preset value (for example, 90°) or greater than a second preset value (for example, 180°), the electronic device may display a first interface on the first screen and/or the second screen; when the detected angle falls within the closed interval formed by the first preset value and the second preset value, the electronic device can determine that the flexible screen is in the unfolded state or the stand state.
  • further, the electronic device can display a second interface on the first screen, the second interface including the operation controls; and the electronic device can display a third interface on the second screen, the third interface including the non-operation controls.
  • in this way, the electronic device can distribute the operation controls and non-operation controls of the application interface being displayed across the first screen and the second screen of the flexible screen.
  • after the split-screen display, the operation controls no longer block the non-operation controls, so the user can obtain a better immersive visual and operating experience.
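As a minimal sketch of the angle check described above (the threshold values are only the text's example values, and the function name is hypothetical):

```python
FIRST_PRESET = 90    # degrees; the text's example first preset value
SECOND_PRESET = 180  # degrees; the text's example second preset value

def should_split(angle_deg: float) -> bool:
    """True when the hinge angle lies in the closed interval
    [FIRST_PRESET, SECOND_PRESET], i.e. the unfolded or stand state,
    in which the method splits operation and non-operation controls."""
    return FIRST_PRESET <= angle_deg <= SECOND_PRESET
```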
  • the method further includes: the electronic device queries whether the first interface is a preset application interface; when the first interface is a preset application interface (such as a game interface), it means that split-screen display may be needed while the application runs, and the electronic device may further determine the operation controls and non-operation controls.
  • the method further includes: the electronic device determines whether the first screen and the second screen are in the landscape state; when the first screen and the second screen are in the landscape state, it means that split-screen display may be needed while the application runs, and the electronic device can further determine the operation controls and non-operation controls.
  • the electronic device determining the operation controls and non-operation controls in the first interface includes: the electronic device obtains the position and size of each control in the first interface; further, the electronic device determines a control whose size is larger than a preset size as a non-operation control, and determines the controls located on layers above the layer where the non-operation control sits as operation controls.
  • for a first control whose size is larger than the preset size, the electronic device may further determine whether the layer where the first control is located is the topmost layer; if the layer where the first control is located is not the topmost layer, the electronic device may confirm the first control as a non-operation control.
  • in other words, according to the size and layer relationship of each control in the interface, the electronic device can recognize a control with a larger display size that sits close to the bottom layer as a non-operation control, and recognize the controls above the non-operation control as operation controls.
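The size-and-layer heuristic above might be sketched as follows; the `Control` type, the area threshold, and all names are illustrative assumptions, not the patent's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Control:
    name: str
    width: int
    height: int
    layer: int  # larger value = closer to the topmost layer

# Illustrative threshold: a control covering more area than this is
# treated as a large "background" control (e.g. a game's video layer).
PRESET_AREA = 480_000

def classify(controls):
    """Split controls into (operation, non_operation) lists using the
    size-and-layer heuristic sketched in the text."""
    non_op = [c for c in controls if c.width * c.height > PRESET_AREA]
    if not non_op:
        return list(controls), []
    top_non_op_layer = max(c.layer for c in non_op)
    ops = [c for c in controls
           if c not in non_op and c.layer > top_non_op_layer]
    return ops, non_op
```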
  • alternatively, the electronic device determining the operation controls and non-operation controls in the first interface includes: the electronic device obtains a preset configuration file corresponding to the first interface, in which the display positions of the operation controls in the first interface are recorded; the electronic device can then recognize the operation controls in the first interface according to the configuration file, and determine the controls in the first interface other than the operation controls as non-operation controls.
  • the second screen is a screen facing the user.
  • the second screen facing the user can display the non-operation controls, which require no operation or interaction, while the user operates the above-mentioned operation controls on the other screen (that is, the first screen), so that the user obtains an immersive viewing and operating experience.
  • the electronic device may also call a camera, an infrared sensor, a proximity light sensor, or a touch device to identify the specific orientation of the first screen and the second screen in the flexible screen.
  • the electronic device can use an acceleration sensor or a similar device to detect the angle between each of the first screen and the second screen and the horizontal plane; when the angle between the first screen and the horizontal plane is less than a certain value, the second screen can be determined to be the screen facing the user.
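A hedged sketch of this orientation check; the tilt threshold is an assumed value, since the text only says "less than a certain angle":

```python
TILT_THRESHOLD = 30.0  # degrees; illustrative value, not given in the text

def user_facing_screen(first_to_horizontal: float,
                       second_to_horizontal: float):
    """When one screen lies nearly flat (small angle to the horizontal
    plane), treat the other screen as the one facing the user."""
    if first_to_horizontal < TILT_THRESHOLD:
        return "second"
    if second_to_horizontal < TILT_THRESHOLD:
        return "first"
    return None  # neither screen is close to flat
```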
  • the method further includes: the electronic device acquires first view information used to draw the first interface (for example, the view tree of the first interface), where the first view information includes the layer order of the controls in the first interface; further, the electronic device can generate, according to the first view information, second view information corresponding to the second interface and third view information corresponding to the third interface, where the second view information includes the layer order of the operation controls and the third view information includes the layer order of the non-operation controls.
  • the electronic device displaying the second interface on the first screen and displaying the third interface on the second screen includes: the electronic device draws the second interface on the first screen according to the second view information; and the electronic device draws the third interface on the second screen according to the third view information.
  • the layer order of the operation controls in the second view information may be the same as or different from their layer order in the first view information; that is, the layer order of the operation controls in the interface can be the same before and after the split screen, or it can differ.
  • the layer order of the non-operation controls in the third view information may be the same as or different from their layer order in the first view information; that is, the layer order of the non-operation controls in the interface can be the same before and after the split screen, or it can differ.
  • the configuration file corresponding to the first interface may also record the display positions of the operation controls in the second interface and the display positions of the non-operation controls in the third interface. In this case, the electronic device generating the second view information corresponding to the second interface and the third view information corresponding to the third interface according to the first view information includes: the electronic device splits and reorganizes the operation controls in the first view information according to the display positions of the operation controls recorded in the configuration file for the second interface, to obtain the second view information; similarly, the electronic device splits and reorganizes the non-operation controls in the first view information according to the display positions of the non-operation controls recorded in the configuration file for the third interface, to obtain the third view information.
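The configuration-driven split of the first view information might look roughly like this; the configuration keys, control ids, and positions are all hypothetical:

```python
# Hypothetical configuration file content: which control ids are
# operational, and where they land in the second interface.
config = {
    "operation_ids": {"joystick", "fire_button"},
    "second_interface_pos": {"joystick": (100, 400),
                             "fire_button": (900, 400)},
}

def split_view_info(first_view_info, config):
    """Split the first interface's view info (a list ordered from the
    bottom layer to the top layer) into second and third view info,
    preserving the relative layer order on each side."""
    second, third = [], []
    for view in first_view_info:
        if view["id"] in config["operation_ids"]:
            # reposition the operation control per the config file
            view = dict(view,
                        pos=config["second_interface_pos"][view["id"]])
            second.append(view)
        else:
            third.append(view)
    return second, third
```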
  • the method further includes: the electronic device receives a first touch event input by the user on an operation control in the second interface; the electronic device maps the first touch event to a second touch event in the first interface; and the electronic device executes the operation instruction corresponding to the second touch event. That is, after the operation controls are displayed in the second interface, the user can operate each operation control in the second interface to continue to use the related functions of that control.
  • the electronic device mapping the first touch event to the second touch event in the first interface includes: the electronic device obtains the coordinates of the first touch point in the first touch event, and, according to the relative position of the first touch point on the operation control in the second interface, calculates the coordinates of the second touch point corresponding to the first touch point in the first interface. The electronic device executing the operation instruction corresponding to the second touch event includes: carrying the coordinates of the second touch point in the second touch event and reporting it to the first application (the first interface being an interface of the first application), so that the first application executes the operation instruction corresponding to the second touch event.
  • in other words, after the electronic device detects the first touch event input by the user in the second interface after the split screen, it can map the first touch event to the second touch event in the first interface before the split screen, so that the related application can ultimately respond to the second touch event, that is, respond to the first touch event input by the user on the second interface.
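The coordinate mapping described above can be sketched as a simple relative-position transform, under the assumption that each operation control keeps its proportions up to scaling; the rectangle layout is hypothetical:

```python
def map_touch(point, rect_in_second, rect_in_first):
    """Map a touch point on an operation control in the second interface
    back to the control's original rectangle in the first interface,
    preserving the touch's relative position inside the control.
    Rectangles are (x, y, width, height) tuples."""
    x, y = point
    sx, sy, sw, sh = rect_in_second
    fx, fy, fw, fh = rect_in_first
    rel_x = (x - sx) / sw  # fraction across the control, 0..1
    rel_y = (y - sy) / sh
    return (fx + rel_x * fw, fy + rel_y * fh)
```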
  • the present application provides an electronic device, including: a flexible screen that is divided into a first screen and a second screen by a folding line when folded; one or more processors; one or more memories; and one or more computer programs. The processor is coupled with the flexible screen and the memory, the one or more computer programs are stored in the memory, and when the processor executes the one or more computer programs, the electronic device performs the flexible screen display method described in any one of the above.
  • the present application provides a computer storage medium including computer instructions which, when run on an electronic device, cause the electronic device to execute the flexible screen display method described in any one of the first aspect.
  • this application provides a computer program product which, when run on an electronic device, causes the electronic device to execute the flexible screen display method described in any one of the first aspect.
  • the electronic device described in the second aspect, the computer storage medium described in the third aspect, and the computer program product described in the fourth aspect provided above are all used to execute the corresponding methods provided above; for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods, which are not repeated here.
  • FIG. 2 is a first schematic structural diagram of an electronic device provided by an embodiment of this application;
  • FIG. 3 is a first schematic structural diagram of a flexible screen in an electronic device provided by an embodiment of this application;
  • FIG. 4 is a second schematic structural diagram of a flexible screen in an electronic device provided by an embodiment of this application;
  • FIG. 5 is a first schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 6 is a second schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 7 is a schematic structural diagram of an operating system in an electronic device provided by an embodiment of this application;
  • FIG. 8 is a third schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 9 is a fourth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 10 is a fifth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 11 is a sixth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 12 is a seventh schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 13 is an eighth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 14 is a ninth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 15 is a tenth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 16 is an eleventh schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 17 is a twelfth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 18 is a thirteenth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 19 is a fourteenth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 20 is a fifteenth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 21 is a sixteenth schematic diagram of an application scenario of a flexible screen display method provided by an embodiment of this application;
  • FIG. 22 is a second schematic structural diagram of an electronic device provided by an embodiment of this application.
  • the flexible screen display method provided in the embodiments of this application can be applied to mobile phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, personal digital assistants (PDA), wearable electronic devices, virtual reality devices, and other electronic devices; the embodiments of the present application do not impose any restriction on this.
  • FIG. 2 shows a schematic structural diagram of the electronic device 100.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a camera 193, a display screen 194, and the like.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that the processor 110 has just used or cycled. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • each antenna in the electronic device 100 can be used to cover a single communication frequency band or multiple communication frequency bands, and different antennas can also be multiplexed to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include one or more filters, switches, power amplifiers, low noise amplifiers (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering and amplifying the received electromagnetic waves, and then transmitting them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating one or more communication processing modules.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic wave radiation via the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 may specifically be a flexible screen.
  • the flexible screen may include a display device and a touch sensor.
  • the display device is used to output display content to the user, and the touch sensor is used to receive a touch operation input by the user on the flexible screen.
  • the flexible screen 301 can be displayed as a complete display area in the unfolded state.
  • the user can fold the screen along one or more folding lines in the flexible screen 301.
  • the position of the folding line may be preset, or may be formed by the user after folding any position of the flexible screen 301.
  • the flexible screen 301 can be divided into two display areas along the fold line AB, namely, display area 1 and display area 2.
  • the folded display area 1 and the display area 2 can be displayed as two independent display areas.
  • the display area 1 may be referred to as the main screen of the mobile phone 100
  • the display area 2 may be referred to as the secondary screen of the mobile phone 100.
  • the display area of the main screen and the secondary screen can be the same or different.
  • the size of the flexible screen is 2200*2480 (in pixels).
  • the width of the folding line AB on the flexible screen is 166 pixels. After folding along the folding line AB, the 1144*2480 area on the right side of the flexible screen is divided into the main screen 41, and the 890*2480 area on the left side of the flexible screen is divided into the secondary screen 42.
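The partition arithmetic in this example can be checked directly from the pixel dimensions given in the text:

```python
TOTAL_W, TOTAL_H = 2200, 2480  # full flexible panel, in pixels
FOLD_W = 166                   # width occupied by the folding line AB
MAIN_W = 1144                  # right-hand area, divided into the main screen 41

# The remaining left-hand area becomes the secondary screen 42.
SUB_W = TOTAL_W - FOLD_W - MAIN_W
```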
  • the mobile phone 100 can calculate the angle between the main screen 41 and the sub screen 42 based on the data detected by one or more sensors.
  • a gyroscope and an acceleration sensor may be provided on the main screen 41 and the auxiliary screen 42 of the flexible screen, respectively.
  • the gyroscope on the main screen 41 can detect the rotation angular velocity when the main screen 41 rotates, and the acceleration sensor on the main screen 41 can detect the acceleration generated when the main screen 41 moves.
  • the mobile phone 100 can determine the magnitude and direction of the gravity G according to the data detected by the gyroscope and the acceleration sensor on the main screen 41.
  • the mobile phone 100 can also determine the magnitude and direction of the gravity G according to the data detected by the gyroscope and the acceleration sensor on the secondary screen 42.
  • for example, a gyroscope and an acceleration sensor can be integrated into one sensor (sensor A), with one sensor A provided on the main screen 41 and one on the secondary screen 42.
  • corresponding coordinate systems may be set on the main screen 41 and the sub screen 42 respectively.
  • a Cartesian coordinate system O1 can be set in the secondary screen 42.
  • the x axis is parallel to the shorter side of the secondary screen 42, the y axis is parallel to the longer side of the secondary screen 42, and the z axis is perpendicular to the plane formed by the x axis and the y axis and points out of the secondary screen 42.
  • a Cartesian coordinate system O2 can be set in the main screen 41.
  • the x axis is parallel to the shorter side of the main screen 41, the y axis is parallel to the longer side of the main screen 41, and the z axis is perpendicular to the plane formed by the x axis and the y axis and points into the main screen 41.
  • the gyroscope and acceleration sensor in the secondary screen 42 can detect the magnitude and direction of gravity G in the Cartesian coordinate system O1, and the gyroscope and acceleration sensor in the main screen 41 can detect the magnitude and direction of the gravity G in the Cartesian coordinate system O2.
  • since the direction of the y-axis is the same in the Cartesian coordinate system O1 and the Cartesian coordinate system O2, the component G1 of the gravity G on the x-z plane in the Cartesian coordinate system O1 and the component G2 of the gravity G on the x-z plane in the Cartesian coordinate system O2 are equal in magnitude but different in direction.
  • the angle between the component G1 and the component G2 is α.
  • the electronic device 100 can calculate the angle α between the component G1 of the gravity G in the Cartesian coordinate system O1 and the component G2 of the gravity G in the Cartesian coordinate system O2, thereby obtaining the included angle α between the secondary screen 42 and the main screen 41.
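The angle computation described above can be sketched in a few lines of Python; the function name, the plain (x, z) tuple inputs, and the clamping step are illustrative assumptions, not the patent's actual implementation:

```python
import math

def fold_angle_deg(g1_xz, g2_xz):
    """Angle (in degrees) between G1 and G2, the x-z plane components of
    gravity G measured in the two screens' coordinate systems O1 and O2."""
    (x1, z1), (x2, z2) = g1_xz, g2_xz
    dot = x1 * x2 + z1 * z2
    n1 = math.hypot(x1, z1)
    n2 = math.hypot(x2, z2)
    # Clamp against floating-point drift before taking the arc cosine.
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_a))
```

For example, when the two projections point the same way the angle is 0, and when they point in opposite directions the angle is 180°.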
  • the included angle α between the secondary screen 42 and the main screen 41 may vary within a closed interval formed by 0 to 180°.
  • the included angle α between the secondary screen 42 and the main screen 41 may vary within a closed interval formed by 180° to 360°.
  • the included angle α between the secondary screen 42 and the main screen 41 may also vary within a closed interval formed by 0 to 360°, which is not limited in the embodiment of the present application.
  • for example, when the angle between the main screen 41 and the secondary screen 42 is 180° (or close to 180°), it can be defined that the flexible screen is in the unfolded state at this time.
  • for another example, when the angle between the main screen 41 and the secondary screen 42 is reduced from 180° to 170°, and the main screen 41 and the secondary screen 42 are both in the landscape state, it can be defined that the flexible screen is in the stand state at this time.
  • the embodiments of the application do not impose any restriction on this.
  • the horizontal screen state refers to a state where the height of the main screen 41 and the sub screen 42 is less than the width.
  • the aspect ratio (that is, the ratio of height to width) of the display screen (for example, the main screen 41 or the sub screen 42) is less than 1.
  • the height of the display can also be understood as the vertical length of the display, and the width of the display can also be understood as the horizontal length of the display; accordingly, the aspect ratio of the display can also be understood as the ratio of its vertical length to its horizontal length.
  • the main screen 41 and the sub screen 42 may also be in a vertical screen state.
  • the vertical screen state refers to the state where the height of the main screen 41 and the secondary screen 42 is greater than the width
  • the aspect ratio (that is, the ratio of height to width) of the display screen is greater than 1.
  • the electronic device 100 can determine whether the main screen 41 is currently in the landscape state or the portrait state through the acceleration sensor and the gyroscope on the main screen 41; similarly, the electronic device 100 can determine whether the secondary screen 42 is currently in the landscape state or the portrait state through the acceleration sensor and the gyroscope on the secondary screen 42.
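The landscape/portrait definition above reduces to a single aspect-ratio test. A minimal sketch, assuming pixel dimensions as inputs and treating an exactly square screen as portrait (an assumption not stated in the text):

```python
def screen_orientation(height, width):
    """Classify a display as landscape or portrait from its aspect ratio
    (height / width): less than 1 means landscape, otherwise portrait."""
    if width <= 0 or height <= 0:
        raise ValueError("height and width must be positive")
    return "landscape" if height / width < 1 else "portrait"
```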
  • the electronic device 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back from the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transfers the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the light-sensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats.
  • the electronic device 100 may include 1 or N cameras 193, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store one or more computer programs, and the one or more computer programs include instructions.
  • the processor 110 can execute the above-mentioned instructions stored in the internal memory 121 to enable the electronic device 100 to execute the display methods provided in some embodiments of the present application, as well as various functional applications and data processing.
  • the internal memory 121 may include a storage program area and a storage data area. Among them, the storage program area can store the operating system; the storage program area can also store one or more application programs (such as a gallery, contacts, etc.) and so on.
  • the data storage area can store data (such as photos, contacts, etc.) created during the use of the electronic device 101.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, universal flash storage (UFS), etc.
  • the processor 110 executes the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor to cause the electronic device 100 to execute the display method provided in the embodiments of the present application, as well as various functional applications and data processing.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A, also called a “loudspeaker”, is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also called an “earpiece”, is used to convert audio electrical signals into sound signals.
  • when the electronic device 100 answers a call or a voice message, the voice can be received by bringing the receiver 170B close to the human ear.
  • the microphone 170C, also called a “mic” or “mouthpiece”, is used to convert sound signals into electrical signals.
  • when making a sound, the user can bring the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with one or more microphones 170C.
  • the electronic device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals.
  • the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be a USB interface 130, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
  • the aforementioned electronic device 100 may also include one or more components such as buttons, motors, indicators, and SIM card interfaces, which are not limited in the embodiment of the present application.
  • the software system of the above electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes a layered Android system as an example to illustrate the software structure of the electronic device 100.
  • FIG. 7 is a software structure block diagram of the electronic device 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of applications.
  • the above-mentioned application programs may include APPs (applications) such as call, contact, camera, gallery, calendar, map, navigation, Bluetooth, music, video, short message, etc.
  • the application interface displayed during runtime of some APPs may include not only controls that cannot interact with users, such as pictures, but also controls that can interact with users, such as buttons and progress bars.
  • for example, when running a game application, the application interface may include a game screen and an operating handle used to control the game.
  • similarly, when running a video application, the playback interface may include a video screen being played and control buttons for controlling video playback (such as fast forward, pause, barrage, etc.).
  • the application framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a view system, a notification manager, an activity manager, a window manager, a content provider, a resource manager, an input method manager, and so on.
  • the view system can be used to construct the display interface of the application.
  • Each display interface can consist of one or more controls.
  • controls can include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the view system can obtain view information of the corresponding display interface when drawing the display interface, and the view information records the layer relationship between the various controls in the display interface to be drawn.
  • each control in the display interface is generally organized hierarchically according to a tree structure to form a complete ViewTree (view tree), which may be referred to as the view information of the above display interface.
  • the view system can draw the display interface according to the layer relationship between the controls set in the ViewTree.
  • Each control in the display interface corresponds to a set of drawing instructions, such as DrawLine, DrawPoint, DrawBitmap, etc.
  • the view system can in turn call the drawing instructions of the corresponding control to draw the control according to the layer relationship between the controls in the ViewTree.
  • for example, (a) in Figure 8 shows the chat interface 801 of the WeChat APP.
  • the bottom control of the chat interface 801 is the root node (root), under which is the base map 802.
  • the base map 802 further includes the following controls: title bar 803, chat background 804, and input bar 805.
  • the title bar 803 further includes a return button 806 and a title 807
  • the chat background 804 further includes an avatar 808 and a bubble 809
  • the input bar 805 further includes a voice input button icon 810, an input box 811, and a send button 812.
  • These controls are layered in order to form a view tree A as shown in Figure 8(b).
  • the view tree A records the layer order between the various controls in the chat interface 801.
  • the base map 802 is located under the root node, and the title bar 803, the chat background 804, and the input bar 805 are all child nodes of the base map 802. Both the return button 806 and the title 807 are child nodes of the title bar 803. Both the avatar 808 and the bubble 809 are child nodes of the chat background 804.
  • the voice input button icon 810, the input box 811, and the sending button 812 are all child nodes of the input field 805.
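The layer relationships just described can be sketched as a small tree whose depth-first traversal matches the drawing order the view system follows (lower layers before the controls stacked on them). The `Control` class below is a minimal illustration, not an Android API:

```python
class Control:
    """A node in the view tree: a control and the child controls above it."""
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

def draw_order(control):
    """Depth-first traversal: each control is drawn before its children."""
    order = [control.name]
    for child in control.children:
        order.extend(draw_order(child))
    return order

# View tree A of the chat interface 801 described above.
view_tree_a = Control("root", [
    Control("base map 802", [
        Control("title bar 803",
                [Control("return button 806"), Control("title 807")]),
        Control("chat background 804",
                [Control("avatar 808"), Control("bubble 809")]),
        Control("input bar 805",
                [Control("voice input button 810"),
                 Control("input box 811"),
                 Control("send button 812")]),
    ]),
])
```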
  • a new recognition module may be added to the view system for recognizing operation controls in the application interface.
  • the operation control refers to a control that can interact with a user, that is, the control can present corresponding functions in response to a user's operation.
  • the operation controls in the aforementioned chat interface 801 include a return button 806, an avatar 808, a bubble 809, a voice input button icon 810, an input box 811, and a send button 812.
  • the aforementioned recognition module can also be set at the application framework layer independently of the view system, and the embodiment of the present application does not impose any limitation on this.
  • a status monitoring service can be set in the underlying system below the application framework layer to identify the position and shape changes of the flexible screen.
  • the condition monitoring service can be set in the system library and/or kernel layer.
  • the status monitoring service may call a sensor service to start sensors such as a gyroscope and an acceleration sensor for detection.
  • the condition monitoring service can calculate the angle between the current main screen 41 and the secondary screen 42 based on the detection data reported by each sensor.
  • the status monitoring service can report split-screen events to the view system.
  • the view system may periodically obtain the current physical state of the flexible screen from the state monitoring service.
  • the recognition module in the view system can identify the operation controls in the current application interface, and display the identified one or more operation controls on one screen of the flexible screen (such as the main screen 41), while displaying the controls in the application interface that cannot interact with the user (hereinafter referred to as non-operation controls) on another screen of the flexible screen (for example, the secondary screen 42).
  • the electronic device can display the operating controls and non-operating controls in the application interface being displayed on the first screen and the second screen of the flexible screen.
  • the first screen is the main screen 41
  • the second screen is the secondary screen 42
  • the first screen is the secondary screen 42
  • the second screen is the main screen 41.
  • for example, the electronic device can display the progress bar, fast forward, pause and other operation controls in the video playback interface on the secondary screen, and display the video screen (i.e., non-operation control) played in the video playback interface on the main screen.
  • the electronic device may display operation controls such as an operation handle in the game interface on the secondary screen, and display the game screen (i.e., non-operation control) in the game interface on the main screen.
  • the operation controls after the split-screen display will not block the above-mentioned video images or game images, so that the user can obtain an immersive visual and operating experience.
  • the aforementioned activity manager can be used to manage the life cycle of each application.
  • Applications usually run in the operating system in the form of activity.
  • the activity manager can schedule the activity process of the application to manage the life cycle of each application.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include video, image, audio, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, etc.
  • Android runtime includes a core library and a virtual machine. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the function library that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in the virtual machine.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem, and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, etc., which are not limited in the embodiment of the present application.
  • a mobile phone is used as an example of an electronic device, and a method for displaying a flexible screen provided by an embodiment of the present application will be described in detail with reference to the accompanying drawings.
  • the physical state of the flexible screen in the mobile phone may also include a folded state.
  • taking an outward-folding flexible screen as an example, as shown in (a) of FIG. 9, when the angle between the main screen 41 and the auxiliary screen 42 is 360° (or close to 360°), the flexible screen is in a folded state.
  • taking an inward-folding flexible screen as an example, as shown in (b) of FIG. 9, when the angle between the main screen 41 and the sub screen 42 is 0 (or close to 0), the flexible screen is in a folded state.
  • the mobile phone may display a desktop 1001 on the main screen 41, and at this time, the secondary screen 42 (not shown in FIG. 10) may be in a black screen state.
  • when the condition monitoring service in the mobile phone determines that the angle α between the main screen 41 and the secondary screen 42 is 360° according to the data reported by the sensors, the condition monitoring service can continue to call the camera, infrared sensor, proximity light sensor, touch device, etc. to identify the specific orientation of the phone.
  • for example, cameras can be installed on the main screen 41 and the secondary screen 42 respectively. If the camera of the main screen 41 captures facial information, but the camera of the secondary screen 42 does not, the status monitoring service can determine that the flexible screen in the current folded state is in a state where the main screen 41 faces the user.
  • for another example, infrared sensors can be installed on the main screen 41 and the auxiliary screen 42 respectively. If the infrared sensor of the auxiliary screen 42 captures the infrared signal radiated by the human body, but the infrared sensor of the main screen 41 does not, the status monitoring service may determine that the flexible screen in the current folded state is in a state where the secondary screen 42 faces the user.
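The camera-based and infrared-based checks above share one decision rule: whichever screen's sensor detects the user is the facing screen. A minimal sketch, with made-up function and return names; the real service would consume sensor events rather than booleans:

```python
def facing_screen(main_detects_user, secondary_detects_user):
    """Decide which screen of the folded phone faces the user from per-screen
    detections (a camera capturing a face, or an infrared sensor capturing
    infrared radiated by the body). Returns None when inconclusive."""
    if main_detects_user and not secondary_detects_user:
        return "main screen 41"
    if secondary_detects_user and not main_detects_user:
        return "secondary screen 42"
    return None  # both or neither detected: fall back to other signals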
  • the mobile phone can also report the user's current touch position through the touch device.
  • a preset grasping algorithm can be used, according to the touch position, to determine the grasping posture with which the user currently holds the mobile phone.
  • the state monitoring service can also determine the specific orientation of the mobile phone.
  • the touch control device may report the coordinates of the touch point to the state monitoring service after detecting the touch event.
  • the state monitoring service determines the gripping posture of the mobile phone by counting the position and number of touch points in the touch device.
  • if the number of touch points detected on the main screen 41 is greater than a preset threshold, it indicates that the user's fingers and palm are grasping the main screen 41, and the screen facing the user at this time is the secondary screen 42.
  • correspondingly, if the number of touch points detected on the secondary screen 42 is greater than the threshold, it means that the user's fingers and palm are grasping the secondary screen 42, and the screen facing the user at this time is the main screen 41.
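The touch-point rule can be sketched as a small counting check; the function name and the idea of passing raw counts (rather than touch-event coordinates) are illustrative simplifications, and the threshold would be device-specific:

```python
def facing_screen_from_touch(main_points, secondary_points, threshold):
    """Infer the facing screen from touch-point counts: many touch points on
    one screen suggest the fingers and palm grasp that screen, so the user
    faces the other one."""
    if main_points > threshold >= secondary_points:
        return "secondary screen 42"
    if secondary_points > threshold >= main_points:
        return "main screen 41"
    return None  # inconclusive grip
```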
  • the mobile phone can display the desktop 1001 shown in FIG. 10 on the main screen 41.
  • the mobile phone can display the above-mentioned desktop 1001 on the secondary screen 42.
  • the mobile phone can also display the application interface of other applications on the main screen 41 (or the secondary screen 42), which is not limited in the embodiment of the present application.
  • the desktop 1001 includes an icon 1002 of a game application. If the user's click on the icon 1002 is detected, as shown in FIG. 11, the mobile phone can open the game application and display the game interface 1101 of the game application. Exemplarily, if the game application supports display in the horizontal screen state but not in the vertical screen state, the mobile phone may automatically display the game interface 1101 in the horizontal screen state. Or, if the game application supports both the horizontal screen state display and the vertical screen state display, the mobile phone may display the game interface 1101 in the horizontal screen state after detecting that the user has rotated the mobile phone from the vertical screen state to the horizontal screen state.
  • the status monitoring service in the mobile phone can periodically detect the angle between the main screen 41 and the sub screen 42 in the flexible screen, and whether the main screen 41 and the sub screen 42 are in a landscape state.
  • when the included angle between the main screen 41 and the auxiliary screen 42 is 180°, it indicates that the flexible screen is switched from the above-mentioned folded state to the unfolded state.
  • when the included angle between the main screen 41 and the sub screen 42 is within the range of [90°, 180°), and the main screen 41 and the sub screen 42 are both in the landscape state, it indicates that the flexible screen is switched from the above-mentioned folded state to the stand state.
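The state transitions above can be summarized as one classifier over the included angle. The small tolerance around 0°/360° for the folded state and the exact boundary values are illustrative assumptions; the text only fixes 180° for unfolded and [90°, 180°) plus landscape for the stand state:

```python
def flexible_screen_state(angle_deg, both_landscape):
    """Classify the physical state of the flexible screen from the included
    angle between the main screen and the secondary screen."""
    if angle_deg <= 5 or angle_deg >= 355:   # 0 or 360 (or close to them)
        return "folded"
    if angle_deg == 180:
        return "unfolded"
    if 90 <= angle_deg < 180 and both_landscape:
        return "stand"
    return "other"
```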
  • the mobile phone can separate the operating controls and non-operating controls in the game interface 1101, display the operating controls in the game interface 1101 on one screen of the flexible screen (such as the secondary screen 42), and display the non-operation controls in the game interface 1101 on another screen of the flexible screen (for example, the main screen 41).
  • an application whitelist may be preset in the mobile phone, and the application whitelist includes one or more application identities, such as application package names.
  • the applications in the application whitelist all support the split-screen display of operating controls and non-operating controls in the application interface. Then, when the mobile phone detects that the flexible screen is switched to the unfolded state (or stand state), the mobile phone can obtain the package name of the currently running application (such as the aforementioned game application), and then query the aforementioned application whitelist to determine whether the game application is a preset application.
  • alternatively, an interface whitelist can be preset in the mobile phone, and the interface whitelist includes one or more application interface identifiers, such as the activity name of the application interface.
  • the application interfaces in the interface whitelist all support the function of split-screen display of their operating controls and non-operating controls. Then, when the mobile phone detects that the flexible screen is switched to the unfolded state (or stand state), the mobile phone can obtain the activity name of the currently running application interface, and then query the aforementioned interface whitelist to determine whether the currently displayed application interface is a preset application interface.
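Both whitelist checks amount to set membership lookups. A minimal sketch; the package and activity names below are made up for illustration, and a real implementation would read the whitelists from device configuration:

```python
APP_WHITELIST = {"com.example.game"}                    # package names (illustrative)
INTERFACE_WHITELIST = {"com.example.game/GameActivity"} # activity names (illustrative)

def supports_split_display(package_name=None, activity_name=None):
    """True when the running application or the displayed application
    interface is preset to support split-screen display of its operation
    and non-operation controls."""
    if package_name is not None and package_name in APP_WHITELIST:
        return True
    return activity_name is not None and activity_name in INTERFACE_WHITELIST
```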
  • when the flexible screen is in a folded state, the mobile phone can display the game interface 1101 in the horizontal screen state.
  • when the flexible screen is detected to be switched to the unfolded state, if the game interface 1101 is a preset application interface, the mobile phone can identify the operation controls and non-operation controls in the game interface 1101, and then display the identified non-operating controls on the main screen 41 and the identified operating controls on the secondary screen 42 (that is, the non-operating controls and the operating controls are displayed on separate screens).
  • if the game interface 1101 does not belong to a preset application and is not a preset application interface, the mobile phone can continue to display the game interface 1101 according to the existing display mode, and the embodiment of the present application does not impose any limitation on this.
  • the mobile phone can obtain the view tree 1300 used by the view system when drawing the above-mentioned game interface 1101 in the folded state.
  • the view tree 1300 records the layer order among the various controls in the game interface 1101.
  • the root node of the game interface 1101 includes a control of a game screen 1301, and the game screen 1301 includes the following controls: an operation handle 1302, an attack button 1303, and a setting bar 1304.
  • the setting column 1304 further includes a first setting button 1305 and a second setting button 1306.
  • the game screen 1301 is a child node of the root node
  • the operation handle 1302, the attack button 1303, and the setting bar 1304 are all child nodes of the game screen 1301, located on the layer where the game screen 1301 is located.
  • the first setting button 1305 and the second setting button 1306 are both child nodes of the setting column 1304 and are located on the layer where the setting column 1304 is located.
  • the mobile phone can record the position and size of each control in the view tree 1300, and the corresponding drawing instruction when drawing each control.
  • when drawing the game interface 1101, the phone can first draw the game screen 1301 according to the view tree 1300, then draw the operation handle 1302, the attack button 1303, and the setting bar 1304 on the game screen 1301, and then draw the first setting button 1305 and the second setting button 1306 on the setting bar 1304.
  • when the mobile phone detects that the flexible screen is switched from the folded state shown in (a) of Figure 12 to the unfolded state shown in (b) of Figure 12, the mobile phone can obtain the above-mentioned view tree 1300 and query the position and size of each control in the view tree 1300.
  • operation controls are usually located in a layer above non-operation controls, non-operation controls are usually displayed as a base map in a layer below the operation controls, and the size of a non-operation control is generally relatively large.
  • if the mobile phone finds that the size of the control of the game screen 1301 is the same as or close to the size of the entire main screen 41, the mobile phone can confirm the game screen 1301 as a non-operation control in the game interface 1101. In addition, the mobile phone can recognize each child node of the game screen 1301 in the view tree 1300 as an operation control.
  • alternatively, when the mobile phone finds that the size of the control of the game screen 1301 is the same as or close to the size of the entire main screen 41, it can further determine whether the layer where the game screen 1301 is located is the top layer. If the layer where the game screen 1301 is located is not the top layer, the mobile phone can confirm the game screen 1301 as a non-operation control in the game interface 1101. In addition, the mobile phone may determine each control located above the layer where the game screen 1301 is located as an operation control.
  • in this way, according to the size and position relationship of each control when drawing the current application interface, the mobile phone can recognize a control with a larger display size that is close to the bottom layer of the application interface as a non-operation control, and recognize the controls located above that non-operation control as operation controls.
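The size-and-layer heuristic above can be sketched over a size-annotated view tree. The dict representation, the pixel sizes, and the 0.95 screen-coverage tolerance are all illustrative assumptions; only the rule itself (near-screen-sized, not the top layer, children become operation controls) comes from the text:

```python
def find_base_layer(node, screen_w, screen_h, tol=0.95):
    """Search below a node for the first control whose size is the same as or
    close to the whole screen and that still has child controls drawn above
    it (so it is not the top layer)."""
    for child in node["children"]:
        if (child["w"] * child["h"] >= tol * screen_w * screen_h
                and child["children"]):
            return child
        hit = find_base_layer(child, screen_w, screen_h, tol)
        if hit is not None:
            return hit
    return None

def descendant_names(node):
    """All controls drawn above the given control, i.e. its descendants."""
    names = []
    for child in node["children"]:
        names.append(child["name"])
        names.extend(descendant_names(child))
    return names

# A made-up, size-annotated version of view tree 1300 (w/h in pixels).
view_tree_1300 = {"name": "root", "w": 2200, "h": 1080, "children": [
    {"name": "game screen 1301", "w": 2200, "h": 1080, "children": [
        {"name": "operation handle 1302", "w": 300, "h": 300, "children": []},
        {"name": "attack button 1303", "w": 200, "h": 200, "children": []},
        {"name": "setting bar 1304", "w": 400, "h": 100, "children": [
            {"name": "first setting button 1305", "w": 150, "h": 100, "children": []},
            {"name": "second setting button 1306", "w": 150, "h": 100, "children": []},
        ]},
    ]},
]}
```

Running the search on this tree identifies the game screen 1301 as the non-operation base layer and its five descendants as the operation controls.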
  • the mobile phone can also obtain the specific type of each control in the view tree 1300.
  • the control may include image control (Image View), text control (Text View), button (Button), navigation bar (ActionBar), text box, dialog box, status bar and other types.
  • the view system in the mobile phone may send the identification or location of each control in the view tree 1300 to the game application, requesting the game application to query the specific type of each control in the view tree 1300 according to the identification or location of the control.
  • the mobile phone may record the type of the control in the corresponding field when drawing each control in the view tree 1300. Then, the mobile phone can search for the specific type of each control in the view tree 1300 from the corresponding field.
  • For example, if a certain control is an image control (Image View) type of control, the mobile phone can confirm that the control is a non-operation control.
  • the game screen 1301 in the aforementioned game interface 1101 is an Image View type non-operation control.
  • If a certain control is a button (Button), navigation bar (ActionBar), text box, dialog box, status bar, or another such type of control, the mobile phone can confirm that the control is an operation control.
  • the operation handle 1302, the attack button 1303, and the controls in the setting bar 1304 in the aforementioned game interface 1101 are all operation controls.
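The type-based rule above can be sketched as a simple lookup. The type names below are illustrative Android-style stand-ins for the button, navigation bar, text box, dialog box, and status bar types named in the text; the patent does not give this exact table:

```python
# Hypothetical mapping of control types to operation/non-operation, following
# the rule described above: an ImageView-like base map (such as the game
# screen) is non-operation; interactive widget types are operation controls.
OPERATION_TYPES = {"Button", "ActionBar", "EditText", "Dialog", "StatusBar"}
NON_OPERATION_TYPES = {"ImageView"}

def classify_by_type(control_type):
    if control_type in OPERATION_TYPES:
        return "operation"
    if control_type in NON_OPERATION_TYPES:
        return "non-operation"
    return "unknown"  # fall back to the size/layer heuristic in this case
```

In practice this lookup would be combined with the size and layer heuristic, since not every control type determines the category on its own.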
  • the configuration file corresponding to the aforementioned game interface 1101 can also be saved in the mobile phone in advance.
  • the configuration file records the operation controls in the aforementioned game interface 1101.
  • the configuration file may record the specific display position of the operation control in the game interface 1101 before the split screen (that is, when the flexible screen is in the folded state).
  • the configuration file may also record the specific position of the above-mentioned operation control on the main screen or the secondary screen after the screen is split (that is, when the flexible screen is in the expanded or stent state).
  • the configuration file may also record information such as the translation distance, zoom ratio, and rotation angle of the operation control after split screen.
  • the configuration file 1 corresponding to the aforementioned game interface 1101 may be:
  • each configuration file records the corresponding packagename, activityname, and the position of the operation control before split screen.
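The contents of configuration file 1 are not reproduced here. Purely as an illustration, a configuration recording the fields described above (packagename, activityname, and each operation control's position before and after the split screen, plus zoom ratios) might look like the following hypothetical JSON structure, shown being serialized and parsed in Python; all field names and values except those taken from the text (center A(40, 40), radius 40, center B(80, 80), radius 80, zoom 2) are assumptions:

```python
import json

# Hypothetical sketch of configuration file 1; field names are illustrative,
# not taken from the patent. For each operation control it records the
# position before the split screen and the position/scale afterwards.
profile_1 = {
    "packagename": "com.example.game",   # assumed package name
    "activityname": "GameActivity",      # assumed activity name
    "operation_controls": [
        {
            "id": "operating_handle_1302",
            "before": {"center": [40, 40], "radius": 40},  # in game interface 1101
            "after":  {"center": [80, 80], "radius": 80},  # on secondary screen 42
            "zoom":   {"x": 2, "y": 2},
        }
    ],
}

# Round-trip through JSON, as a stored configuration file would be.
text = json.dumps(profile_1)
restored = json.loads(text)
```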
  • The mobile phone can find the corresponding configuration file (for example, configuration file 1 above) according to the packagename and activityname of the currently running game application.
  • the mobile phone can recognize the operation control 1 in the game interface 1101 according to the configuration file 1.
  • The mobile phone can identify which control in the game interface 1101 is specifically operation control 1 according to the specific position of control 1 before the split screen recorded in configuration file 1. In this way, by traversing the pre-split-screen positions of all operation controls in configuration file 1, the mobile phone can identify each operation control in the game interface 1101.
  • the mobile phone can recognize controls other than the operation controls in the game interface 1101 as non-operation controls.
  • the mobile phone can recognize through the above method that the operation controls in the game interface 1101 include: an operation handle 1302, an attack button 1303, and various controls in the setting bar 1304.
  • the non-operation controls in the game interface 1101 include the game screen 1301.
  • the mobile phone can display non-operation controls on the main screen 41 and operation controls on the secondary screen 42.
  • the mobile phone may also display the non-operation controls on the secondary screen 42 and the operation controls on the main screen 41.
  • the above embodiment is an example of the flexible screen switching from the folded state to the unfolded state.
  • the mobile phone can also call the camera, infrared sensor, proximity light sensor or touch device to identify the specific orientation of the mobile phone.
  • If the main screen 41 faces the user at this time, the mobile phone can display the identified non-operation controls on the main screen 41 and the identified operation controls on the secondary screen 42. If the secondary screen 42 faces the user at this time, the mobile phone can display the identified non-operation controls on the secondary screen 42 and the identified operation controls on the main screen 41. In this way, the screen facing the user can display the controls that the user does not need to operate or interact with (for example, the above-mentioned game screen 1301), and the user can use the operation controls on the other screen to update or edit the content in the game screen 1301. The operation controls will not block the game screen 1301, so that the user can get an immersive viewing experience.
  • For example, the configuration file 1 corresponding to the game interface 1101 not only records the position of operation control 1 (for example, the operating handle 1302) on the main screen 41 before the split screen, but also records the position of the operating handle 1302 on the secondary screen 42 after the split screen. Then, according to the specific positions of the operation controls before and after the split screen in configuration file 1, the mobile phone can determine the view tree corresponding to the main screen 41 after the split screen and the view tree corresponding to the secondary screen 42 after the split screen.
  • FIG. 15 shows a view tree 1300 of the game interface 1101 in a folded state.
  • the operation control and the non-operation control in the view tree 1300 can be split into two view trees.
  • the mobile phone can use the layer relationship of the operation controls in the view tree 1300 to generate the corresponding view tree 1401. At this time, the layer relationship of the operation controls in the view tree 1300 and the view tree 1401 is the same. Similarly, the mobile phone can use the layer relationship of the non-operational controls in the view tree 1300 to generate the corresponding view tree 1402. At this time, the layer relationship of the non-operational controls in the view tree 1300 and the view tree 1402 is also the same.
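A minimal sketch of this split, assuming a simple node structure (class and field names are illustrative, not from the patent): each control keeps its place in the hierarchy, operation controls go into one tree and non-operation controls into the other:

```python
class Node:
    """Illustrative view-tree node; kind is 'op', 'non-op', or 'root'."""
    def __init__(self, name, kind, children=None):
        self.name, self.kind = name, kind
        self.children = children or []

def split(node):
    """Return (op_tree, non_op_tree) copies of the subtree rooted at node,
    preserving each control's layer relationship."""
    op_children, non_op_children = [], []
    for child in node.children:
        op_sub, non_op_sub = split(child)
        if op_sub:
            op_children.append(op_sub)
        if non_op_sub:
            non_op_children.append(non_op_sub)
    op = Node(node.name, node.kind, op_children) if (node.kind == "op" or op_children) else None
    non_op = Node(node.name, node.kind, non_op_children) if (node.kind == "non-op" or non_op_children) else None
    return op, non_op

# Tree mirroring view tree 1300 of the game interface 1101.
root = Node("root", "root", [
    Node("game_screen_1301", "non-op"),
    Node("operating_handle_1302", "op"),
    Node("attack_button_1303", "op"),
    Node("setting_bar_1304", "op", [
        Node("first_setting_button_1305", "op"),
        Node("second_setting_button_1306", "op"),
    ]),
])
view_tree_1401, view_tree_1402 = split(root)  # operation / non-operation trees
```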
  • Alternatively, the mobile phone can also split the operation controls out of the view tree 1300 and, according to the post-split-screen display position of each operation control recorded in configuration file 1, generate the corresponding view tree 1401.
  • the layer relationship of the operation control in the view tree 1300 and the view tree 1401 may be the same or different, and the embodiment of the present application does not impose any limitation on this.
  • Since the operating handle 1302, the attack button 1303, and the controls in the setting bar 1304 in the view tree 1300 are all operation controls, these operation controls are displayed on the secondary screen 42 after the screen is split. The mobile phone can therefore generate a view tree 1401 corresponding to the secondary screen 42.
  • the view tree 1401 includes an operating handle 1302, an attack button 1303, and a setting bar 1304 set under the root node.
  • The setting bar 1304 also includes two child nodes, namely the first setting button 1305 and the second setting button 1306. The position of each control in the view tree 1401 on the secondary screen 42 is recorded in configuration file 1.
  • the non-operation control is displayed in the main screen 41 after split screen.
  • the mobile phone can generate a view tree 1402 corresponding to the main screen 41, and the view tree 1402 includes a game screen 1301 set under the root node.
  • the position of the game screen 1301 in the main screen 41 after the split screen is the same as the position of the game screen 1301 when the view tree 1300 is drawn.
  • the mobile phone can redraw the game screen 1301 on the main screen 41 according to the view tree 1402.
  • Since the mobile phone recorded the drawing instructions corresponding to each control when drawing the above-mentioned game interface 1101, when the mobile phone redraws the game screen 1301 on the main screen 41, it can still redraw the game screen 1301 using the drawing instructions that were used to draw the game screen 1301 in the game interface 1101.
  • the mobile phone can draw various operation controls in the view tree 1403 on the secondary screen 42 according to the view tree 1403.
  • the mobile phone can use the drawing instruction of drawing the operating handle 1302 when drawing the aforementioned game interface 1101, and draw the operating handle 1302 at the position of the operating handle 1302 after the split screen indicated by the configuration file 1.
  • Each level of controls in the view tree 1403 can be drawn in order at the corresponding positions on the secondary screen 42.
  • Since the main screen 41 and the secondary screen 42 are the same size, when the mobile phone draws each operation control of the view tree 1403 on the secondary screen 42, it can draw each operation control on the secondary screen 42 according to that control's coordinates on the main screen 41.
  • the first setting button 1305 in the game interface 1101 is a 40*30 rectangle, and the coordinates of the upper left corner of the rectangle on the main screen 41 are (380, 10), and the coordinates of the lower right corner are (420, 40).
  • On the secondary screen 42, the first setting button 1305 can also be drawn as a 40*30 rectangle, placed in the area whose upper left corner coordinates are (380, 10) and whose lower right corner coordinates are (420, 40).
  • the size or shape of the operation controls can also be changed.
  • If the above configuration file 1 records that the zoom ratio of the operating handle 1302 on the x-axis is 2 and the zoom ratio on the y-axis is also 2, it means that the size of the operating handle on the secondary screen 42 after the split screen is 2 times the size of the operating handle 1302 in the game interface 1101 before the split screen.
  • the mobile phone can magnify the operating handle 1302 in the game interface 1101 by 2 times to obtain a new operating handle 1503, and draw the operating handle 1503 on the secondary screen 42.
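Using the numbers from the text (center A(40, 40), radius 40, zoom ratio 2 on both axes), the enlargement can be sketched as follows, assuming the control is scaled about the origin of the screen coordinate space, which reproduces the A(40, 40) to B(80, 80) values described here:

```python
def scale_circle(center, radius, zx, zy):
    """Scale a circular control per the zoom ratios from the config file.
    Assumes scaling about the coordinate origin (an illustrative choice)."""
    return (center[0] * zx, center[1] * zy), radius * zx

# Operating handle 1302 (center A(40, 40), radius 40) scaled by 2 on both
# axes becomes operating handle 1503 (center B(80, 80), radius 80).
new_center, new_radius = scale_circle((40, 40), 40, 2, 2)
```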
  • In this way, the operation controls can take up more or less display space when drawn after the split screen, so that users can get a better experience when operating these operation controls after the split screen.
  • After the mobile phone detects that the flexible screen is switched from the folded state to the stand (or unfolded) state, it can also determine, according to the above configuration file 1, the operation controls and non-operation controls in the application interface to be drawn before drawing the application interface. Furthermore, as shown in (a)-(b) of FIG. 15, the mobile phone can split the view tree 1300 that originally needed to be drawn into a view tree 1401 and a view tree 1402. The view tree 1401 contains all the operation controls that need to be drawn, and the view tree 1402 contains all the non-operation controls that need to be drawn. In this way, as still shown in FIG. 15, the mobile phone can draw the operation controls on the secondary screen 42 according to the view tree 1401, and draw the non-operation controls on the main screen 41 according to the view tree 1402.
  • That is, the mobile phone can identify the operation controls and non-operation controls in the application interface according to the configuration file before drawing the application interface, and then display the operation controls and non-operation controls separately on the main screen 41 and the secondary screen 42, without first drawing and displaying the above-mentioned game interface 1101 and then drawing the split-screen interface a second time according to the configuration file.
  • The mobile phone draws the operation controls in the game interface 1101 on the secondary screen 42 to form a first interface 1501, and draws the non-operation controls in the game interface 1101 on the main screen 41 to form a second interface 1502.
  • the user can operate each operation control on the secondary screen 42 to continue to complete related functions of the game application.
  • the touch sensor on the secondary screen 42 can detect the user's touch operation on the first interface 1501 in real time.
  • the user can touch the operating handle 1503 in the first interface 1501 to control the game character to move up, down, left, and right.
  • The secondary screen 42 can report the touch information of the detected series of touch events (for example, the coordinate information of the touch points, the touch time, etc.) to the kernel layer, application framework layer, and application layer of the mobile phone in sequence.
  • Take the coordinate system of the secondary screen 42 as the first coordinate system, and the coordinate system of the main screen 41 as the second coordinate system. Suppose the coordinates of the touch point in a touch event detected by the secondary screen 42 are P(x, y). Combined with the architecture diagram of the Android system shown in FIG. 7, and as shown in FIG. 17, the driver can encapsulate the touch event detected by the secondary screen 42 as an original touch event and report it to the kernel layer, and the kernel layer then encapsulates the original touch event as an advanced touch event readable by the upper layers and reports it to the application framework layer.
  • For example, the first coordinate system of the secondary screen 42 may be set with the lower left corner as the origin, while the coordinate system used by the mobile phone's operating system may be set with the upper left corner as the origin. In this case, the touch point coordinates P(x, y) reported by the secondary screen 42 may need to be converted into coordinates P'(x', y') in the operating system's coordinate system.
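A one-line sketch of this origin conversion, assuming only the y-axis direction differs between the two coordinate systems (the screen height parameter is an assumption needed for the flip):

```python
def to_upper_left_origin(x, y, screen_height):
    """Convert a point from a lower-left-origin coordinate system to an
    upper-left-origin one: x is unchanged, y is measured from the other edge."""
    return x, screen_height - y

converted = to_upper_left_origin(10, 30, 100)  # (10, 70)
```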
  • After the application framework layer receives the advanced touch event carrying the coordinates P(x, y) or P'(x', y'), it can determine, according to the post-split-screen positions of the operation controls recorded in configuration file 1, that this touch event is an operation performed on the operating handle 1503. Furthermore, according to the positions of the operating handle 1503 before and after the split screen in configuration file 1, the application framework layer can convert the first touch event, whose touch point is point P, into a second touch event in the second coordinate system, whose touch point is point Q.
  • the operating handle 1302 before split screen is located in a circular area with a center of A (40, 40) and a radius of 40 in the game interface 1101.
  • According to configuration file 1, the mobile phone displays the operating handle 1503 on the secondary screen 42 in a circular area with point B (80, 80) as the center and a radius of 80. That is, the operating handle 1503 is enlarged 2 times along both the x-axis and the y-axis.
  • Accordingly, the application framework layer can determine that the coordinates of the touch point Q corresponding to point P in the second coordinate system are (50, 40). That is, the touch point P(x, y) on the operating handle 1503 in the first coordinate system corresponds to the touch point Q(0.5x, 0.5y) on the operating handle 1302 in the second coordinate system.
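The conversion from the post-split touch point back to the pre-split control can be sketched with the positions and zoom ratios recorded in the configuration file. With the values from the text (pre-split center A(40, 40), post-split center B(80, 80), zoom 2 on both axes), the mapping reduces to Q(0.5x, 0.5y); the function name and signature are illustrative:

```python
def map_touch(p, pre_center, post_center, zx, zy):
    """Map a touch point on the enlarged post-split control back to the
    corresponding point on the pre-split control: offset from the post-split
    center, undo the zoom, then re-apply the offset from the pre-split center."""
    qx = pre_center[0] + (p[0] - post_center[0]) / zx
    qy = pre_center[1] + (p[1] - post_center[1]) / zy
    return qx, qy

# With A(40, 40), B(80, 80) and zoom 2, a touch at (100, 80) maps to (50, 40),
# matching the Q coordinates given in the text.
q = map_touch((100, 80), (40, 40), (80, 80), 2, 2)
```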
  • The application framework layer can report the second touch event carrying the touch point Q(0.5x, 0.5y) to the running game application in the application layer, so that the game application can respond to the user's touch operation this time based on the touch point Q(0.5x, 0.5y).
  • That is, after the mobile phone detects the first touch event input by the user at the touch point P(x, y) on the secondary screen 42, the game application finally responds to the user's touch operation at the point Q(0.5x, 0.5y) on the primary screen 41.
  • the application can also normally respond to the touch operations performed by the user on the operating controls after the split screen.
  • the foregoing game interface 1101 is used as an example to illustrate the display method of the game interface 1101 before and after the split screen. It is understandable that the mobile phone can also display the operating controls and non-operating controls in the application interface in a split screen according to the above method when running other applications or application interfaces.
  • the outer-folding flexible screen is exemplified. It can be understood that for the inner-folding flexible screen, the mobile phone can also display the operating controls and non-operating controls in the application interface according to the above method. The embodiments of the application do not impose any restriction on this.
  • When the inward-folding flexible screen is in the folded state, both the main screen 41 and the secondary screen 42 can be in a black screen state. If it is detected that the included angle between the main screen 41 and the secondary screen 42 is 180°, it means that the flexible screen is switched from the folded state to the unfolded state at this time. If it is detected that the included angle between the main screen 41 and the secondary screen 42 is within the range of [90, 180), and the main screen 41 and the secondary screen 42 are both in landscape mode, it means that the flexible screen is switched from the folded state to the stand state.
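The angle ranges described above can be sketched as a small classifier. The thresholds follow the text (180° means unfolded; [90°, 180°) with both halves in landscape means the stand state; smaller angles mean folded); the landscape check is simplified to a boolean:

```python
def screen_state(angle, landscape=True):
    """Classify the flexible screen's physical state from the included angle
    (in degrees) between the main screen and the secondary screen."""
    if angle == 180:
        return "unfolded"
    if 90 <= angle < 180 and landscape:
        return "stand"
    return "folded"
```

A real device would read this angle from a hinge sensor and likely apply some hysteresis around the thresholds.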
  • the mobile phone can unlock the main screen 41 or the sub-screen 42 of the flexible screen, and can also use the main screen 41 and the sub-screen 42 as a complete display area for display after unlocking.
  • the desktop 1801 can be displayed on the main screen 41.
  • the desktop 1801 includes an icon 1802 of a video application. If the user's operation to open the video application is detected, the mobile phone can open the video application to display various application interfaces of the video application.
  • The mobile phone can automatically identify the operation controls and non-operation controls in the playback interface 1901 according to the method in the foregoing embodiments. If it is recognized that the video screen 1902 in the playback interface 1901 is a non-operation control, and the progress bar 1903, the playback button 1904, the fast forward button 1905, the barrage button 1906, and the return button 1907 in the playback interface 1901 are operation controls, then, as shown in Figure 20, the mobile phone can display the video screen 1902 (i.e., the non-operation control) on the main screen 41 facing the user, and display the progress bar 1903, the play button 1904, the fast forward button 1905, the barrage button 1906, and the return button 1907 (i.e., the operation controls) on the secondary screen 42. Subsequently, the user can operate the operation controls on the secondary screen 42 to control the video application. These operation controls will not block the video screen 1902 on the primary screen 41.
  • In other words, when it is detected that the flexible screen is switched from the folded state to the unfolded state (or stand state) and the screen is unlocked, the mobile phone can automatically display the video screen 1902 (that is, the non-operation control) on the main screen 41, and display the progress bar 1903, the play button 1904, the fast forward button 1905, the barrage button 1906, and the return button 1907 (i.e., the operation controls) on the secondary screen 42. Subsequently, the user can operate the operation controls on the secondary screen 42 to control the video application. These operation controls will not block the video screen 1902 on the primary screen 41, so that the user can obtain an immersive viewing and using experience.
  • The mobile phone can also use the entire flexible screen as a complete display area to display the first interface (for example, the aforementioned playback interface 1901 or game interface 1101).
  • When the flexible screen is switched to the expanded state or the stand state, that is, when the included angle between the main screen 41 and the secondary screen 42 is in the range of [90, 180°], the mobile phone can also identify the operation controls and non-operation controls in the first interface according to the display method provided in the above embodiments, display the identified operation controls on one of the main screen 41 and the secondary screen 42, and display the identified non-operation controls on the other of the two screens.
  • the embodiments of the present application do not impose any restriction on this.
  • the embodiment of the present application discloses an electronic device including a processor, and a memory, an input device, an output device, and a communication module connected to the processor.
  • the input device and the output device may be integrated into one device.
  • the input device and the output device may be a flexible screen, and the flexible screen may include a display device and a touch sensor.
  • the aforementioned electronic device may include: a flexible screen 2201; one or more processors 2202; a memory 2203; one or more application programs (not shown); and one or more computer programs 2204
  • the above-mentioned devices may be connected through one or more communication buses 2205.
  • the one or more computer programs 2204 are stored in the aforementioned memory 2203 and configured to be executed by the one or more processors 2202.
  • The one or more computer programs 2204 include instructions, and the instructions can be used to execute the steps in the foregoing embodiments. All relevant content of the steps involved in the above method embodiments can be cited in the functional description of the corresponding physical device, and will not be repeated here.
  • the foregoing processor 2202 may specifically be the processor 110 shown in FIG. 2
  • the foregoing memory 2203 may specifically be the internal memory 121 and/or the external memory 122 shown in FIG. 2
  • The foregoing flexible screen 2201 may specifically include the display screen 194 shown in FIG. 2 and the touch sensor in the sensor module 180; this is not limited in the embodiment of the present application.
  • The functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • a computer readable storage medium includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: flash memory, mobile hard disk, read-only memory, random access memory, magnetic disk or optical disk and other media that can store program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Embodiments of the present application relate to a display method for a flexible screen and an electronic device, relating to the technical field of terminals, and capable of rearranging the various controls in the display interface based on the physical state of the flexible screen, so that the user has a better experience when viewing or using the flexible screen. The method comprises the following steps: when the included angle between a first screen of the flexible screen and a second screen of the flexible screen is less than a first preset value or greater than a second preset value, the electronic device displays a first interface on the first screen and/or the second screen; when the included angle between the first screen and the second screen is within the closed range formed by the first preset value and the second preset value, the electronic device determines an operation control and a non-operation control in the first interface; the electronic device displays a second interface on the first screen, the second interface comprising the operation control; and the electronic device displays a third interface on the second screen, the third interface comprising the non-operation control.
PCT/CN2020/093899 2019-06-05 2020-06-02 Display method for flexible screen and electronic device WO2020244497A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910488438.1 2019-06-05
CN201910488438.1A CN110389802B (zh) Display method for flexible screen and electronic device

Publications (1)

Publication Number Publication Date
WO2020244497A1 true WO2020244497A1 (fr) 2020-12-10

Family

ID=68285277

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/093899 WO2020244497A1 (fr) 2020-06-02 Display method for flexible screen and electronic device

Country Status (2)

Country Link
CN (3) CN113157130A (fr)
WO (1) WO2020244497A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113129753A (zh) Inward-folding electronic device
CN114428534A (zh) Electronic device and control method therefor
CN116055597A (zh) Folding angle detection method and electronic device
CN116257235A (zh) Drawing method and electronic device
WO2024017090A1 (fr) Information display method and electronic device
CN117519864A (zh) Interface display method, electronic device, and storage medium

Families Citing this family (20)

Publication number Priority date Publication date Assignee Title
CN113157130A (zh) Display method for flexible screen and electronic device
CN113010076A (zh) Display method for display elements and electronic device
CN111124561B (zh) Display method applied to electronic device having folding screen, and electronic device
KR20210068853A (ko) Foldable electronic device including camera
CN111190559A (zh) Screen projection control synchronization method, mobile terminal, and computer-readable storage medium
CN110968252B (zh) Display method of interactive system, interactive system, and electronic device
CN114237530A (zh) Display method for folding screen and related apparatus
CN111405339B (zh) Split-screen display method, electronic device, and storage medium
CN113542453A (zh) Folding-screen terminal device, state detection method therefor, and electronic device
CN111522523A (zh) Display processing method and apparatus, and computer storage medium
CN114356196B (zh) Display method and electronic device
CN113553015B (zh) Display method and electronic device
CN112565490B (zh) Electronic device
CN114125137A (zh) Video display method and apparatus, electronic device, and readable storage medium
CN116107482A (zh) Application control method and apparatus, terminal, storage medium, and computer program product
CN115562535B (zh) Application control method and electronic device
CN115686407B (zh) Display method and electronic device
CN115016709B (zh) Display method for application interface, electronic device, and storage medium
CN115686334B (zh) Operation control method, electronic device, and readable storage medium
CN115904151A (zh) Display method and apparatus, and electronic device

Citations (6)

Publication number Priority date Publication date Assignee Title
CN105549785A (zh) Terminal
US20160246558A1 Foldable display
CN107728901A (zh) Interface display method, apparatus, and terminal
US20180059721A1 Data processing device, display method, input/output method, server system, and computer program
CN109542380A (zh) Display mode control method and apparatus, storage medium, and terminal
CN110389802A (zh) Display method for flexible screen and electronic device

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
KR101301032B1 (ko) Communication device and method for transmitting data files during a video call using the same
CN100478881C (zh) Apparatus and method for preventing occlusion of the user operation area
KR20130080937A (ko) Screen display apparatus and method for a terminal device having a flexible display
KR102561200B1 (ko) User terminal device and display method thereof
US10579238B2 Flexible screen layout across multiple platforms
CN107678724A (zh) Information display method and apparatus, mobile terminal, and storage medium
CN108459815B (zh) Display control method and mobile terminal
CN109710161A (zh) Display control method and apparatus, and electronic device

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
US20160246558A1 Foldable display
CN105549785A (zh) Terminal
US20180059721A1 Data processing device, display method, input/output method, server system, and computer program
CN107728901A (zh) Interface display method, apparatus, and terminal
CN109542380A (zh) Display mode control method and apparatus, storage medium, and terminal
CN110389802A (zh) Display method for flexible screen and electronic device

Cited By (9)

Publication number Priority date Publication date Assignee Title
CN113129753A (zh) Inward-folding electronic device
CN113129753B (zh) Inward-folding electronic device
CN116257235A (zh) Drawing method and electronic device
CN114428534A (zh) Electronic device and control method therefor
CN114428534B (zh) Electronic device and control method therefor
WO2024017090A1 (fr) Information display method and electronic device
CN116055597A (zh) Folding angle detection method and electronic device
CN116055597B (zh) Folding angle detection method and electronic device
CN117519864A (zh) Interface display method, electronic device, and storage medium

Also Published As

Publication number Publication date
CN113157130A (zh) 2021-07-23
CN110389802B (zh) 2021-05-18
CN110389802A (zh) 2019-10-29
CN113268196A (zh) 2021-08-17

Similar Documents

Publication Publication Date Title
WO2020244497A1 Display method for flexible screen and electronic device
WO2020244492A1 Screen projection display method and electronic device
WO2020244495A1 Screen projection display method and electronic device
CN112714901B Display control method for system navigation bar, graphical user interface, and electronic device
CN113645351B Application interface interaction method, electronic device, and computer-readable storage medium
WO2020000448A1 Flexible screen display method and terminal
WO2020253758A1 User interface layout method and electronic device
CN110495819B Robot control method, robot, terminal, server, and control system
CN111669459B Keyboard display method, electronic device, and computer-readable storage medium
WO2021036770A1 Split-screen processing method and terminal device
WO2022068483A1 Application startup method and apparatus, and electronic device
WO2022022575A1 Display control method and apparatus, and storage medium
WO2021078032A1 User interface display method and electronic device
CN112130788A Content sharing method and apparatus
WO2021057699A1 Method for controlling electronic device having flexible screen, and electronic device
WO2021190524A1 Screenshot processing method, graphical user interface, and terminal
CN114579016A Method for sharing input device, electronic device, and system
WO2022143180A1 Collaborative display method, terminal device, and computer-readable storage medium
CN115016697A Screen projection method, computer device, readable storage medium, and program product
US20220317841A1 Screenshot Method and Related Device
WO2022228043A1 Display method, electronic device, storage medium, and program product
WO2022152174A1 Screen projection method and electronic device
WO2022062902A1 File transfer method and electronic device
WO2022002213A1 Translation result display method and apparatus, and electronic device
WO2021196980A1 Multi-screen interaction method, electronic device, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20818985

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20818985

Country of ref document: EP

Kind code of ref document: A1